Jeff Harman is on fire - pun intended - in this talk “𝗔𝗪𝗦 𝗿𝗲:𝗜𝗻𝘃𝗲𝗻𝘁 𝟮𝟬𝟮𝟱 - 𝗔𝗱𝗧𝗲𝗰𝗵 𝗜𝗻𝗻𝗼𝘃𝗮𝘁𝗶𝗼𝗻 𝘄𝗶𝘁𝗵 𝗔𝗜-𝗗𝗿𝗶𝘃𝗲𝗻 𝗗𝗲𝘃𝗲𝗹𝗼𝗽𝗺𝗲𝗻𝘁 𝗳𝗼𝗿 𝗕𝗿𝗮𝗻𝗱 𝗔𝗴𝗲𝗻𝘁𝘀 (๐๐ก๐๐ฏ๐ฏ๐ฏ๐ฐ)” [1], in which he explains how AdTech teams can use AWS's AI-Driven Development Lifecycle (AI-DLC) to build production-grade “brand agents” for advertisers in about five days instead of months.
A super interesting talk for everyone in advertising: you get a better understanding not only of why agents will impact your business, but also of how to build them efficiently.
For everyone else: Jeff introduces an AI-specific software development life cycle. His talk only scratches the surface of it, but it's super interesting! Watch this space, as I'm going to share more while I dive deeper into it.
But now back to Jeff’s talk …
💡 𝗖𝗼𝗿𝗲 𝗶𝗱𝗲𝗮
Jeff introduces AI-DLC, a methodology where AI is treated as a development partner across discovery, requirements, design, coding, testing, and deployment, not just as a coding assistant. By encoding environment details (CI/CD, languages, allowed services) and business vision into structured text “steering” artifacts, teams can keep rich context with the AI without constantly exceeding context limits.
⏩ 𝗙𝗶𝘃𝗲-𝗱𝗮𝘆 𝗔𝗜-𝗗𝗟𝗖 𝗳𝗹𝗼𝘄
The process starts with discovery, then moves to requirements analysis, where AI helps transform the product vision into detailed user stories and domain models small enough to solve as coding tasks. AI then generates code, test harnesses, and CI/CD pipelines, while humans focus on validation, architecture choices, and production constraints, enabling production-ready code in around five days plus a short hardening phase.
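The flow above can be sketched as data. The phase names come from the talk, but the AI/human split below is my paraphrase of its description, not an official AI-DLC artifact:

```python
# Each phase: (name, what AI drives, what humans own). The split is a
# paraphrase of the talk's description, not an official AI-DLC definition.
AI_DLC_PHASES = [
    ("discovery",    "surface domain context and constraints", "set the product vision"),
    ("requirements", "draft user stories and domain models",   "validate scope and priorities"),
    ("design",       "propose architecture options",           "make architecture choices"),
    ("coding",       "generate code and test harnesses",       "review and validate"),
    ("testing",      "generate CI/CD pipelines and tests",     "enforce production constraints"),
    ("deployment",   "automate release steps",                 "run the short hardening phase"),
]

def human_responsibilities() -> list[str]:
    """The parts that stay with people in every phase."""
    return [human for _, _, human in AI_DLC_PHASES]
```

The takeaway is that humans never leave the loop; they shift from writing artifacts to validating them.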
🤖 𝗕𝗿𝗮𝗻𝗱 𝗮𝗴𝗲𝗻𝘁𝘀 𝗳𝗼𝗿 𝗔𝗱𝗧𝗲𝗰𝗵
In the AdTech use case, the goal is to let advertisers quickly spin up autonomous “brand agents” that can chat with customers about specific products, e.g., different tire segments with distinct brand voices. Key challenges are brand voice consistency, safety and guardrails, asset management across copy and media, and strict cost-efficiency for high-volume interactions.
🏗 𝗔𝗿𝗰𝗵𝗶𝘁𝗲𝗰𝘁𝘂𝗿𝗲 𝗮𝗻𝗱 𝗶𝗺𝗽𝗹𝗲𝗺𝗲𝗻𝘁𝗮𝘁𝗶𝗼𝗻
The solution uses AI to analyze existing advertiser assets and derive a structured “brand voice” prompt plus configuration for guardrails, assets, and data access, stored in standard data stores.
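As a rough sketch of what such a stored configuration might contain: the field names and the prompt assembly below are my assumptions for illustration, not the actual schema from the talk:

```python
from dataclasses import dataclass, field

@dataclass
class BrandAgentConfig:
    """Hypothetical per-advertiser record as it might sit in a data store."""
    advertiser_id: str
    brand_voice_prompt: str                                   # derived by AI from existing assets
    blocked_topics: list[str] = field(default_factory=list)   # guardrails
    asset_ids: list[str] = field(default_factory=list)        # approved copy/media
    data_sources: list[str] = field(default_factory=list)     # permitted data access

def build_system_prompt(cfg: BrandAgentConfig) -> str:
    """Assemble the system prompt a brand agent would run with."""
    guardrails = ", ".join(cfg.blocked_topics) or "none"
    assets = ", ".join(cfg.asset_ids) or "none"
    return (
        f"{cfg.brand_voice_prompt}\n"
        f"Never discuss: {guardrails}.\n"
        f"Only reference approved assets: {assets}."
    )

# Example: one tire-segment agent with its own voice and guardrails.
winter_tire_agent = BrandAgentConfig(
    advertiser_id="adv-001",
    brand_voice_prompt="You are a friendly, safety-focused expert on winter tires.",
    blocked_topics=["competitor pricing"],
    asset_ids=["winter-campaign-hero"],
)
```

Keeping voice, guardrails, and asset references as plain configuration is also what makes the cost story work: each new tire segment is just another record, not another bespoke model.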
🎯 𝗢𝘂𝘁𝗰𝗼𝗺𝗲 𝗮𝗻𝗱 𝗿𝗲𝘂𝘀𝗲
For the featured customer, this pattern turned a two-year backlog idea into a working system: advertisers can now configure agents that run inside ad placements and qualify leads via interactive chats on publisher sites. The talk closes by emphasizing that most of the solution is generic and repeatable, and points to an open GitHub repo [2] with the AI-DLC process artifacts so teams can adapt them to their own regulated environments.
Cross-posted to LinkedIn