"Customers Don't Want Another Chatbot" - How A2UI Is Revolutionizing AI Interfaces
“And then we figured out that the customers didn’t want another chatbot. They don’t tend to love reading or writing that much” - paraphrasing Jakob Panten at the AWS Summit in Stockholm back in 2024.
He and his team pivoted to a generative AI-driven product comparison table. This eliminated the need for users to craft long prompts while building concise, purpose-specific answers in the form of product comparison tables.
Written language is universal: we can express almost anything in it, provided we take the time to think through and write out the specifics. And that takes time.
We often help ourselves by introducing well-defined terms; reusing them avoids re-specifying details. Building on this abstraction lets us be more concise and keeps text shorter. That helps, but often isn’t efficient enough. This is where specialized vocabularies and domain-specific languages come in.
The Chatbot Challenge: While language flexibility lets bots cover a wide range of functionality, it often demands extra work from humans writing free-form text, and may take multiple iterations to ensure all data is captured.
How did we do this before chatbots? In Germany, you still fill out paper forms, and these forms capture all the relevant data. In the digital age, we use interfaces that not only capture data but also provide human-friendly controls, like date pickers, that make it easier for people to input information.
🚀 Enter A2UI: A Game-Changer 🚀 That’s where A2UI[1] comes into play: “Protocol for Agent-Driven Interfaces. A2UI enables AI agents to generate interactive user interfaces that render natively across web, mobile, and desktop—without executing arbitrary code.”
A2UI was created by Google and is Apache 2.0 licensed. It lets agents define interfaces on-the-fly and send them to the client. The client, using the A2UI SDK, can natively render the UI and ask users to provide the relevant data.
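To make the flow concrete, here is a minimal sketch of the idea: an agent emits a declarative UI definition as data, and the client maps each component type to a native widget. The payload shape, component names, and the `render` helper below are illustrative assumptions for this post, not the actual A2UI schema or SDK API.

```python
import json

# Hypothetical agent output: a declarative form definition sent as data,
# not as executable code. The component vocabulary here is an assumption.
agent_ui_message = {
    "component": "form",
    "title": "Travel details",
    "children": [
        {"component": "date_picker", "id": "departure", "label": "Departure date"},
        {"component": "text_input", "id": "destination", "label": "Destination"},
        {"component": "button", "id": "submit", "label": "Search"},
    ],
}

# The agent serializes the definition and sends it over its existing channel.
wire = json.dumps(agent_ui_message)


def render(definition: dict) -> list[str]:
    """Pretend-render: list the native widget types a client would create."""
    return [child["component"] for child in definition.get("children", [])]


# The client parses the message and instantiates native controls.
print(render(json.loads(wire)))  # ['date_picker', 'text_input', 'button']
```

The key point is that the interface travels as plain, parseable data: the client decides how to render each component natively, so no arbitrary code from the agent ever executes.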
Why This Matters: This preserves language flexibility while allowing more specificity about the required inputs and giving humans an easy way to enter data.
The protocol uses a data format that an LLM-powered agent can generate efficiently, and that can be parsed just as easily. So it might even be another way to have agents talk to agents?
Neat, right?
If your head is spinning from the protocols you just learned, like A2A and MCP… don’t worry. They’re here to stay; A2UI is complementary.
💬 Have you worked with A2UI yet? Share your experience in the comments!
🤔 What interface challenges are you facing with your AI solutions? Let’s discuss how A2UI might help.
Cross-posted to LinkedIn