UI2 Documentation
Introducing the Unified Intent Interface
A UI2-powered todo app, showing Intent Detection and Instant Preview:
UI2 seeks to convert user intention directly into action. Imagine an assistant like Siri built directly into an application: you can type to it, and thanks to its deep integration it is accurate, powerful, and fast.
Better still, you can see exactly what that assistant would do as you type.
Intent detection also represents a leap forward for interfaces on future mobile devices like AR/VR, where typing is difficult.
The Problem
The idea of the User Interface has stayed the same since its creation:
- You have an intention to do something
- You formulate how to translate that intention into actions
- You take those actions on the UI to achieve your goal
But imagine a world where there is no barrier between intention and action. One where you can directly express your Intent—and have the computer handle the rest for you.
That very idea is UI2.
What is UI2?
UI2, or the Unified Intent Interface, is a framework for building a natural-language interface to any application.
It is powered by the idea of Intent Identification: merging all the intents in your app, from searching to taking action, into a single natural-language interface.
UI2 can be described with four core ideas:
- Unification: Search, taking action, and everything else you can do in an application should not be separate interfaces; they should be unified into one.
- Intent Detection: The key to combining these interfaces is identifying the user's intent, which is now possible through LLMs.
- Context: Intent detection is based not only on the text you are typing; it also draws context from everything in your app, allowing more powerful and more accurate intents.
- Instant Preview: Imagine autocomplete for actions, not just words, where you can preview before committing. Learn more about this key principle of UI2 below; the sketch after this list shows how these ideas fit together.
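To make these four ideas concrete, here is a minimal sketch in TypeScript. The names it uses (`defineIntent`, `detectIntents`, the `"ui2"` module) are hypothetical, chosen for illustration rather than taken from the actual UI2 API.

```ts
// Hypothetical API names for illustration; the real UI2 API may differ.
import { defineIntent, detectIntents } from "ui2";

// Unification: every capability of the app, search and actions alike,
// is declared as an intent behind one natural-language interface.
const addTodo = defineIntent({
  name: "add-todo",
  description: "Create a new todo item",
  parameters: { title: "string", due: "date?" },
});
const searchTodos = defineIntent({
  name: "search-todos",
  description: "Search existing todos",
  parameters: { query: "string" },
});

// Intent Detection + Context: the typed text is interpreted together
// with app state, so a word like "tomorrow" resolves to a real date.
const detections = await detectIntents("buy milk tomorrow", {
  intents: [addTodo, searchTodos],
  context: { currentList: "Groceries", today: "2024-05-01" },
});

// Instant Preview: each detection carries resolved parameters, so the
// UI can show what WOULD happen before the user commits.
console.log(detections[0]);
// -> { intent: "add-todo", params: { title: "buy milk", due: "2024-05-02" } }
```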
The Future of UI2
The idea of translating intent directly into action is especially applicable on phones, and on future mobile devices such as AR glasses, where typing is tedious and voice input is more prevalent.
While UI2 currently works only with text input, it can be extended to voice input while keeping the same powerful ideas of Unified Intents, Context, and Instant Preview.
Instant Preview: UI2 vs Chatbot
The idea of converting natural language into action is something existing chatbots can already do. But UI2 aims to take this a step further by addressing a key issue with existing implementations: uncertainty.
Possibly the most important principle of UI2 is Instant Preview: You can see the intents detected as you type.
Unlike a chatbot, where you can never be quite sure what it will do, UI2 gives you the constant feedback of a standard UI while achieving the fluidity of a natural-language interface.
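As a concrete illustration, Instant Preview can be wired to a plain text box: re-run detection on each pause in typing and render the top result without executing it. This sketch reuses the hypothetical `defineIntent`/`detectIntents` names from above; UI2's Stateful Implementation (described below) handles this debouncing and async plumbing for you.

```ts
// Sketch only: hypothetical API names, manual debounce for clarity.
import { defineIntent, detectIntents } from "ui2";

const intents = [
  defineIntent({ name: "add-todo", description: "Create a new todo item" }),
];

const input = document.querySelector<HTMLInputElement>("#command")!;
const preview = document.querySelector<HTMLElement>("#preview")!;

let timer: ReturnType<typeof setTimeout> | undefined;
input.addEventListener("input", () => {
  clearTimeout(timer); // debounce: wait for a pause in typing
  timer = setTimeout(async () => {
    const detections = await detectIntents(input.value, { intents });
    // Show what the top intent WOULD do; nothing executes until the
    // user explicitly confirms.
    preview.textContent = detections.length
      ? `${detections[0].intent}: ${JSON.stringify(detections[0].params)}`
      : "";
  }, 150);
});
```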
What's Available
UI2 is currently available only as a JavaScript/TypeScript library.
It offers several features to help you identify intents for your app:
- Standalone Library: You can use UI2 to directly identify intents. No UI or framework attached.
- Stateful Implementation: You can use the stateful version of UI2, built for frontends, to link to any input box, with debouncing and async processing built in.
- React Hook: Built on the Stateful Implementation, the React hook lets you easily connect UI2 to your projects, as sketched below.
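For instance, a component using the React hook might look like the sketch below. The hook name `useUI2`, the `"ui2/react"` import path, and the returned `{ detections, loading }` shape are assumptions made for this example, not the confirmed API.

```tsx
// Hypothetical hook usage; names and return shape are assumptions.
import { useState } from "react";
import { useUI2 } from "ui2/react";

function CommandBar({ intents, context }: { intents: object[]; context: object }) {
  const [text, setText] = useState("");
  // Re-runs detection as `text` changes; debouncing and async state
  // come from the underlying Stateful Implementation.
  const { detections, loading } = useUI2(text, { intents, context });

  return (
    <div>
      <input value={text} onChange={(e) => setText(e.target.value)} />
      {loading && <span>detecting…</span>}
      <ul>
        {/* Instant Preview: show each detected intent before it runs */}
        {detections.map((d) => (
          <li key={d.intent}>{d.intent}</li>
        ))}
      </ul>
    </div>
  );
}
```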
Building with UI2
Intent identification is undoubtedly the future of the User Interface.
But the most important part is what you can build with it. So, let's get right into it.