Generative and Malleable User Interfaces with Generative and Evolving Task-Driven Data Model
Problem Statement
Current generative UI approaches rely primarily on code generation, which creates a brittle abstraction layer that is difficult for non-technical end-users to modify iteratively. This limits adaptability to evolving user needs and tasks, and makes incremental customization through natural language or direct manipulation practically infeasible. There is a gap between AI-generated interfaces and true user agency over those interfaces.
Key Novelty
- Introduction of task-driven data models as a semantic intermediate representation between user prompts and UI generation, replacing raw code as the generative substrate (a concrete example follows this list)
- A bidirectional mapping between data models and UI specifications that enables both initial generation and ongoing user-driven modifications via natural language and direct manipulation
- A malleable UI paradigm where end-user interactions (natural language edits and direct manipulation) are translated back into structured model changes, enabling iterative and evolving interface tailoring
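To make the intermediate representation concrete, below is a minimal, hypothetical sketch of what a task-driven data model might look like for an illustrative task (planning a conference trip). The entity names, attribute fields, and overall schema shape here are assumptions for illustration, not the paper's actual schema.

```typescript
// Hypothetical shape of a task-driven data model (illustrative only).
interface Entity {
  name: string;                       // e.g. "Flight", "Hotel"
  attributes: { name: string; type: "text" | "number" | "date" | "boolean" }[];
}

interface Relationship {
  from: string;                       // source entity name
  to: string;                         // target entity name
  kind: "one-to-one" | "one-to-many" | "many-to-many";
}

interface TaskDataModel {
  task: string;                       // the user's stated task
  entities: Entity[];
  relationships: Relationship[];
}

// Example instance for the prompt "help me plan a conference trip".
const tripModel: TaskDataModel = {
  task: "Plan a conference trip",
  entities: [
    { name: "Flight", attributes: [{ name: "airline", type: "text" }, { name: "departure", type: "date" }] },
    { name: "Hotel",  attributes: [{ name: "name", type: "text" }, { name: "pricePerNight", type: "number" }] },
  ],
  relationships: [{ from: "Flight", to: "Hotel", kind: "one-to-many" }],
};
```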
Evaluation Highlights
- Technical evaluation demonstrated the feasibility and accuracy of AI-generated task-driven data models and their mapping to UI specifications
- User evaluation of the deployed system showed effectiveness and usability of the generative and malleable UI approach for supporting diverse user tasks and iterative customization
Breakthrough Assessment
Methodology
- User submits a natural language prompt describing their task; an AI model interprets this prompt and generates a structured task-driven data model capturing entities, relationships, and relevant data
- The task-driven data model is mapped to a UI specification using predefined or learned mappings, producing a rendered generative user interface tailored to the user's described task
- End-users interact with the interface via natural language commands or direct manipulation; these interactions are parsed and translated back into modifications of the underlying data model, which then propagates updates to the UI (the full loop is sketched after this list)
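A minimal sketch of the generate → map → edit loop described above, assuming a generic LLM wrapper (`callLLM`) and simplified types; the function names (`generateDataModel`, `mapToUISpec`, `applyNaturalLanguageEdit`) and the widget-selection heuristic are assumptions for illustration, not the paper's API.

```typescript
// Sketch of the prompt -> data model -> UI pipeline; names and types are illustrative.

interface TaskDataModel { task: string; entities: { name: string; attributes: string[] }[] }
interface UISpec { views: { entity: string; widget: "table" | "form" | "card"; fields: string[] }[] }

// Assumed LLM wrapper: takes a prompt, returns the model's text output.
declare function callLLM(prompt: string): Promise<string>;

// Step 1: interpret the user's prompt into a structured data model.
async function generateDataModel(userPrompt: string): Promise<TaskDataModel> {
  const raw = await callLLM(
    `Extract the task, entities, and attributes from: "${userPrompt}". Respond as JSON.`
  );
  return JSON.parse(raw) as TaskDataModel;     // validation omitted for brevity
}

// Step 2: map the data model to a UI specification via simple rules.
function mapToUISpec(model: TaskDataModel): UISpec {
  return {
    views: model.entities.map((e) => ({
      entity: e.name,
      widget: e.attributes.length > 3 ? "table" : "card",  // illustrative heuristic
      fields: e.attributes,
    })),
  };
}

// Step 3: a natural language edit is translated into model changes, then the UI is re-derived.
async function applyNaturalLanguageEdit(model: TaskDataModel, edit: string): Promise<UISpec> {
  const raw = await callLLM(
    `Given this data model: ${JSON.stringify(model)}, apply the edit: "${edit}". Return the updated model as JSON.`
  );
  const updated = JSON.parse(raw) as TaskDataModel;
  return mapToUISpec(updated);                 // the UI stays consistent with the model
}
```

The key design point the sketch preserves is that the data model, not the rendered interface, is the artifact the AI generates and the user edits; the UI is always re-derived from it.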
System Components
- Task-driven data model: A structured representation of the essential information entities, their relationships, and relevant data for a given user task, serving as the semantic foundation for UI generation
- Prompt interpreter: An AI module that parses user natural language prompts and generates the corresponding task-driven data model
- Data model-to-UI mapper: A component that translates the abstract data model into concrete UI specifications and rendered interface components
- Natural language modification handler: A module that accepts natural language modification requests from end-users and converts them into targeted changes in the underlying data model
- Direct manipulation layer: A UI layer that allows users to directly edit interface elements, with changes propagated back to the data model to maintain consistency (see the sketch after this list)
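One way the last component can feed the data model is by reducing direct-manipulation events on the rendered interface to structured model-change operations, so the model remains the single source of truth. The operation names and model shape below are hypothetical, not taken from the paper.

```typescript
// Hypothetical model-change operations emitted by a direct manipulation layer.
type ModelChange =
  | { kind: "addAttribute"; entity: string; attribute: string }
  | { kind: "removeAttribute"; entity: string; attribute: string }
  | { kind: "renameEntity"; from: string; to: string };

interface TaskDataModel {
  entities: { name: string; attributes: string[] }[];
}

// Apply a structured change to the data model; the UI is then re-derived from it.
function applyChange(model: TaskDataModel, change: ModelChange): TaskDataModel {
  switch (change.kind) {
    case "addAttribute":
      return {
        entities: model.entities.map((e) =>
          e.name === change.entity ? { ...e, attributes: [...e.attributes, change.attribute] } : e
        ),
      };
    case "removeAttribute":
      return {
        entities: model.entities.map((e) =>
          e.name === change.entity
            ? { ...e, attributes: e.attributes.filter((a) => a !== change.attribute) }
            : e
        ),
      };
    case "renameEntity":
      return {
        entities: model.entities.map((e) =>
          e.name === change.from ? { ...e, name: change.to } : e
        ),
      };
  }
}
```

For example, a user deleting a column from a rendered table could emit `{ kind: "removeAttribute", entity: "Hotel", attribute: "pricePerNight" }`, keeping the interface and the underlying model consistent.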
Results
| Metric/Benchmark | Baseline (Code-Gen UI) | This Paper (Data Model UI) | Delta |
|---|---|---|---|
| User ability to iteratively modify generated UI | Low (requires code editing) | High (natural language + direct manipulation) | Qualitative improvement |
| Feasibility of data model generation | N/A | Demonstrated feasible via technical evaluation | New capability |
| User-perceived effectiveness | Not evaluated | Positive user evaluation outcomes | Favorable |
| Support for evolving/changing user needs | Limited | Explicitly supported via model updates | Qualitative improvement |
Key Takeaways
- Using structured intermediate representations (data models) instead of raw code as the generative substrate for UIs significantly improves end-user modifiability and reduces the need for technical expertise
- Bidirectional mappings between semantic data models and UI specifications enable a powerful edit-propagation loop, a design pattern applicable to other AI-generated artifact domains beyond UIs (a generic sketch follows this list)
- For ML practitioners building generative UI or co-creative AI tools, incorporating a human-legible intermediate representation layer can be a key design principle to support iterative human-AI collaboration
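The edit-propagation pattern from the second takeaway can be stated generically: keep an AI-generated, human-legible model as the source of truth, derive the artifact from it, and route every user edit back through the model. A minimal, assumed formulation (the `EditLoop` interface and `step` function are placeholders, not the paper's code):

```typescript
// Generic edit-propagation loop: the model, not the artifact, is the source of truth.
// `Model` and `Artifact` are placeholders; rendering and edit interpretation are domain-specific.
interface EditLoop<Model, Artifact, Edit> {
  render: (model: Model) => Artifact;             // model -> artifact (e.g. UI, document, plan)
  interpret: (model: Model, edit: Edit) => Model; // user edit -> updated model
}

function step<M, A, E>(loop: EditLoop<M, A, E>, model: M, edit: E): { model: M; artifact: A } {
  const next = loop.interpret(model, edit);       // propagate the edit into the model
  return { model: next, artifact: loop.render(next) };  // re-derive the artifact
}
```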
Abstract
Unlike static and rigid user interfaces, generative and malleable user interfaces offer the potential to respond to diverse users’ goals and tasks. However, current approaches primarily rely on generating code, making it difficult for end-users to iteratively tailor the generated interface to their evolving needs. We propose employing task-driven data models—representing the essential information entities, relationships, and data within information tasks—as the foundation for UI generation. We leverage AI to interpret users’ prompts and generate the data models that describe users’ intended tasks, and by mapping the data models with UI specifications, we can create generative user interfaces. End-users can easily modify and extend the interfaces via natural language and direct manipulation, with these interactions translated into changes in the underlying model. The technical evaluation of our approach and user evaluation of the developed system demonstrate the feasibility and effectiveness of the proposed generative and malleable UIs.