
Generative and Malleable User Interfaces with Generative and Evolving Task-Driven Data Model

Yining Cao, Peiling Jiang, Haijun Xia
ACM CHI Conference on Human Factors in Computing Systems | 2025
Task-driven data models—structured representations of information entities, relationships, and data—can serve as an intermediate layer between user intent and UI generation, enabling generative interfaces that are both dynamically created and iteratively malleable by end-users.

Problem Statement

Current generative UI approaches rely primarily on code generation, which creates a brittle abstraction layer that is difficult for non-technical end-users to modify iteratively. This limits adaptability to evolving user needs and tasks, and makes incremental customization through natural language or direct manipulation practically infeasible. There is a gap between AI-generated interfaces and true user agency over those interfaces.

Key Novelty

  • Introduction of task-driven data models as a semantic intermediate representation between user prompts and UI generation, replacing raw code as the generative substrate (a sketch of such a model follows this list)
  • A bidirectional mapping between data models and UI specifications that enables both initial generation and ongoing user-driven modifications via natural language and direct manipulation
  • A malleable UI paradigm where end-user interactions (natural language edits and direct manipulation) are translated back into structured model changes, enabling iterative and evolving interface tailoring
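
To make the intermediate representation concrete, here is a minimal sketch of what such a task-driven data model could look like as a typed object. The type names, attribute vocabulary, and field layout are illustrative assumptions, not the paper's actual schema.

```typescript
// Hypothetical shape of a task-driven data model (illustrative only, not the paper's schema).
type AttributeType = "text" | "number" | "date" | "image" | "url";

interface EntityType {
  name: string;                               // e.g. "Recipe", "Ingredient"
  attributes: Record<string, AttributeType>;  // attribute name -> value type
}

interface Relationship {
  from: string;                               // source entity name
  to: string;                                 // target entity name
  cardinality: "one-to-one" | "one-to-many" | "many-to-many";
}

interface TaskDataModel {
  task: string;                                        // the user's described task
  entities: EntityType[];
  relationships: Relationship[];
  records: Record<string, Record<string, unknown>[]>;  // instance data keyed by entity name
}
```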

Evaluation Highlights

  • Technical evaluation demonstrated the feasibility and accuracy of AI-generated task-driven data models and their mapping to UI specifications
  • User evaluation of the deployed system showed effectiveness and usability of the generative and malleable UI approach for supporting diverse user tasks and iterative customization

Breakthrough Assessment

6/10. The paper introduces a meaningful architectural innovation (task-driven data models as a malleable intermediate layer) that addresses a real limitation in generative UI systems, but it is an applied HCI/systems contribution rather than a fundamental ML advance, and its impact scope is primarily within the CHI/UI community.

Methodology

  1. User submits a natural language prompt describing their task; an AI model interprets this prompt and generates a structured task-driven data model capturing entities, relationships, and relevant data
  2. The task-driven data model is mapped to a UI specification using predefined or learned mappings, producing a rendered generative user interface tailored to the user's described task
  3. End-users interact with the interface via natural language commands or direct manipulation; these interactions are parsed and translated back into modifications of the underlying data model, which then propagates updates to the UI (this loop is sketched below)
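
The three steps form a single loop in which the data model, not generated code, is the source of truth. A condensed orchestration sketch follows; all function and type names (`interpretPrompt`, `mapToUISpec`, `handleNaturalLanguageEdit`, `render`, `UISpec`) are placeholders rather than the paper's API, and `TaskDataModel` is the hypothetical type sketched above.

```typescript
// Hypothetical orchestration of the generate -> map -> modify loop described above.
// Function and type names are placeholders; TaskDataModel is the type sketched earlier.

interface UISpec {
  views: { entity: string; widget: "table" | "card" | "form" }[];
}

declare function interpretPrompt(userPrompt: string): Promise<TaskDataModel>;
declare function mapToUISpec(model: TaskDataModel): UISpec;
declare function handleNaturalLanguageEdit(model: TaskDataModel, request: string): Promise<TaskDataModel>;
declare function render(ui: UISpec): void;

async function runSession(userPrompt: string): Promise<void> {
  // Steps 1-2: prompt -> data model -> UI specification -> rendered interface.
  let model = await interpretPrompt(userPrompt);
  render(mapToUISpec(model));

  // Step 3: a user interaction is translated into a model change, and the UI is
  // re-derived from the updated model rather than patched as generated code.
  model = await handleNaturalLanguageEdit(model, "also track a budget for each item");
  render(mapToUISpec(model));
}
```

The design point the loop illustrates is that every modification, whether spoken or clicked, lands in the model first, so the rendered interface can always be regenerated from a single consistent representation.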

System Components

Task-Driven Data Model

A structured representation of the essential information entities, their relationships, and relevant data for a given user task, serving as the semantic foundation for UI generation
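
As a concrete, made-up example, a prompt such as "help me plan a conference trip" might be interpreted into a model along these lines, reusing the hypothetical `TaskDataModel` shape sketched earlier; the entities and values are illustrative, not taken from the paper.

```typescript
// A made-up instance for a trip-planning prompt (illustrative data, not from the paper).
const tripModel: TaskDataModel = {
  task: "Plan a conference trip",
  entities: [
    { name: "Trip",   attributes: { destination: "text", start: "date", end: "date" } },
    { name: "Flight", attributes: { airline: "text", departure: "date", price: "number" } },
    { name: "Hotel",  attributes: { name: "text", checkIn: "date", nightlyRate: "number" } },
  ],
  relationships: [
    { from: "Trip", to: "Flight", cardinality: "one-to-many" },
    { from: "Trip", to: "Hotel",  cardinality: "one-to-many" },
  ],
  records: {
    Trip:   [{ destination: "Berlin", start: "2025-06-01", end: "2025-06-05" }],
    Flight: [],
    Hotel:  [],
  },
};
```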

AI Prompt Interpreter

An AI module that parses user natural language prompts and generates the corresponding task-driven data model
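
A minimal sketch of how such an interpreter could be built around an LLM call that is asked for schema-conforming JSON. `callLLM` is a placeholder for whatever client the system uses, and the schema hint and validation step are illustrative assumptions; a production system would likely rely on structured-output or function-calling features plus stricter validation.

```typescript
// Hypothetical prompt interpreter: task description -> TaskDataModel.
// callLLM stands in for any chat-completion client; the schema hint is illustrative.

declare function callLLM(prompt: string): Promise<string>;

const SCHEMA_HINT =
  'Respond with JSON only: { "task": string, "entities": [...], "relationships": [...], "records": {...} }';

async function interpretPrompt(userPrompt: string): Promise<TaskDataModel> {
  const raw = await callLLM(
    `You turn task descriptions into a task-driven data model.\n${SCHEMA_HINT}\nTask: ${userPrompt}`
  );
  const model = JSON.parse(raw) as TaskDataModel;
  // Basic validation so downstream mapping does not operate on malformed output.
  if (!Array.isArray(model.entities) || model.entities.length === 0) {
    throw new Error("Interpreter returned no entities; ask the user to rephrase the task.");
  }
  return model;
}
```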

Data Model to UI Mapper

A component that translates the abstract data model into concrete UI specifications and rendered interface components
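
A sketch of one plausible mapping strategy: attribute types choose field widgets and record counts choose layouts. The widget vocabulary and heuristics are assumptions rather than the paper's actual rules, `AttributeType` and `TaskDataModel` are the hypothetical types from earlier, and the view specification here is slightly richer than the minimal `UISpec` in the loop sketch above.

```typescript
// Hypothetical mapping rules: attribute types choose field widgets, record counts choose layouts.
// The widget vocabulary is illustrative, not the paper's actual mapping.

interface FieldSpec { attribute: string; widget: string }
interface ViewSpec  { entity: string; layout: "table" | "card"; fields: FieldSpec[] }

function widgetForAttribute(type: AttributeType): string {
  switch (type) {
    case "date":   return "date-picker";
    case "number": return "number-input";
    case "image":  return "image-thumbnail";
    case "url":    return "link";
    default:       return "text-field";
  }
}

function mapToUISpec(model: TaskDataModel): ViewSpec[] {
  return model.entities.map((entity) => ({
    entity: entity.name,
    // Collections read best as tables; single records as editable cards.
    layout: (model.records[entity.name]?.length ?? 0) > 1 ? "table" : "card",
    fields: Object.entries(entity.attributes).map(([name, type]) => ({
      attribute: name,
      widget: widgetForAttribute(type),
    })),
  }));
}
```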

Natural Language Interaction Handler

A module that accepts natural language modification requests from end-users and converts them into targeted changes in the underlying data model
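
One way to realize this, sketched under the same assumptions as above, is to have the LLM emit a small structured patch against the current model rather than new UI code, and to apply that patch deterministically. The patch vocabulary, `callLLM` placeholder, and function names are illustrative.

```typescript
// Hypothetical natural-language edit handler: the LLM proposes a small structured patch
// against the current model, which is applied deterministically. Names are illustrative.

declare function callLLM(prompt: string): Promise<string>;

type ModelPatch =
  | { op: "addEntity"; name: string; attributes: Record<string, AttributeType> }
  | { op: "addAttribute"; entity: string; name: string; type: AttributeType }
  | { op: "removeAttribute"; entity: string; name: string };

async function handleNaturalLanguageEdit(model: TaskDataModel, request: string): Promise<TaskDataModel> {
  const raw = await callLLM(
    `Current model: ${JSON.stringify(model)}\nUser request: "${request}"\nReturn one JSON patch operation.`
  );
  return applyPatch(model, JSON.parse(raw) as ModelPatch);
}

function applyPatch(model: TaskDataModel, patch: ModelPatch): TaskDataModel {
  const next = structuredClone(model);   // keep edits non-destructive
  const entityOf = (name: string) => next.entities.find((e) => e.name === name);
  switch (patch.op) {
    case "addEntity":
      next.entities.push({ name: patch.name, attributes: patch.attributes });
      next.records[patch.name] = [];
      break;
    case "addAttribute": {
      const entity = entityOf(patch.entity);
      if (entity) entity.attributes[patch.name] = patch.type;
      break;
    }
    case "removeAttribute": {
      const entity = entityOf(patch.entity);
      if (entity) delete entity.attributes[patch.name];
      break;
    }
  }
  return next;
}
```

The key choice this sketch highlights is that the LLM proposes changes to the model, not to rendered code, so each edit stays small, inspectable, and easy to undo.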

Direct Manipulation Interface

A UI layer that allows users to directly edit interface elements, with changes propagated back to the data model to maintain consistency
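
The same write-back idea applies to direct manipulation: a UI gesture (here, renaming a column) is interpreted as a model change, and the interface is re-derived from the updated model so the two stay consistent. The event shape and function names below are assumptions, not the paper's implementation.

```typescript
// Hypothetical direct-manipulation handler: a UI gesture (renaming a column) is written
// back to the data model, and the interface is re-derived so model and UI stay consistent.

interface RenameColumnEvent {
  entity: string;    // which entity's view was edited, e.g. "Expense"
  oldName: string;   // the column label the user grabbed
  newName: string;   // the label they typed
}

declare function mapToUISpec(model: TaskDataModel): unknown;   // see the mapper sketch above
declare function render(ui: unknown): void;

function onRenameColumn(model: TaskDataModel, event: RenameColumnEvent): TaskDataModel {
  const next = structuredClone(model);
  const entity = next.entities.find((e) => e.name === event.entity);
  if (entity && event.oldName in entity.attributes) {
    // Rename the attribute in the model, and carry its values along in the records.
    entity.attributes[event.newName] = entity.attributes[event.oldName];
    delete entity.attributes[event.oldName];
    for (const record of next.records[event.entity] ?? []) {
      record[event.newName] = record[event.oldName];
      delete record[event.oldName];
    }
  }
  render(mapToUISpec(next));   // the UI always reflects the model, not an ad-hoc DOM patch
  return next;
}
```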

Results

| Metric/Benchmark | Baseline (Code-Gen UI) | This Paper (Data Model UI) | Delta |
| --- | --- | --- | --- |
| User ability to iteratively modify generated UI | Low (requires code editing) | High (natural language + direct manipulation) | Qualitative improvement |
| Feasibility of data model generation | N/A | Demonstrated feasible via technical eval | New capability |
| User-perceived effectiveness | Not evaluated | Positive user evaluation outcomes | Favorable |
| Support for evolving/changing user needs | Limited | Explicitly supported via model updates | Qualitative improvement |

Key Takeaways

  • Using structured intermediate representations (data models) instead of raw code as the generative substrate for UIs significantly improves end-user modifiability and reduces the need for technical expertise
  • Bidirectional mappings between semantic data models and UI specifications enable a powerful edit-propagation loop, a design pattern applicable to other AI-generated artifact domains beyond UIs
  • For ML practitioners building generative UI or co-creative AI tools, incorporating a human-legible intermediate representation layer can be a key design principle to support iterative human-AI collaboration

Abstract

Unlike static and rigid user interfaces, generative and malleable user interfaces offer the potential to respond to diverse users’ goals and tasks. However, current approaches primarily rely on generating code, making it difficult for end-users to iteratively tailor the generated interface to their evolving needs. We propose employing task-driven data models—representing the essential information entities, relationships, and data within information tasks—as the foundation for UI generation. We leverage AI to interpret users’ prompts and generate the data models that describe users’ intended tasks, and by mapping the data models with UI specifications, we can create generative user interfaces. End-users can easily modify and extend the interfaces via natural language and direct manipulation, with these interactions translated into changes in the underlying model. The technical evaluation of our approach and user evaluation of the developed system demonstrate the feasibility and effectiveness of the proposed generative and malleable UIs.

Generated on 2026-02-21 using Claude