Microsoft believes it has a fix for a familiar cycle: a prompt is given, the AI's response misses the mark, and the loop repeats.
This inefficiency is a drain on resources. The “trial-and-error loop can feel unpredictable and discouraging,” turning what should be a productivity booster into a time sink. Knowledge workers often spend more time managing the interaction itself than understanding the material they hoped to learn.
Microsoft has released Promptions (prompt + options), a UI framework designed to address this friction by replacing vague natural language requests with precise, dynamic interface controls. The open-source tool offers a method to standardise how workforces interact with large language models (LLMs), moving away from unstructured chat toward guided and reliable workflows.
The comprehension bottleneck
Public attention often centres on AI producing text or images, but a massive component of enterprise usage involves understanding—asking AI to explain, clarify, or teach. This distinction is vital for internal tooling.
Consider a spreadsheet formula: one user may want a simple syntax breakdown, another a debugging guide, and another an explanation suitable for teaching colleagues. The same formula can require entirely different explanations depending on the user’s role, expertise, and goals.
Current chat interfaces rarely capture this intent effectively. Users often find that the way they phrase a question doesn’t match the level of detail the AI needs. “Clarifying what they really want can require long, carefully worded prompts that are tiring to produce,” Microsoft explains.
Promptions operates as a middleware layer that breaks this cycle. Instead of forcing users to type lengthy specifications, the system analyses the intent behind a prompt and the conversation history to generate clickable options in real time – such as explanation length, tone, or specific focus areas.
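Microsoft's announcement does not include code, but the pattern is straightforward to sketch. In the TypeScript below, PromptOption, Turn, generateOptions, and callLLM are hypothetical names chosen for illustration, not Promptions' actual API; the sketch simply shows one way a middleware could ask the model itself to propose refinement controls for a given prompt.

```typescript
// Hypothetical sketch: these types and names are illustrative,
// not the published Promptions API.
interface PromptOption {
  label: string;     // e.g. "Explanation length"
  choices: string[]; // e.g. ["Brief", "Detailed", "Step-by-step"]
}

interface Turn {
  role: "user" | "assistant";
  content: string;
}

// Ask the LLM itself to propose controls relevant to this prompt,
// rather than hard-coding a static set.
async function generateOptions(
  prompt: string,
  history: Turn[],
  callLLM: (input: string) => Promise<string>, // any chat-completion client
): Promise<PromptOption[]> {
  const instruction =
    "Given the user's request and the conversation so far, list 2-4 UI " +
    "controls (a label plus choices) that would let the user refine the " +
    'response. Reply as JSON: [{"label": string, "choices": string[]}].';
  const transcript = history.map((t) => `${t.role}: ${t.content}`).join("\n");
  const raw = await callLLM(`${instruction}\n\n${transcript}\nuser: ${prompt}`);
  return JSON.parse(raw) as PromptOption[]; // assumes the model returns valid JSON
}
```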
Efficiency vs complexity
Microsoft researchers tested this approach by comparing static controls against the new dynamic system. The findings offer a realistic view of how such tools function in a live environment.
Participants consistently reported that dynamic controls made it easier to express the specifics of their tasks without repeatedly rephrasing their prompts. This reduced the effort of prompt engineering and allowed users to focus more on understanding content than managing the mechanics of phrasing. By surfacing options like “Learning Objective” and “Response Format,” the system prompted participants to think more deliberately about their goals.
Yet, adoption brings trade-offs. Participants valued adaptability but also found the system more difficult to interpret. Some struggled to anticipate how a selected option would influence the response, noting that the controls seemed opaque because the effect became evident only after the output appeared.
This highlights a balance to strike. Dynamic interfaces can streamline complex tasks but may introduce a learning curve where the connection between a checkbox and the final output requires user adaptation.
Promptions: A fix for AI prompts?
Promptions is designed to be lightweight, functioning as a middleware layer between the user and the underlying language model.
The architecture consists of two primary components, illustrated in the sketch after this list:
- Option Module: Reviews the user’s prompt and conversation history to generate relevant UI elements.
- Chat Module: Incorporates these selections to produce the AI’s response.
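Again, this is a minimal sketch rather than the framework's real interface. Reusing the hypothetical types from the earlier snippet, the two modules compose into a single round trip, with the host UI supplying the user's clicks in between:

```typescript
// Illustrative composition of the two modules; the function names are
// assumptions, not Promptions' published API.
async function promptions(
  prompt: string,
  history: Turn[],
  pickSelections: (opts: PromptOption[]) => Promise<Record<string, string>>,
  callLLM: (input: string) => Promise<string>,
): Promise<string> {
  // Option Module: derive clickable controls from the prompt and history.
  const options = await generateOptions(prompt, history, callLLM);

  // The host UI renders the controls and returns the user's clicks,
  // e.g. { "Explanation length": "Brief", "Tone": "Teaching" }.
  const selections = await pickSelections(options);

  // Chat Module: fold the selections into the final request.
  const constraints = Object.entries(selections)
    .map(([label, choice]) => `${label}: ${choice}`)
    .join("; ");
  return callLLM(`${prompt}\n\nConstraints - ${constraints}`);
}
```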
Of particular note for security teams, “there’s no need to store data between sessions, which keeps implementation simple.” This stateless design mitigates data governance concerns typically associated with complex AI overlays.
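The stateless claim is easy to see in a sketch: if the client resends the transcript and any selections with every call, the server never needs a session store. The handler shape below is an assumption for illustration, not a documented endpoint.

```typescript
// Hypothetical stateless endpoint: the client resubmits everything on each
// call, so nothing is persisted between requests.
interface PromptionsRequest {
  prompt: string;
  history: Turn[];                     // full conversation, supplied by the client
  selections?: Record<string, string>; // clicks from previously offered options
}

async function handleRequest(
  req: PromptionsRequest,
  callLLM: (input: string) => Promise<string>,
): Promise<{ options?: PromptOption[]; answer?: string }> {
  if (!req.selections) {
    // First pass: return controls for the UI to render; store nothing.
    return { options: await generateOptions(req.prompt, req.history, callLLM) };
  }
  // Second pass: produce the answer from the resubmitted state.
  const constraints = Object.entries(req.selections)
    .map(([label, choice]) => `${label}: ${choice}`)
    .join("; ");
  return { answer: await callLLM(`${req.prompt}\n\nConstraints - ${constraints}`) };
}
```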
Moving from “prompt engineering” to “prompt selection” offers a pathway to more consistent AI outputs across an organisation. By implementing UI frameworks that guide user intent, technology leaders can reduce the variability of AI responses and improve workforce efficiency.
Success depends on calibration. Usability challenges remain around communicating how dynamic options affect AI output and around managing the complexity of multiple controls. Leaders should view Promptions not as a complete fix for the AI prompting problem, but as a design pattern to test within their internal developer platforms and support tools.