
The Next Big AI-UX Trend — It’s not Conversational UI

by Kshitij Agrawal
5 min read

Imagine an operating system where all your apps communicate seamlessly, adapting to your context and needs. The article explores the concept of aiOS, highlighting four key values: dynamic interfaces, interoperable apps, context-aware functionality, and the idea that everything can be an input and output. This vision of AI-powered user experiences could revolutionize how we interact with technology, making it more intuitive and efficient. Is aiOS the future of user interfaces?


AI’s like my 4-year-old nephew. Every week, he wants to be something new when he grows up…

One day it’s a soccer pro. The next day it’s an astronaut. Now, he just wants to be a garbage man.

AI’s similar. It has a ton of different narratives right now.

Human clone. Stalker. World domination. You name it.

Here’s exactly where we are today:

Conversational UX/Chat-styled interactions are what everyone’s making.

Some tasks that are possible through conversational UX are:

  • Fire-and-forget tasks like “play music”.
  • Specific trivia queries like “weather” and adding To-Dos.
  • A conversational partner like an AI girlfriend.

But conversational UX has real problems:

  • People land on an empty screen and have to guess what can be done.
  • People rely on apps that keep track of their state, which a chat thread doesn't do. For example:
      • Editing: whether it's a video, audio, or an article, you need to store the draft version to come back to later.
      • Travel planning: tracking which places you've seen and which bookings you've already made.
      • Researching: opening 50 tabs to keep track of the different directions you're exploring.

So the next question is obvious: What’s after ChatGPT? Are we meant to be prompt designers, prompt engineers, or prompters?

Here’s where we’re headed by 2030:

There are 4 AI innovation trends that are taking place under our noses:

  • Dynamic Interfaces
  • Ephemeral Interfaces
  • aiOS ← today’s post
  • Screen-less UX

The other three trends are for another time 😉

A look into aiOS: What is it?

aiOS whispered into your ears, giving you goosebumps (Image source: author)

There are many definitions of the term ‘aiOS’, but the most basic one is an operating system powered by AI.

Seems obvious, right?

Jordan Singer, who works on AI tooling at Figma, described it as a UX controlled entirely through conversation.

But conversations are just one medium.

There can be other ways of interacting within aiOS.

The pull-to-refresh type of intuitive interaction is still TBD.

Irrespective of the interaction, the underlying values for aiOS are going to remain the same. Let’s dive into the 4 major aiOS values:


1. You don’t go out; it comes to you

Us to the internet before AI “I will find you, and I will…“ (Image source: author)

It’s about bringing everything to you, as a user.

  • At the app level, it can be through chatbots.
  • At the inter-app level, it can be through tools like Adept: you explain in chat what you want done, and the AI does it for you.
  • At the browser level, it can be through Arc Search: you just search, and the browser browses for you.
  • Zooming out further, how would it look at the OS level?
  • And further still, how would it look at the hardware level?

2. Interoperable apps

Apps that can communicate with each other

Let’s say you’re a freelance copywriter starting your ⛅️ Monday morning.

> You start by listening to a podcast that you’d scheduled last night.

> You take notes on the side.

> You open your emails, ready to send out an important email to your client.

> You leave the email mid-way to get a coffee to freshen up.

> You open your calendar to put in some time with another client.

> You pause the podcast.

> You open your email app to continue writing the mail.

> You get a notification on Teams. You respond with a file.

> You respond again with a link to the file.

It’s lunchtime.

Phew, a lot of switching between apps. Now, what if you could browse ALL your things in ONE fluid interface?

The answer? Itemized workspaces.

All apps are items or features.

You can drag and drop your podcast episode into your Notes app. Not as a reference, but the episode itself. You can drag and drop your half-written email into your notes to come back to again. You can drag and drop the flight you want into your calendar, and it’s booked.

Any app or item can be pulled into any other app or item.

It’s all intuitive, much faster, and clearer.
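One way to picture this (a hypothetical sketch, not any shipping API; all names here are my own) is a universal item type that every "app" produces and consumes, so a podcast episode or a half-written email can be embedded directly into a note rather than linked:

```typescript
// Hypothetical sketch of an "itemized workspace": every app object is an Item,
// and any Item can embed any other Item directly (not just a reference to it).
type ItemKind = "podcast-episode" | "email-draft" | "note" | "flight";

interface Item {
  id: string;
  kind: ItemKind;
  title: string;
  children: Item[]; // embedded items, e.g. an episode dropped into a note
}

// Dropping one item into another embeds the item itself, without mutating
// the original target.
function dropInto(target: Item, dropped: Item): Item {
  return { ...target, children: [...target.children, dropped] };
}

const episode: Item = { id: "ep-42", kind: "podcast-episode", title: "Monday briefing", children: [] };
const note: Item = { id: "n-1", kind: "note", title: "Client notes", children: [] };

const noteWithEpisode = dropInto(note, episode);
console.log(noteWithEpisode.children.map((c) => c.title)); // ["Monday briefing"]
```

The point of the sketch: once everything shares one shape, "drag podcast into notes" and "drag flight into calendar" are the same operation, which is what makes a single fluid interface plausible.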


2.1 Built-in OS-level solutions

Pulling app functionality into the OS has been happening for as long as app stores have existed. When the App Store launched, many features that are now basic to the OS shipped as stand-alone apps.

For example, the flashlight apps.

Flashlight apps in the App Store vs. now built into the OS (Left image source: App Store; right image source: screenshot from the author’s smartphone)

Similarly, tools like Grammarly or ChatGPT that help us write better (with auto-correct or text prediction) don’t need to live at the app level, right? They could easily sit at the OS level, built into the keyboard.


3. Context is foundational

The problem with current AI applications (read: conversational AI-UX) is that they aren’t in the same context as the user.

In MS Excel, the chatbot doesn’t have the complete context of what you’re working on, what your working style is, or even your deadlines.

A screenshot of MS Excel’s AI chatbot (Image source: aiverse.design)

A simple application of this in the traditional setting (apps and websites) would be: What if websites had the context of how many times you’ve visited?

You can adapt the UI based on the visits.
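In a plain web setting this is already cheap to prototype. Here is a minimal sketch (function names and thresholds are my own, assuming visit counts live in something like localStorage; storage is injected so the logic is testable):

```typescript
// Sketch: adapt the UI based on how many times someone has visited.
// Storage is injected so this works with localStorage in a browser or a
// plain Map in tests; the variant thresholds are purely illustrative.
type UIVariant = "guided-tour" | "standard" | "power-user";

interface VisitStore {
  get(key: string): string | null;
  set(key: string, value: string): void;
}

function recordVisit(store: VisitStore, site: string): number {
  const visits = Number(store.get(site) ?? "0") + 1;
  store.set(site, String(visits));
  return visits;
}

function uiVariantFor(visits: number): UIVariant {
  if (visits <= 1) return "guided-tour"; // first visit: explain everything
  if (visits <= 10) return "standard";   // familiar: normal layout
  return "power-user";                   // frequent: dense, shortcut-heavy UI
}

// In-memory store standing in for localStorage:
const mem = new Map<string, string>();
const visitStore: VisitStore = {
  get: (k) => mem.get(k) ?? null,
  set: (k, v) => void mem.set(k, v),
};

const visitCount = recordVisit(visitStore, "example.com");
console.log(uiVariantFor(visitCount)); // "guided-tour" on the first visit
```
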

Now imagine scaling this at the OS level.

A good example: what if your input method were determined by how you’re positioned relative to the device?

  • If you’re looking at your laptop, the input is via keyboard.
  • If you’re looking away or standing away from your laptop, the input is via audio.

A lovely demo by the cofounders of New Computer shows just this!
Two states of a website, an original concept by New Computer’s founder (Image source: AIE summit 2023)

Adding context for the user from outside the bounds of the app or website makes the user experience much more intuitive and faster.
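The decision rule behind that kind of demo fits in a few lines. This is a toy policy of my own; the actual sensing (gaze detection, distance estimation) is assumed to exist elsewhere:

```typescript
// Toy policy: pick the input mode from how the user is positioned.
// Gaze and distance sensing are assumed inputs; only the decision rule
// is sketched here, and the 1-meter threshold is illustrative.
type InputMode = "keyboard" | "voice";

interface Posture {
  lookingAtScreen: boolean;
  distanceMeters: number; // estimated distance from the device
}

function chooseInputMode(p: Posture, nearThresholdMeters = 1.0): InputMode {
  // At the device and looking at it: type. Otherwise: talk.
  return p.lookingAtScreen && p.distanceMeters <= nearThresholdMeters
    ? "keyboard"
    : "voice";
}

console.log(chooseInputMode({ lookingAtScreen: true, distanceMeters: 0.5 }));  // "keyboard"
console.log(chooseInputMode({ lookingAtScreen: false, distanceMeters: 0.5 })); // "voice"
console.log(chooseInputMode({ lookingAtScreen: true, distanceMeters: 2.0 }));  // "voice"
```
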


4. Everything is an input

You might have guessed this one. It overlaps with the above value.

AI has hacked the operating system of a human being—language.
~ Yuval Noah Harari

And because AI can understand language, it also understands all the mediums of communication: voice, visual, and text.

So now everything is an input, and everything is an output.

You can input text and get an output as a visual—without having to choose if that’s the best medium, AI does that for you.

Have you checked out ChatGPT’s voice-read-aloud feature? It. is. so. freaking. real 🤯 It pauses, breathes, and speaks just like a human. You gotta try!
(Image source: author)

And that’s it; those are the 4 values shaping aiOS. So what do you think…

…is AI-powered OS the next big thing?

On a completely random note, if aiOS was a movie:

If aiOS was a movie, cover image (Image source: author)

You can find more of the author’s articles on Voyager, a blog about AI and design.

The article originally appeared on Medium.

Featured image courtesy: Mika Baumeister.


Kshitij Agrawal
A designer exploring the AI-UX universe. I created aiverse.design for designers and innovators, featuring a collection of 100+ AI-UX interactions from companies designing for AI. I'm currently exploring the bridge between AI x Design and publicly sharing my learnings.


