
Apple’s Proposed Multi-Touch UI System

by Luke Wroblewski
4 min read

Looking behind the patents to see a coordinated proposal for a multi-touch UI.

A few years ago, I took stock of several Apple patents that opened up new interaction possibilities by rethinking the ways people could provide input through multi-touch, virtual interface controls, new physical controls, sensors, and more. Several of these, including the “multi-touch mouse” (now released as the Magic Mouse), have made their way into shipping Apple products.

Yesterday, in anticipation of Apple’s “latest creation,” Patently Apple compiled a similar list of Apple patents that may see the light of day soon. Looking through their article and at several additional patents from Apple, I compiled a list of the new interaction design capabilities these patents cover. In aggregate, these interactions began to look like an integrated system for managing applications and content.

The overarching UI model is a set of contextual virtual interface elements with audible and haptic (perhaps) feedback that are accessed and manipulated through multiple input formats. That’s a mouthful—let’s break it down.

Virtual interface elements

Virtual scroll wheels, slider bars, keyboards, dials, menus, and more are used to edit, manage, and input information on the screen. These controls are mostly shown overlaid or “floating” on top of content and applications. Some controls require specific touch gestures to be used and/or provide audible or tactile feedback when a user interacts with them. For example, a rotation gesture for virtual dials can be used to set volume and may include feedback when the limits of the dial are reached.
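As a rough illustration of the dial behavior described above, here is a minimal sketch; the class name, sensitivity value, and feedback strings are assumptions for illustration, not details from the patents:

```python
class VirtualDial:
    """Hypothetical bounded dial: a rotation gesture adjusts a value,
    with feedback when the dial's limits are reached."""

    def __init__(self, minimum=0, maximum=100, value=50):
        self.minimum, self.maximum, self.value = minimum, maximum, value

    def rotate(self, degrees):
        """Map a rotation gesture (in degrees) to a value change.
        Returns a feedback signal only when a limit is hit."""
        self.value += degrees * 0.5  # assumed sensitivity: 0.5 units per degree
        if self.value >= self.maximum:
            self.value = self.maximum
            return "haptic: upper limit"
        if self.value <= self.minimum:
            self.value = self.minimum
            return "haptic: lower limit"
        return None
```

A rotation within range adjusts the value silently; overshooting clamps the value and emits the limit feedback, mirroring the volume-dial example.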

Included as part of the virtual controls are several forms of virtual keyboards and specifically a two-handed virtual keyboard that uses multipoint touch for input (deliberately called out as different from the iPod/iPhone texting keyboard).

Contextual interface elements

These virtual interface controls can be associated with specific user modes like navigation, scrolling, data entry, display, etc. So a virtual scroll wheel or slider bar may be associated with a scroll mode. A keyboard or keypad may be associated with data entry mode, and so on.

Controls can also be specific to the application a user currently has running. So a floating virtual panel for iTunes could include the controls you’ll use most often in the application like volume, playlist access, next song, etc.
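The mode- and application-specific association described above amounts to a lookup from context to control set. A minimal sketch, with all mode names and control labels being illustrative assumptions:

```python
# Hypothetical registry: each interaction mode or running application
# maps to the set of floating controls the system would overlay.
CONTEXTUAL_CONTROLS = {
    "scroll":     ["virtual scroll wheel", "slider bar"],
    "data_entry": ["virtual keyboard", "keypad"],
    "iTunes":     ["volume dial", "playlist access", "next song"],
}

def controls_for(context):
    """Return the floating controls for the current mode or application."""
    return CONTEXTUAL_CONTROLS.get(context, [])
```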

Virtual controls can also be position sensitive. For example, selecting a song in iTunes could bring up specific controls for audio files with data associated with that file (e.g., title, artist, genre, etc.), or a page-turning gesture that allows you to move between pages of content could only be available at the bottom of the screen.
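Position sensitivity, as in the page-turning example, can be sketched as a simple gate on where a gesture is recognized; the 10% bottom zone is an assumed threshold, not a figure from the patents:

```python
def gesture_available(gesture, y, screen_height):
    """Hypothetical position gate: the page-turning gesture is only
    recognized near the bottom edge of the screen."""
    if gesture == "page_turn":
        return y >= 0.9 * screen_height  # assumed: bottom 10% of the screen
    return True  # other gestures are available anywhere
```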

Accessed through multiple input formats

These virtual interface controls can be accessed through specific touch gestures or multi-touch inputs. For example, a virtual scroll wheel in iTunes could appear only when two fingers are placed on the touch screen, as opposed to one. Additional fingers could be placed on the screen to modify or enhance the visible controls, bringing up new interactions or information.
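The finger-count gating in the scroll wheel example reduces to a dispatch on how many fingers are down. A minimal sketch under that assumption, with the return labels invented for illustration:

```python
def visible_control(finger_count):
    """Hypothetical gating: the scroll wheel appears only with two
    fingers down; extra fingers enhance rather than replace it."""
    if finger_count < 2:
        return None                       # one finger: ordinary pointing, no overlay
    if finger_count == 2:
        return "virtual scroll wheel"
    return "virtual scroll wheel + info"  # additional fingers surface extra information
```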

In fact, Apple has outlined a complete hand-based input system with “unprecedented integration of typing, resting, pointing, scrolling, 3D manipulation, and handwriting.” The system can individually detect all ten fingers and separate palms on a person’s hands, which allows it to detect resting hands, measure when a hand or finger touches and leaves the surface, interpret a tap from one finger as a mouse click while disregarding a tap from two fingers, and more.
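The contact-interpretation rules just described (ignore resting palms, treat a single-finger tap as a click, disregard a two-finger tap) can be sketched as a classifier over simultaneous surface contacts; the contact representation here is an assumption for illustration:

```python
def classify_contacts(contacts):
    """Hypothetical classifier over simultaneous surface contacts.

    Each contact is a dict like {"kind": "finger" or "palm", "tap": bool}.
    Resting palms are ignored; a tap from exactly one finger reads as a
    mouse click; a simultaneous multi-finger tap is deliberately disregarded.
    """
    fingers = [c for c in contacts if c["kind"] == "finger"]
    taps = [c for c in fingers if c.get("tap")]
    if len(taps) == 1:
        return "click"
    return "ignore"  # resting palms, no tap, or a multi-finger tap
```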

The touch-sensitive areas that can accept this kind of input are not confined to the front or screen of a device. The back of a hardware device can also contain touch-sensitive areas that may be tapped, pressed, or slid to generate inputs.

Different hardware inputs can also bring up specific controls. Technologies that can recognize your thumb or fingerprints can be used as inputs for accessing virtual controls. Specifically, fingerprint patterns can be used to actually identify distinct fingers. This could then be used to display different functions depending on which finger is being used. Similarly, proximity sensors can detect when a hand is near a device and display the appropriate controls.
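Per-finger function dispatch, as described above, is essentially a mapping from the identified finger to an action. A minimal sketch; every finger name and function label here is a hypothetical placeholder:

```python
# Hypothetical mapping from an identified finger to the function it triggers.
FINGER_FUNCTIONS = {
    "index":  "select",
    "middle": "secondary menu",
    "thumb":  "unlock",
}

def function_for_finger(finger_id):
    """Dispatch on which finger the fingerprint sensor identified."""
    return FINGER_FUNCTIONS.get(finger_id, "default action")
```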

Apple has also proposed using object recognition, facial detection, and voice modulation as input.

Haptic/Tactile Feedback (perhaps)

Finally, haptic responses can be used to provide feedback to users when they interact with a series of virtual controls. Haptic display technologies allow a user to “feel” different surfaces as their finger moves across a touchscreen. For example, a display could include a virtual click wheel which vibrates at a different frequency at the center. Users could easily sense the difference and use the click wheel without having to look at it.
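The click wheel example above suggests a haptic profile that varies with position on the control. A minimal sketch, with all frequencies and zone sizes being assumed values for illustration:

```python
def haptic_frequency(distance_from_center, wheel_radius=1.0,
                     center_hz=250.0, edge_hz=100.0, center_zone=0.2):
    """Hypothetical haptic profile: the virtual click wheel vibrates at a
    distinct, higher frequency near its center so a user can feel the hub
    without looking at the screen."""
    if distance_from_center <= center_zone * wheel_radius:
        return center_hz
    return edge_hz
```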

In Summary…

Together, these proposals outline an integrated interaction model of virtual “floating” controls that are specific to the mode or application the system is in. The controls are accessed and manipulated through touch-based gestures, combinations of multi-touch inputs, and/or inputs detected through sensors. Users get haptic, audible, and visual feedback when using these input methods to interact with the system’s set of virtual controls.

Needless to say, it will be interesting to see which of these proposals (if any) make their way into Apple’s “latest creation” (tablet?) this month!

This article was syndicated from Luke Wroblewski’s writings archive, Functioning Form.


Luke Wroblewski
LukeW is an internationally recognized product design leader who has designed or contributed to software used by more than 700 million people worldwide. He is currently Chief Design Architect at Yahoo! Inc., where he works on forward-looking integrated customer experiences on the Web, mobile, TV, and beyond.

Luke is the author of two popular Web design books: Web Form Design (2008) and Site-Seeing: A Visual Approach to Web Usability (2002). He also publishes Functioning Form, a leading online publication for interaction designers. Luke is consistently a top-rated speaker at conferences and companies around the world, and is a co-founder and former Board member of the Interaction Design Association (IxDA).

Previously, Luke was the Lead User Interface Designer of eBay Inc.’s platform team, where he led the strategic design of new consumer products (such as eBay Express and Kijiji) and internal tools and processes. He also founded LukeW Interface Designs, a product strategy and design consultancy, taught interface design courses at the University of Illinois, and worked as a Senior Interface Designer at the National Center for Supercomputing Applications (NCSA), the birthplace of the first popular graphical Web browser, NCSA Mosaic.

Visit Luke’s website: https://lukew.com/
Or his writings at Functioning Form: https://lukew.com/ff/
And follow Luke on Twitter: https://twitter.com/lukewdesign
