Apple’s Proposed Multi-Touch UI System

by Luke Wroblewski

Looking behind the patents to see a coordinated proposal for a multi-touch UI.

A few years ago, I took stock of several Apple patents that opened up new interaction possibilities by rethinking the ways people could provide input through multi-touch, virtual interface controls, new physical controls, sensors, and more. Several of these, including the “multi-touch mouse” (now released as the Magic Mouse), have made their way into shipping Apple products.

Yesterday, in anticipation of Apple’s “latest creation,” Patently Apple compiled a similar list of Apple patents that may see the light of day soon. Looking through their article and at several additional patents from Apple, I compiled a list of the new interaction design capabilities these patents cover. In aggregate, these interactions began to look like an integrated system for managing applications and content.

The overarching UI model is a set of contextual virtual interface elements, with audible and (perhaps) haptic feedback, that are accessed and manipulated through multiple input formats. That’s a mouthful, so let’s break it down.

Virtual interface elements

Virtual scroll wheels, slider bars, keyboards, dials, menus, and more are used to edit, manage, and input information on the screen. These controls are mostly shown overlaid or “floating” on top of content and applications. Some controls require specific touch gestures to be used and/or provide audible or tactile feedback when a user interacts with them. For example, a rotation gesture for virtual dials can be used to set volume and may include feedback when the limits of the dial are reached.
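
To make that dial example concrete, here’s a minimal sketch (in Swift) of how a rotation gesture might drive a virtual volume dial and signal its limits. The types, value ranges, and feedback hook are my own assumptions for illustration; nothing in the patents specifies this structure.

```swift
// Hypothetical sketch of a virtual volume dial: a rotation gesture adjusts
// the volume, and the caller is notified when the dial reaches its limits.
// The types, ranges, and feedback hook are invented for illustration.

struct VolumeDial {
    private(set) var volume: Double = 0.5     // normalized 0.0 ... 1.0
    var onLimitReached: (() -> Void)? = nil   // e.g. play a click or a buzz

    // Map a rotation gesture (in degrees) onto the volume range and fire
    // the feedback hook when the user rotates past either end.
    mutating func rotate(byDegrees degrees: Double) {
        let proposed = volume + degrees / 360.0
        if proposed < 0.0 || proposed > 1.0 {
            onLimitReached?()                 // audible or tactile cue
        }
        volume = min(max(proposed, 0.0), 1.0)
    }
}

var dial = VolumeDial()
dial.onLimitReached = { print("dial limit reached") }
dial.rotate(byDegrees: 90)    // volume rises from 0.5 to 0.75
dial.rotate(byDegrees: 180)   // clamps at 1.0 and fires the limit cue
```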

Included among the virtual controls are several forms of virtual keyboards, notably a two-handed virtual keyboard that uses multipoint touch for input (deliberately called out as different from the iPod/iPhone texting keyboard).

Contextual interface elements

These virtual interface controls can be associated with specific user modes like navigation, scrolling, data entry, display, etc. So a virtual scroll wheel or slider bar may be associated with a scroll mode. A keyboard or keypad may be associated with data entry mode, and so on.

Controls can also be specific to the application a user currently has running. So a floating virtual panel for iTunes could include the controls you’ll use most often in that application, like volume, playlist access, next song, etc.

Virtual controls can also be position-sensitive. For example, selecting a song in iTunes could bring up controls specific to audio files, along with data associated with that file (e.g., title, artist, genre, etc.), or a page-turning gesture that lets you move between pages of content might be available only at the bottom of the screen.
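
A minimal sketch of the contextual idea, again with assumed types: the current interaction mode (or application) simply looks up which floating controls should be shown.

```swift
// Minimal sketch of mode-specific floating controls. The modes, control
// names, and the lookup table are assumptions made for this example; the
// patents describe the idea, not this exact structure.

enum InteractionMode {
    case navigation, scrolling, dataEntry, mediaPlayback
}

enum VirtualControl {
    case scrollWheel, sliderBar, keyboard, keypad, volumeDial, playlistPanel
}

let controlsForMode: [InteractionMode: [VirtualControl]] = [
    .scrolling:     [.scrollWheel, .sliderBar],
    .dataEntry:     [.keyboard, .keypad],
    .mediaPlayback: [.volumeDial, .playlistPanel],
    .navigation:    []
]

// Switching to data-entry mode surfaces the virtual keyboards; switching to
// an app like iTunes could swap in its own panel the same way.
let visibleControls = controlsForMode[.dataEntry] ?? []
```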

Accessed through multiple input formats

These virtual interface controls can be accessed through specific touch gestures or multi-touch inputs. For example, a virtual scroll wheel in iTunes could appear only when two fingers are placed on the touch screen, as opposed to one. Additional fingers could be placed on the screen to modify or enhance the visible controls, bringing up new interactions or information.

In fact, Apple has outlined a complete hand-based input system with “unprecedented integration of typing, resting, pointing, scrolling, 3D manipulation, and handwriting.” The system can individually detect all ten fingers as well as the palms of a person’s hands, which allows it to recognize resting hands, measure when a hand or finger touches and leaves the surface, interpret a tap from one finger as a mouse-button click while disregarding a tap from two fingers, and more.
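
Here is a rough sketch of those tap-interpretation rules, with invented types and thresholds; the patents describe the behavior, not this code.

```swift
// Rough sketch: palm contact means a resting hand, a short single-finger
// contact counts as a click, and a two-finger tap is deliberately ignored.
// All types and thresholds are assumed for illustration.

enum ContactKind { case finger, palm }

struct Contact {
    let kind: ContactKind
    let durationSeconds: Double   // how long the contact stayed on the surface
}

enum InputEvent { case click, rest, none }

func interpret(_ contacts: [Contact]) -> InputEvent {
    // Any palm contact is treated as a hand resting on the surface.
    if contacts.contains(where: { $0.kind == .palm }) { return .rest }

    // Short finger contacts count as taps.
    let taps = contacts.filter { $0.kind == .finger && $0.durationSeconds < 0.2 }
    switch taps.count {
    case 1:  return .click   // one-finger tap acts as a mouse-button click
    case 2:  return .none    // two-finger tap is disregarded
    default: return .none
    }
}
```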

The touch-sensitive areas that can accept this kind of input are not confined to the front or screen of a device. The back of a hardware device can also contain touch-sensitive areas that may be tapped, pressed, or slid to generate inputs.

Different hardware inputs can also bring up specific controls. Technologies that recognize your thumb or fingerprints can be used as inputs for accessing virtual controls. Specifically, fingerprint patterns can identify distinct fingers, which could then be used to display different functions depending on which finger is being used. Similarly, proximity sensors can detect when a hand is near a device and display the appropriate controls.
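
A speculative sketch of the per-finger idea: if the sensor can tell fingers apart, each finger can map to its own function. The finger-to-action assignments below are invented purely for illustration.

```swift
// Speculative sketch: fingerprint recognition identifies which finger is
// touching, and each finger maps to a different function. The assignments
// are invented, not taken from the patents.

enum Finger { case thumb, index, middle, ring, little }

let actionForFinger: [Finger: String] = [
    .thumb:  "show home controls",
    .index:  "select",
    .middle: "open context menu",
    .ring:   "scroll",
    .little: "dismiss"
]

func perform(for finger: Finger) {
    print(actionForFinger[finger] ?? "no action assigned")
}

perform(for: .index)   // prints "select"
```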

Apple has also proposed using object recognition, facial detection, and voice modulation as input.

Haptic feedback (perhaps)

Finally, haptic responses can be used to provide feedback to users when they interact with a series of virtual controls. Haptic display technologies allow a user to “feel” different surfaces as their finger moves across a touchscreen. For example, a display could include a virtual click wheel which vibrates at a different frequency at the center. Users could easily sense the difference and use the click wheel without having to look at it.
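
As a sketch of how that might be modeled, the wheel could report a different vibration frequency near its hub than at its rim; the radius and frequencies below are invented values, not from the patent.

```swift
// Illustrative sketch of the haptic click-wheel idea: the vibration frequency
// differs over the wheel's center so the control can be felt without looking.

struct HapticClickWheel {
    let centerRadius = 10.0     // points: the "hub" of the wheel
    let hubFrequency = 250.0    // Hz felt over the center
    let rimFrequency = 120.0    // Hz felt over the outer ring

    // Haptic frequency for a finger at a given distance from the center.
    func frequency(atDistanceFromCenter distance: Double) -> Double {
        distance <= centerRadius ? hubFrequency : rimFrequency
    }
}

let wheel = HapticClickWheel()
let overHub = wheel.frequency(atDistanceFromCenter: 4.0)    // 250 Hz: on the hub
let overRim = wheel.frequency(atDistanceFromCenter: 40.0)   // 120 Hz: on the rim
```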

In Summary…

Together, these proposals outline an integrated interaction model of virtual “floating” controls that are specific to the mode or application the system is in. The controls are accessed and manipulated through touch-based gestures, combinations of multi-touch inputs, and/or inputs detected through sensors. Users get haptic, audible, and visual feedback when using these input methods to interact with the system’s set of virtual controls.

Needless to say, it will be interesting to see which of these proposals (if any) make their way into Apple’s “latest creation” (tablet?) this month!

This article was syndicated from Luke Wroblewski’s writings archive, Functioning Form.

Luke Wroblewski
LukeW is an internationally recognized product design leader who has designed or contributed to software used by more than 700 million people worldwide. He is currently Chief Design Architect at Yahoo! Inc., where he works on forward-looking integrated customer experiences on the Web, mobile, TV, and beyond.

Luke is the author of two popular Web design books: Web Form Design (2008) and Site-Seeing: A Visual Approach to Web Usability (2002). He also publishes Functioning Form, a leading online publication for interaction designers. Luke is consistently a top-rated speaker at conferences and companies around the world, and is a co-founder and former Board member of the Interaction Design Association (IxDA).

Previously, Luke was the Lead User Interface Designer of eBay Inc.'s platform team, where he led the strategic design of new consumer products (such as eBay Express and Kijiji) and internal tools and processes. He also founded LukeW Interface Designs, a product strategy and design consultancy, taught interface design courses at the University of Illinois, and worked as a Senior Interface Designer at the National Center for Supercomputing Applications (NCSA), the birthplace of the first popular graphical Web browser, NCSA Mosaic.

Visit Luke's website: https://www.lukew.com/
Or his writings at Functioning Form: https://www.lukew.com/ff/
And follow Luke on Twitter: https://twitter.com/lukewdesign
