
Apple’s Proposed Multi-Touch UI System

by Luke Wroblewski
4 min read

Looking behind the patents to see a coordinated proposal for a multi-touch UI.

A few years ago, I took stock of several Apple patents that opened up new interaction possibilities by rethinking the ways people could provide input through multi-touch, virtual interface controls, new physical controls, sensors, and more. Several of these, including the “multi-touch mouse” (now released as the Magic Mouse), have made their way into shipping Apple products.

Yesterday, in anticipation of Apple’s “latest creation,” Patently Apple compiled a similar list of Apple patents that may see the light of day soon. Looking through their article and at several additional patents from Apple, I compiled a list of the new interaction design capabilities these patents cover. In aggregate, these interactions began to look like an integrated system for managing applications and content.

The overarching UI model is a set of contextual virtual interface elements with audible and haptic (perhaps) feedback that are accessed and manipulated through multiple input formats. That’s a mouthful—let’s break it down.

Virtual interface elements

Virtual scroll wheels, slider bars, keyboards, dials, menus, and more are used to edit, manage, and input information on the screen. These controls are mostly shown overlaid or “floating” on top of content and applications. Some controls require specific touch gestures to be used and/or provide audible or tactile feedback when a user interacts with them. For example, a rotation gesture for virtual dials can be used to set volume and may include feedback when the limits of the dial are reached.

Included as part of the virtual controls are several forms of virtual keyboards and specifically a two-handed virtual keyboard that uses multipoint touch for input (deliberately called out as different from the iPod/iPhone texting keyboard).

Contextual interface elements

These virtual interface controls can be associated with specific user modes like navigation, scrolling, data entry, display, etc. So a virtual scroll wheel or slider bar may be associated with a scroll mode. A keyboard or keypad may be associated with data entry mode, and so on.

Controls can also be specific to the application a user currently has running. So a floating virtual panel for iTunes could include the controls you’ll use most often in the application like volume, playlist access, next song, etc.

Virtual controls can also be position sensitive. For example, selecting a song in iTunes could bring up specific controls for audio files with data associated with that file (e.g., title, artist, genre, etc.), or a page-turning gesture that allows you to move between pages of content could only be available at the bottom of the screen.
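The mode-, application-, and position-specific behavior described above is essentially a lookup from the current context to a virtual control. A minimal sketch of that idea, assuming a hypothetical dispatcher (the `REGISTRY` table and control names are illustrative, not from Apple’s patents):

```python
# Hypothetical mapping from (mode, touch context) to a virtual control.
# All names here are invented for illustration.
REGISTRY = {
    ("scroll", None): "virtual_scroll_wheel",
    ("data_entry", None): "virtual_keyboard",
    ("itunes", "song_selected"): "audio_file_panel",   # title, artist, genre...
    ("reading", "bottom_edge"): "page_turn_gesture",   # only at the bottom of the screen
}

def controls_for(mode, context=None):
    """Return the virtual control for the current mode and touch context.

    Position-specific entries take priority; otherwise fall back to the
    mode's default control, if any.
    """
    return REGISTRY.get((mode, context)) or REGISTRY.get((mode, None))
```

So `controls_for("itunes", "song_selected")` would surface the audio-file panel, while the same touch elsewhere falls back to whatever default control the mode defines.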

Accessed through multiple input formats

These virtual interface controls can be accessed through specific touch gestures or multi-touch inputs. For example, a virtual scroll wheel in iTunes could appear only when two fingers, rather than one, are placed on the touch screen. Additional fingers could be placed on the screen to modify or enhance the visible controls, bringing up new interactions or information.

In fact, Apple has outlined a complete hand-based input system with “unprecedented integration of typing, resting, pointing, scrolling, 3D manipulation, and handwriting.” The system can individually detect all ten fingers and the palms of a person’s hands, which allows it to recognize resting hands, measure when a hand or finger touches and leaves the surface, interpret a tap from one finger as a mouse button click while disregarding a tap from two fingers, and more.
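The tap-interpretation rule in that description can be sketched as a small classifier, assuming invented event shapes (the `contacts` structure and return labels are my own, not the patent’s):

```python
def interpret_touch(contacts):
    """Classify a set of surface contacts per the rule described above.

    contacts: list of dicts like {"kind": "finger"|"palm", "tapped": bool}.
    A single-finger tap counts as a mouse button click; taps from two or
    more fingers are deliberately disregarded; palms alone mean the hand
    is simply resting on the surface.
    """
    fingers_tapping = [c for c in contacts if c["kind"] == "finger" and c["tapped"]]
    palms = [c for c in contacts if c["kind"] == "palm"]
    if palms and not fingers_tapping:
        return "rest"       # hands resting on the surface
    if len(fingers_tapping) == 1:
        return "click"      # one-finger tap -> mouse button click
    return "ignore"         # multi-finger taps are disregarded
```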

The touch-sensitive areas that can accept this kind of input are not confined to the front or screen of a device. The back of a hardware device can also contain touch-sensitive areas that may be tapped, pressed, or slid to generate inputs.

Different hardware inputs can also bring up specific controls. Technologies that can recognize your thumb or fingerprints can be used as inputs for accessing virtual controls. Specifically, fingerprint patterns can be used to actually identify distinct fingers. This could then be used to display different functions depending on which finger is being used. Similarly, proximity sensors can detect when a hand is near a device and display the appropriate controls.

Apple has also proposed using object recognition, facial detection, and voice modulation as input.

Haptic tactile feedback (perhaps)

Finally, haptic responses can be used to provide feedback to users when they interact with a series of virtual controls. Haptic display technologies allow a user to “feel” different surfaces as their finger moves across a touchscreen. For example, a display could include a virtual click wheel which vibrates at a different frequency at the center. Users could easily sense the difference and use the click wheel without having to look at it.
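The click-wheel example above amounts to varying vibration frequency with a touch’s distance from the wheel’s center. A toy model, with frequencies and radii invented purely for illustration:

```python
import math

CENTER_HZ, RING_HZ = 250.0, 175.0   # hypothetical vibration frequencies
CENTER_RADIUS = 0.2                  # fraction of the wheel radius treated as "center"

def haptic_frequency(x, y, wheel_radius=1.0):
    """Return the vibration frequency for a touch at (x, y), measured from
    the wheel's center. The center buzzes at a different frequency than the
    outer ring, so a finger can find the wheel without looking at it."""
    distance = math.hypot(x, y) / wheel_radius
    return CENTER_HZ if distance <= CENTER_RADIUS else RING_HZ
```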

In Summary…

Together, these proposals outline an integrated interaction model of virtual “floating” controls that are specific to the mode or application the system is in. The controls are accessed and manipulated through touch-based gestures, combinations of multi-touch inputs, and/or inputs detected through sensors. Users get haptic, audible, and visual feedback when using these input methods to interact with the system’s set of virtual controls.

Needless to say, it will be interesting to see which of these proposals (if any) make their way into Apple’s “latest creation” (tablet?) this month!

This article was syndicated from Luke Wroblewski’s writings archive, Functioning Form.

Post author: Luke Wroblewski

Luke Wroblewski (LukeW) is an internationally recognized product design leader who has designed or contributed to software used by more than 700 million people worldwide. He is currently Chief Design Architect at Yahoo! Inc., where he works on forward-looking integrated customer experiences on the Web, mobile, TV, and beyond.

Luke is the author of two popular Web design books: Web Form Design (2008) and Site-Seeing: A Visual Approach to Web Usability (2002). He also publishes Functioning Form, a leading online publication for interaction designers. Luke is consistently a top-rated speaker at conferences and companies around the world, and is a co-founder and former Board member of the Interaction Design Association (IxDA).

Previously, Luke was the Lead User Interface Designer of eBay Inc.'s platform team, where he led the strategic design of new consumer products (such as eBay Express and Kijiji) and internal tools and processes. He also founded LukeW Interface Designs, a product strategy and design consultancy, taught interface design courses at the University of Illinois, and worked as a Senior Interface Designer at the National Center for Supercomputing Applications (NCSA), the birthplace of the first popular graphical Web browser, NCSA Mosaic.

Visit Luke's website: https://www.lukew.com/ Or his writings at Functioning Form: https://www.lukew.com/ff/ And follow Luke on Twitter: https://twitter.com/lukewdesign
