
Wear Are We?

by UX Magazine Staff
5 min read

In the wake of a newly announced brain-computer interface headset, we asked three UX Magazine contributors about the trajectory of wearable technology.

Yesterday, Philips, Accenture, and Emotiv announced a proof-of-concept project involving a piece of headgear and an app that allows users to control Philips products using their brainwaves. The intent is to give patients with Lou Gehrig’s disease (ALS) and other neurological diseases greater independence—letting them make a call, send a pre-set text or email, or turn lights and television sets on and off.

Service design agency Fjord (part of Accenture Interactive) collaborated on the project, researching the functionality of heads-up displays (HUDs) along with the Emotiv Insight, a wireless electroencephalography (EEG) headset that detects brain commands, micro-facial expressions, and emotional states.

According to Fjord, “After these sessions, it became clear that using Emotiv’s machine learning algorithms, users had the potential to train and improve their focus to complete actions faster through brain commands. Helping people complete tasks with ease and speed became a guiding principle for the proof of concept research project.”
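The underlying interaction model is simple to sketch: the headset classifies a trained mental command, and the app maps that command to a device action such as switching the lights. Below is a minimal illustration in Python; the command names, confidence values, and device calls are hypothetical stand-ins, not Emotiv's or Philips' actual APIs.

```python
# Hypothetical sketch of a brain-command dispatcher. Event names and
# device actions are illustrative, not Emotiv's or Philips' real APIs.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class BrainCommand:
    name: str          # a trained mental command, e.g., "push" or "pull"
    confidence: float  # classifier confidence, 0.0 to 1.0

def lights_on() -> None:
    print("Turning lights on")

def send_preset_text() -> None:
    print("Sending pre-set text message")

# Map trained mental commands to household actions.
ACTIONS: Dict[str, Callable[[], None]] = {
    "push": lights_on,
    "pull": send_preset_text,
}

CONFIDENCE_THRESHOLD = 0.75  # ignore weak detections to avoid false triggers

def dispatch(cmd: BrainCommand) -> None:
    """Run the action bound to a command, if we trust the detection."""
    action = ACTIONS.get(cmd.name)
    if action and cmd.confidence >= CONFIDENCE_THRESHOLD:
        action()

dispatch(BrainCommand(name="push", confidence=0.9))  # prints "Turning lights on"
```

The confidence threshold is where the design tension lives: as Whitney notes below, responsiveness is the weak point of these interfaces, so any dispatcher has to trade false triggers against sluggishness.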

Earlier this month, former New Orleans Saints safety Steve Gleason, who suffers from ALS, told a team of hackers at Microsoft: “Until there is a medical treatment or cure, technology can be a cure.” Gleason was featured prominently in a Windows ad during Super Bowl XLVIII using the Tobii EyeMobile, which allows him to control a tablet using just his gaze. The addition of a workable brain-computer interface represents a big leap forward, not only for people with ALS but also for the potential of wearable technology in general.

“What works for people with limited mobility will also work for people who have full mobility but still could benefit from the right input options for a given context where they can’t type or speak,” says UX Magazine contributor Hunter Whitney. “The main concern with these technologies is that, while they continue to improve, they are often not all that great in terms of responsiveness.”

Having used this type of technology himself, Whitney notes that depending on aspects like the number of channels on the headset and signal processing capabilities, the interaction can be like trying to order something over a phone with a bad connection—something he addresses in his article, “Brain-Computer Interfaces: Interactions at the Speed of Thought.”

“EEGs are fairly easy to use for sensing brain activity, but they do have drawbacks. Imagine there’s a raucous party and you are standing in an adjoining room. If you put your ear up to the wall at different places, you may be able to discern a few clear patterns in all the noise. For example, you might know what kind of music is playing and even get a sense of some conversations. If you put a cup against the wall, you could get a clearer sense of what you are hearing. Similarly, our brains are filled with a cacophony of electrical signals, which are a fundamental part of how our brain cells communicate with each other.”
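Whitney's wall analogy maps loosely onto the first step of most EEG pipelines: filtering a frequency band of interest out of broadband noise. The sketch below uses synthetic data rather than real recordings, and shows a Butterworth band-pass filter recovering a 10 Hz "alpha" rhythm; actual EEG processing layers artifact rejection, referencing, and per-channel work on top of this.

```python
# Minimal sketch: isolating the alpha band (8-12 Hz) from a noisy synthetic
# "EEG" trace with a Butterworth band-pass filter. Synthetic data only;
# real pipelines add artifact rejection and per-channel processing.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 256.0                       # sampling rate in Hz
t = np.arange(0, 4.0, 1.0 / fs)  # four seconds of samples

# Synthetic signal: a 10 Hz "alpha" rhythm buried in broadband noise.
alpha = np.sin(2 * np.pi * 10 * t)
eeg = alpha + 2.0 * np.random.randn(t.size)

def bandpass(signal: np.ndarray, low: float, high: float, fs: float,
             order: int = 4) -> np.ndarray:
    """Zero-phase Butterworth band-pass filter."""
    nyquist = fs / 2.0
    b, a = butter(order, [low / nyquist, high / nyquist], btype="band")
    return filtfilt(b, a, signal)

filtered = bandpass(eeg, 8.0, 12.0, fs)

# The filtered trace correlates far better with the underlying rhythm.
print(f"correlation before: {np.corrcoef(eeg, alpha)[0, 1]:.2f}")
print(f"correlation after:  {np.corrcoef(filtered, alpha)[0, 1]:.2f}")
```

More headset channels and better on-device signal processing amount to more "cups against the wall," which is why those two factors figure so heavily in the responsiveness Whitney describes.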

It’s interesting that this kind of technology is finally poking into the mainstream just as wearable technology is getting its toehold. While we might most readily associate “wearable” with interactive screens strapped to our wrists that tell us when to stop eating ice cream, something sleek that sits on a user’s head and reads brainwaves points to the emergence of an even deeper connection between humans and technology—one that could easily lead to embedded technology.

Emotiv Insight

“I don’t have any hard data to back this up, but there seems to be a big difference in perception and expectations between medical and consumer wearables,” says contributor Thomas Wendt. “Within the medical space, introducing new devices to improve quality of life doesn’t seem like a stretch … Right now, it’s the headband and its connected devices; but some time soon it might be an embedded technology. The question becomes whether embedding hardware into a patient’s brain to improve function is ethically different than providing a home health care worker.”

Wendt notes that we’ve already been implanting devices for years—pointing to the cochlear implant for the hearing impaired as the most obvious example. He says that while cochlear implantation has its opponents, it is mostly accepted as a means of restoring hearing. The big question is how we determine where there is a “lack.”

“A 30-year-old who loses an arm in a car accident feels a massive lack; something is missing that was once so mundane but important. Most of the opposition [to cochlear implants] comes from the deaf community. Many deaf people feel that deafness is not an impairment, and have built a culture around its language and norms. They view cochlear implants as denying one’s deafness and deaf culture, turning toward a ‘quick fix’ to amend what was not broken in the first place.”

Ideas and arguments surrounding the cultural acceptance of technology are part of the fabric of experience design. After all, smartphones didn’t become ubiquitous until Apple made them so easy to use and understand that they appealed to everyone.

“What is the thing about our culture (society changing, not technology) that brings implantable from special case—morally ambiguous or otherwise—to available, common, and everyday?” asks contributor Steven Hoober. “I don’t know, and for many technologies I’d say ‘Who knows, maybe nothing?’ But, since implantable technology exists, I think something will, someday. Maybe it’s a really, really successful program along [the lines of what Philips is doing] that massively improves the lives of tens of thousands of people.”

Hoober says that while embedded technology is basically off the table at present, the exception comes in the form of medical devices (also referencing embedded hearing aids). “It’s not so much about the technology, but when, all of a sudden, things go from impossible (or immoral) to ubiquitous.”

As Hoober wrote in a recent article for UXmatters (“The Social History of the Smartphone”), “While digital natives are comfortable with technology, the question is: which technology, in which context? There are now more mobile phones on Earth than there are people! And most of these phones have cameras. Yet Google Glass feels invasive because of its ability to record video.”

While context is a huge differentiator, Wendt points out that, for experience designers, wearables also call out the difference between designing objects and designing the self.

“When we design a coffee maker, we think shape, context of use, etc. When we design wearables, the intimacy level is so much higher that we cannot avoid considering how these devices literally change who we are and our bodily engagement with the world … One buys a Fitbit because they desire to be seen as fitness-conscious, just as much as they seek truth in quantification. Their exercise routine or daily walks are an act of designing a better self, so the device simply becomes part of that ecosystem.”

What that ecosystem and the devices that inhabit it will look like 20, 10, or even five years from now is anyone’s guess.

Where do you see wearable tech taking us as users and experience designers in the near- and long-term? We’d love to hear from you in the comments section below.

Post author: UX Magazine Staff

UX Magazine was created to be a central, one-stop resource for everything related to user experience. Our primary goal is to provide a steady stream of current, informative, and credible information about UX and related fields to enhance the professional and creative lives of UX practitioners and those exploring the field. Our content is driven and created by an impressive roster of experienced professionals who work in all areas of UX and cover the field from diverse angles and perspectives.
