
Wear Are We?

by UX Magazine Staff
5 min read

In the wake of a new implementation of brain-computer interface headgear, we asked three UX Magazine contributors about the trajectory of wearable technology.

Yesterday, Philips, Accenture, and Emotiv announced a proof-of-concept project involving a piece of headgear and an app that allows users to control Philips products using their brainwaves. The intent is to give patients with Lou Gehrig’s disease (ALS) and other neurological diseases greater independence—letting them make a call, send a pre-set text or email, or turn lights and television sets on and off.

Service design agency Fjord (part of Accenture Interactive) collaborated on the project, researching the functionality of heads-up displays (HUDs) along with the Emotiv Insight, a wireless electroencephalography (EEG) headset that detects brain commands, micro-facial expressions, and emotional states.

According to Fjord, “After these sessions, it became clear that using Emotiv’s machine learning algorithms, users had the potential to train and improve their focus to complete actions faster through brain commands. Helping people complete tasks with ease and speed became a guiding principle for the proof of concept research project.”

 

Earlier this month, former New Orleans Saints safety Steve Gleason, who suffers from ALS, told a team of hackers at Microsoft: “Until there is a medical treatment or cure, technology can be a cure.” Gleason was featured prominently in a Windows ad during Super Bowl XLVIII using the Tobii EyeMobile, which allows him to control a tablet using just his gaze. The addition of a workable brain-computer interface represents a significant leap forward, not only for people with ALS, but also for the potential of wearable technology in general.

“What works for people with limited mobility will also work for people who have full mobility but still could benefit from the right input options for a given context where they can’t type or speak,” says UX Magazine contributor Hunter Whitney. “The main concern with these technologies is that, while they continue to improve, they are often not all that great in terms of responsiveness.”

Having used this type of technology himself, Whitney notes that depending on aspects like the number of channels on the headset and signal processing capabilities, the interaction can be like trying to order something over a phone with a bad connection—something he addresses in his article, “Brain-Computer Interfaces: Interactions at the Speed of Thought.”

“EEGs are fairly easy to use for sensing brain activity, but they do have drawbacks. Imagine there’s a raucous party and you are standing in an adjoining room. If you put your ear up to the wall at different places, you may be able to discern a few clear patterns in all the noise. For example, you might know what kind of music is playing and even get a sense of some conversations. If you put a cup against the wall, you could get a clearer sense of what you are hearing. Similarly, our brains are filled with a cacophony of electrical signals, which are a fundamental part of how our brain cells communicate with each other.”
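To make Whitney’s analogy a little more concrete, here is a minimal, purely illustrative sketch of how a brain-computer interface might pull one usable pattern out of a noisy signal. It is not Emotiv’s actual pipeline: the sample rate, frequency band, and threshold are assumptions chosen for a simulated single-channel signal. The idea is simply to band-pass filter the raw data to one rhythm (8–12 Hz alpha) and treat elevated power in that band as a crude on/off command.

```python
# Illustrative sketch only -- not Emotiv's algorithm. It shows, in miniature,
# how an EEG pipeline can pull one "clear pattern" (here, 8-12 Hz alpha power)
# out of a noisy signal and turn it into a yes/no command.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 256                        # assumed sample rate in Hz
t = np.arange(0, 2.0, 1 / fs)   # two seconds of data

# Simulated single-channel EEG: a 10 Hz rhythm buried in broadband noise.
rng = np.random.default_rng(0)
eeg = 0.5 * np.sin(2 * np.pi * 10 * t) + rng.normal(scale=1.0, size=t.size)

# Band-pass filter to the alpha band (8-12 Hz) -- the "cup against the wall".
b, a = butter(4, [8 / (fs / 2), 12 / (fs / 2)], btype="band")
alpha = filtfilt(b, a, eeg)

# Average band power over the window; thresholding it yields a toy command.
alpha_power = np.mean(alpha ** 2)
command_detected = alpha_power > 0.05   # threshold tuned to this toy signal

print(f"alpha power = {alpha_power:.3f}, command detected: {command_detected}")
```

In a real headset the same basic step happens across multiple channels, with far more sophisticated signal processing and machine learning on top, which is why channel count and processing quality matter so much for responsiveness.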

It’s interesting that this kind of technology is finally poking into the mainstream just as wearable technology is getting its toehold. While we might most readily associate “wearable” with interactive screens strapped to our wrists that tell us when to stop eating ice cream, something sleek that sits on a user’s head and reads brainwaves points to the emergence of an even deeper connection between humans and technology—one that could easily lead to embedded technology.

Emotiv Insight

“I don’t have any hard data to back this up, but there seems to be a big difference in perception and expectations between medical and consumer wearables,” says contributor Thomas Wendt. “Within the medical space, introducing new devices to improve quality of life doesn’t seem like a stretch … Right now, it’s the head band and its connected devices; but some time soon it might be an embedded technology. The question becomes whether embedding hardware into a patient’s brain to improve function is ethically different than providing a home health care worker.”

Wendt notes that we’ve already been implanting devices for years—pointing to the cochlear implant for the hearing impaired as the most obvious example. He says that while cochlear implantation has its opponents, it is mostly accepted as a means of restoring hearing. The big question is how we determine where there is a “lack.”

“A 30-year-old who loses an arm in a car accident feels a massive lack; something is missing that was once so mundane but important. Most of the opposition [to cochlear implants] comes from the deaf community. Many deaf people feel that deafness is not an impairment, and have built a culture around its language and norms. They view cochlear implants as denying one’s deafness and deaf culture, turning toward a ‘quick fix’ to amend what was not broken in the first place.”


Ideas and arguments surrounding the cultural acceptance of technology are part of the fabric of experience design. After all, smartphones didn’t become ubiquitous until Apple made them so easy to use and understand that they appealed to everyone.

“What is the thing about our culture (society changing, not technology) that brings implantable from special case—morally ambiguous or otherwise—to available, common, and everyday?” asks contributor Steven Hoober. “I don’t know, and for many technologies I’d say ‘Who knows, maybe nothing?’ But, since implantable technology exists, I think something will, someday. Maybe it’s a really, really successful program along [the lines of what Philips is doing] that massively improves the lives of tens of thousands of people.”

Hoober says that while embedded technology is basically off the table at present, the exception comes in the form of medical devices (also referencing embedded hearing aids). “It’s not so much about the technology, but when, all of a sudden, things go from impossible (or immoral) to ubiquitous.”

As Hoober wrote in a recent article for UXmatters (“The Social History of the Smartphone”), “While digital natives are comfortable with technology, the question is: which technology, in which context? There are now more mobile phones on Earth than there are people! And most of these phones have cameras. Yet Google Glass feels invasive because of its ability to record video.”

While context is a huge differentiator, Wendt points out that, for experience designers, wearables also call out the difference between designing objects and designing the self.

“When we design a coffee maker, we think shape, context of use, etc. When we design wearables, the intimacy level is so much higher that we cannot avoid considering how these devices literally change who we are and our bodily engagement with the world … One buys a Fitbit because they desire to be seen as fitness-conscious, just as much as they seek truth in quantification. Their exercise routine or daily walks are an act of designing a better self, so the device simply becomes part of that ecosystem.”

What that ecosystem and the devices that inhabit it will look like 20, 10, or even five years from now is anyone’s guess.

Where do you see wearable tech taking us as users and experience designers in the near- and long-term? We’d love to hear from you in the comments section below.

UX Magazine Staff
UX Magazine was created to be a central, one-stop resource for everything related to user experience. Our primary goal is to provide a steady stream of current, informative, and credible information about UX and related fields to enhance the professional and creative lives of UX practitioners and those exploring the field. Our content is driven and created by an impressive roster of experienced professionals who work in all areas of UX and cover the field from diverse angles and perspectives.

