
Wear Are We?

by UX Magazine Staff

In the wake of a newly announced piece of brain-computer interface headgear, we asked three UX Magazine contributors about the trajectory of wearable technology.

Yesterday, Philips, Accenture, and Emotiv announced a proof-of-concept project involving a piece of headgear and an app that allows users to control Philips products using their brainwaves. The intent is to give patients with Lou Gehrig’s disease (ALS) and other neurological diseases greater independence—letting them make a call, send a pre-set text or email, or turn lights and television sets on and off.

Service design agency Fjord (part of Accenture Interactive) collaborated on the project, researching the functionality of heads-up displays (HUDs) along with the Emotiv Insight, a wireless electroencephalography (EEG) headset that detects brain commands, micro-facial expressions, and emotional states.

According to Fjord, “After these sessions, it became clear that using Emotiv’s machine learning algorithms, users had the potential to train and improve their focus to complete actions faster through brain commands. Helping people complete tasks with ease and speed became a guiding principle for the proof of concept research project.”

 

Earlier this month, former New Orleans Saints safety Steve Gleason, who suffers from ALS, told a team of hackers at Microsoft: “Until there is a medical treatment or cure, technology can be a cure.” Gleason was featured prominently in a Windows ad during Super Bowl XLVIII using the Tobii EyeMobile, which allows him to control a tablet using just his gaze. The addition of a workable brain-computer interface represents a big leap forward, not only for those afflicted with ALS but also for the potential of wearable technology in general.

“What works for people with limited mobility will also work for people who have full mobility but still could benefit from the right input options for a given context where they can’t type or speak,” says UX Magazine contributor Hunter Whitney. “The main concern with these technologies is that, while they continue to improve, they are often not all that great in terms of responsiveness.”

Having used this type of technology himself, Whitney notes that depending on aspects like the number of channels on the headset and signal processing capabilities, the interaction can be like trying to order something over a phone with a bad connection—something he addresses in his article, “Brain-Computer Interfaces: Interactions at the Speed of Thought.”

“EEGs are fairly easy to use for sensing brain activity, but they do have drawbacks. Imagine there’s a raucous party and you are standing in an adjoining room. If you put your ear up to the wall at different places, you may be able to discern a few clear patterns in all the noise. For example, you might know what kind of music is playing and even get a sense of some conversations. If you put a cup against the wall, you could get a clearer sense of what you are hearing. Similarly, our brains are filled with a cacophony of electrical signals, which are a fundamental part of how our brain cells communicate with each other.”

It’s interesting that this kind of technology is finally poking into the mainstream just as wearable technology is getting its toehold. While we might most readily associate “wearable” with interactive screens strapped to our wrists that tell us when to stop eating ice cream, something sleek that sits on a user’s head and reads brainwaves points to the emergence of an even deeper connection between humans and technology—one that could easily lead to embedded technology.

Emotiv Insight

“I don’t have any hard data to back this up, but there seems to be a big difference in perception and expectations between medical and consumer wearables,” says contributor Thomas Wendt. “Within the medical space, introducing new devices to improve quality of life doesn’t seem like a stretch … Right now, it’s the head band and its connected devices; but some time soon it might be an embedded technology. The question becomes whether embedding hardware into a patient’s brain to improve function is ethically different than providing a home health care worker.”

Wendt notes that we’ve already been implanting devices for years—pointing to the cochlear implant for the hearing impaired as the most obvious example. He says that while cochlear implantation has its opponents, it is mostly accepted as a means of restoring hearing. The big question is how we determine where there is a “lack.”

“A 30-year-old who loses an arm in a car accident feels a massive lack; something is missing that was once so mundane but important. Most of the opposition [to cochlear implants] comes from the deaf community. Many deaf people feel that deafness is not an impairment, and have built a culture around its language and norms. They view cochlear implants as denying one’s deafness and deaf culture, turning toward a ‘quick fix’ to amend what was not broken in the first place.”

Ideas and arguments surrounding the cultural acceptance of technology are part of the fabric of experience design. After all, smartphones didn’t become ubiquitous until Apple made them so easy to use and understand that they appealed to everyone.

“What is the thing about our culture (society changing, not technology) that brings implantable from special case—morally ambiguous or otherwise—to available, common, and everyday?” asks contributor Steven Hoober. “I don’t know, and for many technologies I’d say ‘Who knows, maybe nothing?’ But, since implantable technology exists, I think something will, someday. Maybe it’s a really, really successful program along [the lines of what Philips is doing] that massively improves the lives of tens of thousands of people.”

Hoober says that while embedded technology is basically off the table at present, the exception comes in the form of medical devices (also referencing embedded hearing aids). “It’s not so much about the technology, but when, all of a sudden, things go from impossible (or immoral) to ubiquitous.”

As Hoober wrote in a recent article for UX Matters (“The Social History of the Smartphone”), “While digital natives are comfortable with technology, the question is: which technology, in which context? There are now more mobile phones on Earth than there are people! And most of these phones have cameras. Yet Google Glass feels invasive because of its ability to record video.”

While context is a huge differentiator, Wendt points out that, for experience designers, wearables also call out the difference between designing objects and designing the self.

“When we design a coffee maker, we think shape, context of use, etc. When we design wearables, the intimacy level is so much higher that we cannot avoid considering how these devices literally change who we are and our bodily engagement with the world … One buys a Fitbit because they desire to be seen as fitness-conscious, just as much as they seek truth in quantification. Their exercise routine or daily walks are an act of designing a better self, so the device simply becomes part of that ecosystem.”

What that ecosystem and the devices that inhabit it will look like 20, 10, or even five years from now is anyone’s guess.

Where do you see wearable tech taking us as users and experience designers in the near- and long-term? We’d love to hear from you in the comments section below.

UX Magazine Staff
UX Magazine was created to be a central, one-stop resource for everything related to user experience. Our primary goal is to provide a steady stream of current, informative, and credible information about UX and related fields to enhance the professional and creative lives of UX practitioners and those exploring the field. Our content is driven and created by an impressive roster of experienced professionals who work in all areas of UX and cover the field from diverse angles and perspectives.
