
Wear Are We?

by UX Magazine Staff
In the wake of a new implementation of brain-computer interface headgear, we asked three UX Magazine contributors about the trajectory of wearable technology.

Yesterday, Philips, Accenture, and Emotiv announced a proof-of-concept project involving a piece of headgear and an app that allows users to control Philips products using their brainwaves. The intent is to give patients with Lou Gehrig’s disease (ALS) and other neurological diseases greater independence—letting them make a call, send a pre-set text or email, or turn lights and television sets on and off.

Service design agency Fjord (part of Accenture Interactive) collaborated on the project, researching the functionality of heads-up displays (HUDs) along with the Emotiv Insight, a wireless electroencephalography (EEG) headset that detects brain commands, micro-facial expressions, and emotional states.

According to Fjord, “After these sessions, it became clear that using Emotiv’s machine learning algorithms, users had the potential to train and improve their focus to complete actions faster through brain commands. Helping people complete tasks with ease and speed became a guiding principle for the proof of concept research project.”

Earlier this month, former New Orleans Saints safety Steve Gleason, who suffers from ALS, told a team of hackers at Microsoft: “Until there is a medical treatment or cure, technology can be a cure.” Gleason was featured prominently in a Windows ad during Super Bowl XLVIII using the Tobii EyeMobile, which allows him to control a tablet using just his gaze. The addition of a workable brain-computer interface represents a significant leap forward, not only for those afflicted with ALS but also for the potential of wearable technology in general.

“What works for people with limited mobility will also work for people who have full mobility but still could benefit from the right input options for a given context where they can’t type or speak,” says UX Magazine contributor Hunter Whitney. “The main concern with these technologies is that, while they continue to improve, they are often not all that great in terms of responsiveness.”

Having used this type of technology himself, Whitney notes that depending on aspects like the number of channels on the headset and signal processing capabilities, the interaction can be like trying to order something over a phone with a bad connection—something he addresses in his article, “Brain-Computer Interfaces: Interactions at the Speed of Thought.”

“EEGs are fairly easy to use for sensing brain activity, but they do have drawbacks. Imagine there’s a raucous party and you are standing in an adjoining room. If you put your ear up to the wall at different places, you may be able to discern a few clear patterns in all the noise. For example, you might know what kind of music is playing and even get a sense of some conversations. If you put a cup against the wall, you could get a clearer sense of what you are hearing. Similarly, our brains are filled with a cacophony of electrical signals, which are a fundamental part of how our brain cells communicate with each other.”

It’s interesting that this kind of technology is finally poking into the mainstream just as wearable technology is getting its toehold. While we might most readily associate “wearable” with interactive screens strapped to our wrists that tell us when to stop eating ice cream, something sleek that sits on a user’s head and reads brainwaves points to the emergence of an even deeper connection between humans and technology—one that could easily lead to embedded technology.

The Emotiv Insight

“I don’t have any hard data to back this up, but there seems to be a big difference in perception and expectations between medical and consumer wearables,” says contributor Thomas Wendt. “Within the medical space, introducing new devices to improve quality of life doesn’t seem like a stretch … Right now, it’s the head band and its connected devices; but some time soon it might be an embedded technology. The question becomes whether embedding hardware into a patient’s brain to improve function is ethically different than providing a home health care worker.”

Wendt notes that we’ve already been implanting devices for years—pointing to the cochlear implant for the hearing impaired as the most obvious example. He says that while cochlear implantation has its opponents, it is mostly accepted as a means of restoring hearing. The big question is how we determine where there is a “lack.”

“A 30-year-old who loses an arm in a car accident feels a massive lack; something is missing that was once so mundane but important. Most of the opposition [to cochlear implants] comes from the deaf community. Many deaf people feel that deafness is not an impairment, and have built a culture around its language and norms. They view cochlear implants as denying one’s deafness and deaf culture, turning toward a ‘quick fix’ to amend what was not broken in the first place.”

Ideas and arguments surrounding the cultural acceptance of technology are part of the fabric of experience design. After all, smartphones didn’t become ubiquitous until Apple made them so easy to use and understand that they appealed to everyone.

“What is the thing about our culture (society changing, not technology) that brings implantable from special case—morally ambiguous or otherwise—to available, common, and everyday?” asks contributor Steven Hoober. “I don’t know, and for many technologies I’d say ‘Who knows, maybe nothing?’ But, since implantable technology exists, I think something will, someday. Maybe it’s a really, really successful program along [the lines of what Philips is doing] that massively improves the lives of tens of thousands of people.”

Hoober says that while embedded technology is basically off the table at present, the exception comes in the form of medical devices (also referencing embedded hearing aids). “It’s not so much about the technology, but when, all of a sudden, things go from impossible (or immoral) to ubiquitous.”

As Hoober wrote in a recent article for UX Matters (“The Social History of the Smartphone”), “While digital natives are comfortable with technology, the question is: which technology, in which context? There are now more mobile phones on Earth than there are people! And most of these phones have cameras. Yet Google Glass feels invasive because of its ability to record video.”

While context is a huge differentiator, Wendt points out that, for experience designers, wearables also call out the difference between designing objects and designing the self.

“When we design a coffee maker, we think shape, context of use, etc. When we design wearables, the intimacy level is so much higher that we cannot avoid considering how these devices literally change who we are and our bodily engagement with the world … One buys a Fitbit because they desire to be seen as fitness-conscious, just as much as they seek truth in quantification. Their exercise routine or daily walks are an act of designing a better self, so the device simply becomes part of that ecosystem.”

What that ecosystem and the devices that inhabit it will look like 20, 10, or even five years from now is anyone’s guess.

Where do you see wearable tech taking us as users and experience designers in the near- and long-term? We’d love to hear from you in the comments section below.

Post author: UX Magazine Staff

UX Magazine was created to be a central, one-stop resource for everything related to user experience. Our primary goal is to provide a steady stream of current, informative, and credible information about UX and related fields to enhance the professional and creative lives of UX practitioners and those exploring the field. Our content is driven and created by an impressive roster of experienced professionals who work in all areas of UX and cover the field from diverse angles and perspectives.

