
Algorithms and the (Incomplete) Stories They Tell Us

by Jason Goodhand
5 min read

The stories algorithms provide must be designed to create true value for people, or we miss the great potential of IoT and the data it provides.

“The value [of products and services] will increasingly come from being great at reading the tea leaves in the data.” – Randy Komisar, Partner at Kleiner Perkins Caufield & Byers

The unspoken pact that we have with machines is that their algorithms will be able to “read the tea leaves” in our data. Algorithms will be able to tell us a story about ourselves in a new way, or help us simplify our lives by keeping us better informed.

Take, for instance, the motion sensor in the Fitbit. It listens to our movements, and its algorithms tell us a detailed story of our exercise and sleep – both quantity and quality – and where we need to improve.

Yet in spite of the ever-increasing sophistication of algorithms and machine learning, the stories that machines tell us don’t always make us more aware of our behavior, keep us better informed about ourselves, or help us make better choices.

There are three ways algorithms under-deliver on the value they promise:

  • Although sensors are becoming ubiquitous, there are always gaps in the data, resulting in stories with thin or no data to back them up.
  • Algorithms are designed to optimize around specific metrics and sometimes they optimize to a fault, no longer providing value to people.
  • The intelligence behind an algorithm and its recommendations may not be transparent to users, making them difficult to trust.

Below are three examples illustrating the three ways in which algorithms under-deliver.

1. Gaps in the data with Apple Watch

In recent years, the consumer health tech industry has exploded with “wearables” and apps that track our sleep activity, our exercise, our heart rates and more. Algorithms look at that data and offer recommendations on how to improve our health. However, there are limitations.

For example, the Apple Watch tells its user, “You got 5 minutes of exercise, but you should be getting 30 minutes.” But how does it know I didn’t take my watch off and go for an hour-long swim? Or that it hasn’t been tracking my movement because I simply forgot to charge my watch last night? The answer is, it doesn’t.

There are always gaps in the data, and this is only compounded when there is no way for humans to manually correct the data. Because of this, we can’t accept the stories algorithms tell as the full story.
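
To make the problem concrete, here is a minimal sketch of the difference between treating a gap in the sensor data as zero activity and flagging it as unknown. This is not how Apple's actual algorithm works; the data shapes and the intensity threshold are assumptions for illustration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MinuteSample:
    """One minute of motion data; None means the sensor recorded nothing."""
    movement: Optional[float]  # e.g. accelerometer-derived intensity

def exercise_minutes_naive(day: list[MinuteSample]) -> int:
    # Treats missing samples as "no exercise" -- an hour-long swim with the
    # watch off simply disappears from the story.
    return sum(1 for m in day if (m.movement or 0.0) >= 3.0)

def exercise_minutes_honest(day: list[MinuteSample]) -> tuple[int, int]:
    # Separates "measured and below threshold" from "not measured at all",
    # so the summary can say "25 active minutes, 90 minutes unaccounted for"
    # instead of presenting a gap as zero.
    active = sum(1 for m in day if m.movement is not None and m.movement >= 3.0)
    unknown = sum(1 for m in day if m.movement is None)
    return active, unknown
```

Surfacing the unknown minutes, and letting users annotate them (“I swam for an hour”), is one way design can close the gap the sensors cannot.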

2. Optimizing to a fault with Facebook

The Facebook feed is the lens through which users see the stories of their world of connections. By clicking, “liking” and commenting on posts in their Facebook feed, users signal to the social network’s algorithm that they care about a particular story or message. That in turn helps influence what the algorithm shows in their feed later.

But the algorithm has its own selfish goal as well – to deliver as many promoted articles and ad views as it can to your feed. So what happens if a user likes everything he sees on Facebook for two days? Mat Honan, a tech writer for WIRED, did exactly that, and the result was very telling. His News Feed took on an entirely new character – devoid of any sign of friends and family. His feed became about brands and messaging, rather than real people with messages about their lives.
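
As a caricature of that dynamic (this is not Facebook's actual ranking system; the fields and weights are invented for illustration), a feed that ranks purely on predicted engagement, with a boost for revenue-earning posts, will happily crowd friends out:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    is_promoted: bool
    predicted_engagement: float  # e.g. likelihood of a like or click

def rank_feed(posts: list[Post], ad_boost: float = 2.0) -> list[Post]:
    # Optimizes a single metric: expected engagement, with promoted posts
    # boosted because they earn revenue. Nothing in the objective says
    # "show me my friends' lives."
    return sorted(
        posts,
        key=lambda p: p.predicted_engagement * (ad_boost if p.is_promoted else 1.0),
        reverse=True,
    )

feed = rank_feed([
    Post("close friend", False, 0.4),
    Post("brand page", True, 0.3),
    Post("family member", False, 0.5),
    Post("promoted article", True, 0.35),
])
print([p.author for p in feed])
# ['promoted article', 'brand page', 'family member', 'close friend']
```

Liking everything, as Honan did, simply feeds the engagement signal and pushes the ranking further toward whatever the metric rewards.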

In more extreme cases, an algorithm can optimize and cause damage at a staggering scale, such as the 2010 Flash Crash, in which algorithms triggered a trillion-dollar stock market crash in just 36 minutes.

While algorithms will continue to provide enormous benefits to users, especially as they become more sophisticated, it’s important to keep in mind that they are designed to optimize around specific metrics and sometimes they optimize to a fault, no longer providing value to people.

3. Lack of transparency with UPS

UPS reportedly spent 10 years developing the Orion algorithm to give its drivers the most efficient route to take to complete their daily deliveries. The algorithm saves a dollar or two here and there, but when scaled to UPS’ more than 55,000 daily delivery routes, the savings can be huge.

But according to a WSJ article, “Driver reaction to Orion is mixed. The experience can be frustrating for some who might not want to give up a degree of autonomy, or who might not follow Orion’s logic. For example, some drivers don’t understand why it makes sense to deliver a package in one neighborhood in the morning, and come back to the same area later in the day for another delivery. But Orion often can see a payoff, measured in small amounts of time and money that the average person might not see.”

The article continues, “One driver, who declined to speak for attribution, said he has been on Orion since mid-2014 and dislikes it, because it strikes him as illogical.”
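
A toy worked example shows how a plan that strikes a driver as illogical can still pay off once constraints the driver doesn't see are counted. The stops, travel times, and delivery window below are invented, and real route optimizers are far more sophisticated:

```python
# Travel times in minutes between the depot (D), two stops in one
# neighborhood (A1, A2), and a downtown stop (B). A2 is a hypothetical
# signature-required delivery that can only be made after 10:30 am;
# arriving earlier means waiting.
TRAVEL = {("D", "A1"): 20, ("A1", "A2"): 5, ("A1", "B"): 30, ("A2", "B"): 30,
          ("B", "A2"): 30, ("A2", "D"): 25, ("B", "D"): 35}
SERVICE = 3          # minutes spent at each stop
A2_OPENS = 90        # 10:30 am, as minutes after a 9:00 am start

def route_minutes(stops: list[str]) -> int:
    t, here = 0, "D"
    for stop in stops:
        t += TRAVEL[(here, stop)]
        if stop == "A2" and t < A2_OPENS:
            t = A2_OPENS  # wait for the delivery window to open
        t += SERVICE
        here = stop
    return t + TRAVEL[(here, "D")]  # drive back to the depot

# The "obvious" plan: finish the whole neighborhood before heading downtown.
print(route_minutes(["A1", "A2", "B"]))   # 161 -- an hour wasted waiting at A2
# The counterintuitive plan: leave the neighborhood and come back for A2 later.
print(route_minutes(["A1", "B", "A2"]))   # 118 -- more driving, far less waiting
```

The second route drives farther yet finishes 43 minutes sooner; multiplied across tens of thousands of routes, that is the payoff Orion sees and the driver doesn't. The trouble is that nothing in the route itself explains why.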

Economist Alex Tabarrok wrote in “The Rise of Opaque Intelligence” that “the problem isn’t artificial intelligence but opaque intelligence. Algorithms have now become so sophisticated that we humans can’t really understand why they are telling us what they are telling us.”

Algorithms make recommendations to us in many areas of our lives, but a lack of transparency in how those recommendations are generated can cause mistrust. For instance, knowing that a Netflix movie recommendation is related to the fact that I watched “Big Trouble in Little China” last week makes the recommendation more meaningful. It also allows me to make better choices about what I want to watch next.
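
One lightweight way to counter that opacity is to carry the evidence along with the recommendation. The sketch below is purely illustrative (it is not Netflix's system; the names and similarity scores are hypothetical): for each candidate title, it remembers which watched title produced the score, so the UI can say “because you watched…”:

```python
from dataclasses import dataclass

@dataclass
class Recommendation:
    title: str
    score: float
    because_of: str  # the watched title that most contributed to the score

def explainable_recommend(watch_history: list[str],
                          similarity: dict[tuple[str, str], float],
                          catalog: list[str]) -> list[Recommendation]:
    # Keep not just the best score for each candidate but *which* watched
    # title produced it, so the recommendation can explain itself.
    recs = []
    for candidate in catalog:
        if candidate in watch_history:
            continue
        best_source, best_score = max(
            ((seen, similarity.get((seen, candidate), 0.0)) for seen in watch_history),
            key=lambda pair: pair[1],
        )
        if best_score > 0:
            recs.append(Recommendation(candidate, best_score, because_of=best_source))
    return sorted(recs, key=lambda r: r.score, reverse=True)

recs = explainable_recommend(
    watch_history=["Big Trouble in Little China"],
    similarity={("Big Trouble in Little China", "Escape from New York"): 0.8},
    catalog=["Escape from New York", "The Notebook"],
)
for r in recs:
    print(f"{r.title}  (because you watched {r.because_of})")
```

The explanation costs almost nothing to compute, and it turns an opaque verdict into a story the user can evaluate for themselves.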

Keeping algorithms user-centered

As algorithms and the stories they tell us become more prevalent in our lives, it’s important to recognize that those stories may be based on noisy data, be over-optimized, or simply seem illogical and therefore be ignored. The stories must be designed to create true value for people, or we miss the great potential of IoT and the data it provides.

Product designers and data scientists should work to improve these algorithms together with users so that they can make meaningful contributions to people’s quality of life.

Image of formula mathematics equation courtesy of Shutterstock.
