Online privacy is a hassle. People don’t want to manage their digital trails as they go about their daily activities, however important that may be. They aren’t excited about managing their privacy online; they’re excited about seeing photos from friends, researching something interesting, and getting things done. As a result, people aren’t clear on what really happens with their data online, and they become upset when things aren’t as private as they expected. That is when problems arise for online services.
However, privacy can be turned into a positive. Managed properly, privacy can actually engender trust and a deeper bond with a brand. Importantly, this bond can easily trump factors like ‘lowest cost’ or ‘fastest shipping’, leading to improved customer retention and profitability.
We’re at a turning point for online privacy. If we do privacy right, right now, there is still a lot of opportunity for companies and their customers to benefit from it. Get it wrong, however, and we’ll lose trust from users, and lose the opportunity to build great products based upon user behaviors and data.
The Negative Side of Privacy
The concept of privacy tends to have negative connotations: preventing information theft, avoiding embarrassment, escaping personal and professional ruin. Whether it’s a personal struggle with weight or a popular movement to overthrow a government, privacy taps into our deepest fears and worries.
Correspondingly, our research has found that when people think about digital privacy, they tend to focus on the implications of a loss of privacy:
- Losing control over personal information, identity, and social graphs
- Not being able to recover from accidental exposure of sensitive information
- Tampering, misappropriation, and misuse of personal data
- Not knowing what personal information can be compromised or exposed
- Fears about how others will perceive them
- Fairness and discrimination, ranging from over-personalization to pricing based on demographics
- Embarrassment and persecution
This emotional reaction to privacy concerns helps fuel the buzz that surrounds online security issues. People are afraid of the ever-shifting sands of Facebook privacy settings, such as the recent uproar (and thousands of panicked wall posts) over automatic face recognition and tagging in friends’ photos. Incidents like the recent accidental exposure of Dropbox data can cause people to hold back and think twice about using online services. And some startups are beginning to react, leveraging privacy concerns to their own benefit. The DuckDuckGo search engine, for example, promises to always provide users with anonymous and unfiltered search results, uninfluenced by profiling based upon personal data.
Online Privacy Is Work
Many websites and mobile apps provide some level of control over online privacy. However, people using websites or apps aren’t there to spend time configuring their privacy settings; they want to communicate, have fun, find stuff, and get things done.
Maintaining privacy is an annoyance, something to “deal with,” not something fun. It’s like cleaning out the closet or balancing the checkbook: things you should do but never want to do. Managing privacy settings has no direct perceived benefit, and our research shows that most people don’t proactively go looking to change their settings.
Even when people do care about privacy, taking time to read and understand privacy policies and then make decisions about them takes effort. People don’t want to have to assess which companies they can trust, and which they can’t.
Online Privacy Isn’t Tangible
When it’s working properly, privacy is invisible. Like electricity, it goes unnoticed until it’s gone. Online privacy isn’t something that people think about every day, monitor, or spend time configuring. Rather, it surfaces only when something unexpected happens:
- Finding that search terms from one website influence the ads shown on another website, and then wondering what other information is flowing, and to where.
- Not realizing that one’s own Amazon wish list is publicly accessible.
- Finding out that one’s view of the New York Times homepage is different from that of another person.
In the physical world, privacy is tangible. It’s relatively easy to understand what is private, what isn’t, and whether or not something will stay private over time. Physical affordances let us understand the true reach of our privacy, whether it’s locking a door, handing over a handwritten note, or concealing things based on our context (e.g., not showing one’s wallet or jewelry when in an unfamiliar city).
In the digital world, however, privacy is intangible. People don’t understand what is truly private and what isn’t, nor the scope of their decisions online. They don’t understand what is permanent and what can change over time, whether it’s a party snapshot shared with friends or ongoing searches for help with depression. And people’s feelings about what is private are often influenced by unexpected factors. For instance, Google recently placed a Father’s Day blurb on their homepage and within Gmail, exhorting people to “Call your Father”. While people found the homepage blurb to be fanciful and friendly, the Gmail blurb felt intrusive because it appeared in their “personal space” online.
Expectations of Privacy
People have underlying expectations around what is private online, and this is what drives their behaviors—far more than anything they might find in privacy policies or in sifting through preference settings. Sometimes people consciously think about how they expect their privacy to be handled online, but often they simply go on intuition built upon life experiences and cultural background.
“When I log in, it knows who I am, but nobody else can see it”
“When I am on Google, I am anonymous”
“Companies don’t share or sell my data”
This hazy approach to privacy drives behavior as well. People click through privacy dialogs and warnings without internalizing the rules because they assume the website will do the right thing with their data, however they personally define “doing the right thing.” In reality, people’s expectations are often out of alignment with what is actually done with their data; they don’t understand that certain information may become publicly accessible, remixed, or sold to third parties. For instance, people who upload a photo to Flickr as “publicly accessible” are sometimes surprised to see it also featured on Flickr’s “Recent Uploads” page. And at other times, the rules change behind the scenes, without ever reaching the user’s conscious awareness.
This makes it especially difficult to ensure that people feel comfortable with how a website or app manages their private information. When a perceived breach of privacy violates people’s sense of what is right, emotions will drive their reactions, not the privacy policy.
The Positive Side of Privacy
As online privacy fears continue to grow in the collective consciousness, they keep people from taking full advantage of the digital world. People stop using features and functions, and avoid providing information such as their location, photos, or preferences. They react explosively when they feel their privacy has been breached, leading to bad publicity for websites or mobile apps, whether warranted or not.
But there is a bright side to all of this. Treated properly, privacy can become a positive rather than a negative. When people entrust a website or mobile app with their data, and the service respects that trust, the resulting bond can become stronger than purely financial or logistical considerations.
“I’ll pay $10 more on Amazon instead of the other site because I know Amazon will do the right thing if I have to return it.”
“I know that my insurance agent won’t let them mismanage my online data”
Privacy concerns can thus be transformed into a foundation for confidence and trust. By handling privacy properly, companies can open a more positive dialog with the user, one that drives engagement, builds brand equity, fosters an emotional connection, and establishes deep-seated trust in the service. And, over time, this trust opens opportunities for new and more personalized features, functions, offerings, and businesses.
Next in the series: How to design for trust, and results from our research into trust and privacy for mobile devices.