Consumers are a demanding crowd. They increasingly insist on personalized online services, but they loathe giving up their privacy to get them. In a survey of more than 4,000 U.K. consumers, Digital Catapult found that 60 percent were “uncomfortable sharing personal data.” Another 14 percent refused to share personal data altogether. The primary fear driving this reluctance? A loss of control over how and with whom their information is shared.
Some people have accepted that their increased reliance on the internet means sacrificing privacy. More than 60 percent of Americans aren’t confident that social media sites, search engines, video platforms, and online advertisers will keep their personal data and activities secure.
Fortunately, privacy and personalization can go hand in hand. By developing their user experiences around improved security, companies can deliver systems that feel customized and simple while maintaining privacy and security.
Going on the Security Offensive
Until recently, the data security industry was reactive. Companies relied on network firewalls to protect sensitive information. Every few months, a breach would occur somewhere. Microsoft and other corporations that provide security infrastructure would issue patches, and businesses would implement them in the hope that they would hold.
But high-profile data leaks, such as the Sony hack in 2014, proved that behaving reactively isn’t enough. Even relatively basic attacks can cost companies roughly $7 million and allow hackers to steal valuable data. Simply patching these holes as they appear is a losing game.
The Internet of Things is bringing more security risks to the public every day. A few years ago, most people used two or three devices to connect to the internet. Now, it’s not just their phones, laptops, and tablets that are wired. Common appliances in office spaces and homes are increasingly plugged into the web as well.
The widespread availability of the internet comes with many risks, most of which center on protecting data and preventing misuse of information. As former Google CEO Eric Schmidt said in 2010, the amount of data generated every two days is equivalent to all the knowledge humans accumulated between the beginning of civilization and 2003. The internet’s ecosystem is very different today from even 10 years ago. It’s time for the data industry to respond proactively.
Solving the Privacy vs. Personalization Conundrum
Digital literacy is on the rise, which is good news for privacy concerns. People are thinking through their privacy needs and what allowances they’re willing to make when they download popular apps. Snapchat, Facebook, and WhatsApp are fun, but users question whether it’s worth trading their privacy to participate in social media.
As the public becomes more educated about data security, companies will begin to offer solutions that balance privacy and personalized functionality to avoid losing customers. My colleagues and I at Ryerson University believe the way to achieve this is through UX design.
For one, we emphasize the concept of privacy by design, which means embedding privacy standards into design specifications from the start. The idea applies to technology, certainly, but it works in business practices and physical infrastructures as well. Rather than reacting after breaches occur, companies write privacy management into a product’s DNA.
Beyond that, we see putting control into the users’ hands as the future. For example, Joe works a 9-to-5 job. According to his contract, he must respond if his boss pings him on a social network during those hours. The boss can see granular data about Joe’s location so he can meet with him or give him assignments. However, if the boss pings Joe after 5 p.m., Joe is no longer contractually obligated to respond.
Joe can easily adjust his privacy settings to show vague, generic location data to his boss during his off hours. Instead of a specific address, the boss might be able to see that Joe is in the city but not his precise location. Joe can also enable his settings so his wife, for example, still has access to the granular data.
Because Joe can set his own privacy terms, he trusts the social media network. He gets to say, “This is how I want to share information” on a case-by-case basis. People feel far more comfortable giving out their data when they get to control who sees it and when, rather than when a corporation sets rules too complex for them to comprehend. Understanding context in privacy management is essential.
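The contextual rule Joe configures can be modeled as a small policy that maps a viewer and the current time to a location granularity. This is a minimal sketch: the viewer roles, working hours, and granularity labels are illustrative assumptions, not part of any real platform’s API.

```python
from dataclasses import dataclass, field
from datetime import time

# Hypothetical working hours; a real system would pull these from Joe's contract.
WORK_START, WORK_END = time(9, 0), time(17, 0)

@dataclass
class LocationSharingPolicy:
    """Decides how precisely to share Joe's location with a given viewer."""
    trusted_viewers: set = field(default_factory=set)  # e.g. his wife: always precise

    def granularity(self, viewer: str, now: time) -> str:
        if viewer in self.trusted_viewers:
            return "precise"  # granular data, e.g. a street address
        if viewer == "boss" and WORK_START <= now <= WORK_END:
            return "precise"  # contractually required during work hours
        return "city"         # vague, generic location otherwise

policy = LocationSharingPolicy(trusted_viewers={"spouse"})
print(policy.granularity("boss", time(14, 0)))    # during work hours: precise
print(policy.granularity("boss", time(20, 0)))    # off hours: city only
print(policy.granularity("spouse", time(20, 0)))  # always trusted: precise
```

The design choice worth noting is that the default branch returns the vaguest setting, so any viewer or time the policy doesn’t explicitly recognize gets the least data.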
UX design that puts control in users’ hands should assuage consumers’ fears about how their data is being used. The proactive emphasis on building security into the infrastructure also helps guard against data breaches, which is increasingly important as people share more of their lives online.
Achieving the Privacy Balance
Creating privacy-centric user experiences is not without its challenges. Before we see widespread breakthroughs, researchers and technologists must overcome a number of hurdles, including:
1. Siloed research: Researchers who work on data privacy might focus only on cloud security and never talk to interface designers; there is little communication between the scientists and the creatives. The future of data security relies on multidisciplinary teams that can create well-rounded UX and security designs.
For instance, Google currently allows Gmail to use the data in consumers’ emails to understand their patterns and show them relevant content and products. This system could be improved to work more like a real-life assistant.
Users could turn the machine learning service on and off as needed when they’re searching for something specific. They would feel more in control knowing they could turn off those algorithms at any time, rather than have Google constantly access their personal messages. But it takes a multifaceted team to identify and implement such solutions.
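The opt-in behavior described above could be sketched as a personalization service gated by a user-controlled switch. The class and method names here are hypothetical, not Google’s actual API; the point is simply that when the switch is off, the suggestion logic never touches the user’s messages.

```python
class PersonalizationService:
    """Hypothetical sketch of a user-controlled personalization toggle."""

    def __init__(self):
        self.enabled = False  # off by default: no access to messages

    def enable(self):
        self.enabled = True

    def disable(self):
        self.enabled = False

    def suggest(self, messages):
        """Return suggestions only while the user has opted in."""
        if not self.enabled:
            return []  # the algorithm never reads the data when off
        # Trivial stand-in for a real recommender: surface longer keywords.
        return sorted({w for m in messages for w in m.lower().split() if len(w) > 6})

svc = PersonalizationService()
print(svc.suggest(["Flight itinerary attached"]))  # [] -- user has not opted in
svc.enable()
print(svc.suggest(["Flight itinerary attached"]))  # ['attached', 'itinerary']
```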
2. Generational preferences: Millennials are more trusting than older generations when it comes to disclosing their data online. A 2014 Mintel study showed that 6 in 10 Millennials would give personal information in exchange for some kind of bonus or enhanced customer experience. They were at least twice as likely as Baby Boomers to share their cellphone numbers, social media profiles, and credit scores.
Privacy issues demand more than a one-size-fits-all approach. Designers must understand the context within which users interact with their sites. People’s ages and locations can significantly affect their willingness to share their information, so contextual privacy policies and design are necessary to meet consumers’ varying needs.
3. Complexity: Data researchers and computer scientists aren’t the only ones who need to be in on design conversations: Lawyers and developers must be in the mix as well. Privacy and security are complicated areas, and companies often don’t want to devote the time and resources to overhauling their approaches.
That complication extends to the user experience as well. People commonly complain about Facebook’s confusing privacy policies. The policies are overwhelming, so users often give up and accept the terms without fully understanding them, hoping for the best.
Simplified policies would allow designers to educate users through simple, intuitive interfaces. I work with one company that alerts customers to their privacy risks in different situations. If I connect my smartphone to a hotel’s Wi-Fi while traveling and open a banking app, the app shows a colored gauge indicating my risk for a security breach. Throughout the experience, I know which data points the app is accessing and what the parent company knows about me. This transparency is reassuring and enhances my user experience.
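A gauge like the one described could combine network trust and data sensitivity into a single color. This is an illustrative sketch only; the scoring weights and category names are assumptions, not how the company’s actual app computes risk.

```python
# Assumed risk weights: higher means less trusted or more sensitive.
NETWORK_RISK = {"home": 0, "cellular": 1, "public_wifi": 3}
DATA_RISK = {"browsing": 0, "email": 1, "banking": 3}

def risk_gauge(network: str, activity: str) -> str:
    """Combine network trust and data sensitivity into a gauge color."""
    score = NETWORK_RISK[network] + DATA_RISK[activity]
    if score >= 5:
        return "red"     # e.g. banking over hotel Wi-Fi
    if score >= 2:
        return "yellow"
    return "green"

print(risk_gauge("public_wifi", "banking"))  # red
print(risk_gauge("home", "banking"))         # yellow
print(risk_gauge("home", "browsing"))        # green
```

The value of surfacing a single color is exactly the transparency the article describes: the user sees at a glance how exposed a given action is in a given context.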
Privacy Power to the People
Improved security depends on people having greater control over what they share in personal and professional settings. Best practices are emerging around tiered data release systems that keep company information more secure by disseminating information only when necessary, and similar systems will eventually become commonplace.
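A tiered release system like the one mentioned can be sketched as records whose fields carry a tier label, where a request only receives the fields its clearance permits. The tier names and fields below are hypothetical, chosen to show the filtering idea under stated assumptions.

```python
# Hypothetical clearance tiers, lowest to highest.
TIERS = {"public": 0, "internal": 1, "restricted": 2}

# Each field is tagged with the minimum tier allowed to see it.
RECORD = {
    "display_name": ("public", "J. Doe"),
    "department":   ("internal", "Research"),
    "salary":       ("restricted", 120000),
}

def release(record: dict, clearance: str) -> dict:
    """Disseminate only the fields the requester's tier permits."""
    level = TIERS[clearance]
    return {k: v for k, (tier, v) in record.items() if TIERS[tier] <= level}

print(release(RECORD, "public"))    # only the display name
print(release(RECORD, "internal"))  # display name and department
```

Because each field carries its own tier, information is disseminated only when necessary: a requester never sees more than its clearance allows, regardless of what else is in the record.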
Privacy-focused designs empower users to control their data. This creates a greater sense of trust and improves security across the board. Corporations releasing patches in response to breaches is not a long-term solution. An educated, privacy-savvy public is the answer.
Dr. Hossein Rahnama is a recognized figure in ubiquitous and pervasive computing and the founder and chief product officer of Flybits, the context-as-a-service company. His research explores artificial intelligence, mobile human-computer interaction, and the effective design of contextual services. Hossein has 30 publications and 10 patents in ubiquitous computing, and is a council member of the National Science and Engineering Research Council of Canada (NSERC). Hossein is also a visiting scholar at the Human Dynamics group at MIT Media Lab in Cambridge, Mass. He has a doctorate in computer science from Ryerson University.