Back in the 90s, I used to design automated systems to help pilots fly civil aircraft. Today, this would probably be called designing artificial intelligence (AI) or smart systems or something similar. One of the big questions for us back then was deciding how much responsibility to give to the aircraft to fly, manage failures and generally make and execute decisions, as well as how much to leave to the pilots. One of the hot issues was whether the pilot or smart systems should have the final call when things got tough.
Controversially, in several cases, authority was given to the aircraft because the aircraft could manage specific situations better. Aircraft aren’t subject to human cognitive biases, don’t experience misleading illusions and don’t get scared, so they don’t suffer the effects that bias and fear can have on attention and decision-making. So, slowly and carefully, the industry transferred responsibility (or more accurately, ‘agency’) for the safety and well-being of passengers from the pilot to the smart systems. And look! Flying is now safer than it’s ever been, and safety is still trending upwards.
Perhaps driven by necessity, passenger sentiment and a “take no prisoners” regulatory environment, a lot of difficult automation problems have already been addressed in aviation. But the really hard decisions in automation are not about the technology (which now exists); they are decisions about people. How will they respond to, and feel about, the transfer of agency to smart digital assistants? What will and won’t be ethical, useful, valued by customers and in their interests?
It may seem like a bit of a leap, but the kinds of decisions that were being made back then in the safety-critical sector of aviation are very similar to those we need to consider today in “quality-of-life” critical sectors. The area I will discuss here is banking.
Of course, it’s pretty self-evident that how you manage your money, and the money you’ve borrowed, can make a huge difference to your quality of life. However, most people are not particularly well-equipped or inclined to do the research needed to make good financial decisions. In addition, simple day-to-day banking sometimes trips us all up: a direct debit pushes you into an unarranged overdraft; you miss that credit card payment; you have cash languishing in an account that pays no interest whilst your overdraft carries on charging you 15%; and so on. We tend to find day-to-day money management, as well as optimizing and planning our financial lives, daunting, complex and a bit of a drag.
To compound this, as the banks and the Financial Conduct Authority know, the public are as prone to bias, ignorance and accidental manipulation as any pilot (see the FCA occasional paper series kicked off with “Applying behavioural economics at the Financial Conduct Authority,” 2013), and of course we aren’t trained to cope with these problems.
The introduction of automation to assist the retail-banking customer hasn’t been as focused as that to assist the airline pilot, but there are notable exceptions. In 1963, Luther Simjian filed a patent for a device that allowed customers to deposit cash without dealing with a teller. It was designed to replace the bank teller, who would pay out money, take deposits, hand out stationery and authenticate customers by sight and signature. Tellers fronted the bank’s brand, and ducked and/or hit the panic button when looking down the wrong end of a sawn-off shotgun.
The machines didn’t catch on with customers, and Simjian later explained, “The only people using the machines were prostitutes and gamblers who didn’t want to deal with tellers face to face.”
Those original users are still using ATMs, but so is everyone else. And since those early days, our relationship with banks and technology has changed a lot. In banking, we’ve now handed the simple tasks to ATMs, and, worryingly for the banks, our relationship with the bank and its staff is now approaching indifference. I don’t claim these two things are linked, but maybe.
In personal banking, automating the easy stuff allows the bank to reduce costs and delivers convenience to the customer. But the real promise for banking is getting smart systems to do things for the customer that they can’t, or don’t easily, do for themselves: things that require judgement, such as spotting patterns in behaviour, weighing options, extracting meaning and making decisions. Really quite complex stuff.
In spite of their potential value, banks have yet to design smart systems that customers find useful when making financial choices or in day-to-day banking. Although money-management tools abound, they haven’t been widely adopted and don’t appear to have been designed with a real understanding of what bank customers need and want.
Stephen Walker, the author of Forrester’s “The State of Digital Money Management 2014” report, believes uptake has been slow because the tools don’t offer targeted, contextual services to customers making purchasing decisions. In other words, they aren’t personalized, relevant, or available when needed. But these are exactly the characteristics that smart technologies such as IBM Watson, and no doubt the contents of Facebook Messenger’s bot store, will offer. Until now it’s been the technology that’s held back the development of smart financial tools, but that’s changing fast. Now UX designers need to do some serious thinking to establish the practices needed to design smart technologies that ordinary people can actually use.
There are some genuinely new ways we need to think about the design of these systems, and the one I want to focus on is termed “agency.” When you think about agency (or, more simply, the “responsibility” to do things), it can be quite surprising how much you have already handed over to your bank’s automated systems. Bills get paid because you’ve delegated to your bank the responsibility to pay standing orders on the right day for a specified amount. You’ve given rather more agency away with a Direct Debit Mandate, where you’ve instructed your bank to pay, around a certain date, a variable amount of money that you haven’t defined. This happens in the background. You don’t think about it, and you probably don’t even check your statements to make sure it’s happened. A tad complacent, but it’s normal and very human.
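The difference in delegated agency is easy to see if you sketch the two instructions as data. The structures below are purely illustrative (not any bank’s actual data model), but they show how little you actually specify in a direct debit compared with a standing order:

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative only: how much you specify -- and therefore how much agency
# you keep -- differs sharply between the two instructions.

@dataclass
class StandingOrder:
    payee: str
    amount: float        # you fix the amount
    day_of_month: int    # you fix the date
    # The bank simply executes your exact instruction.

@dataclass
class DirectDebitMandate:
    payee: str
    # No amount and no exact date: the payee decides both (within the
    # scheme's rules) and the bank pays on your behalf.
    max_expected_amount: Optional[float] = None  # you rarely even set a cap

rent = StandingOrder(payee="Landlord Ltd", amount=950.0, day_of_month=1)
energy = DirectDebitMandate(payee="PowerCo")  # amount varies month to month
```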
You take it for granted that your bank will lend the money it holds for you to whomever it wants. You just hope it’s not going to the wrong places, invested in things you wouldn’t dream of supporting.
So, you’ve already handed some level of control to your bank’s systems. That control is transactional: it’s not based on decision-making, but it is based on trust (let’s not go there…). You still make most of the decisions, such as what you pay for.
However, while smart systems are good at making accurate predictions and good decisions in bounded, closed systems where all the variables are known, it’s much harder for them to do this in chaotic, open systems. Still, they are getting much better at reasoning on the basis of incomplete and uncertain information, something that until recently has been a uniquely human quality. Just look at Google’s object and people recognition capabilities as evidence of this.
In spite of what your partner thinks, your financial behaviour isn’t chaotic: it’s defined by underlying patterns. Those patterns carry predictive power, and that power could be harnessed to help you a lot. Yet I doubt your bank has tracked your behaviour, interpreted it and proactively told you anything useful to help you make better decisions, other than blocking your card when you go abroad.
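To make that concrete, here is a toy sketch (not any bank’s actual analytics) of the kind of pattern-spotting involved: find roughly monthly payments in a transaction history and warn when the current balance won’t cover the next one. The figures and thresholds are made up for illustration:

```python
from datetime import date, timedelta
from collections import defaultdict

# Toy example: spot roughly-monthly payments to the same payee and flag any
# that the current balance won't cover when they next fall due.
transactions = [
    ("2016-01-02", "GYM-CO", -35.00),
    ("2016-02-02", "GYM-CO", -35.00),
    ("2016-03-01", "GYM-CO", -35.00),
    ("2016-03-05", "COFFEE", -2.60),
]
balance = 20.00

by_payee = defaultdict(list)
for day, payee, amount in transactions:
    by_payee[payee].append((date.fromisoformat(day), amount))

for payee, items in by_payee.items():
    if len(items) < 3:
        continue  # not enough history to call it a pattern
    items.sort()
    gaps = [(b[0] - a[0]).days for a, b in zip(items, items[1:])]
    avg_gap = sum(gaps) / len(gaps)
    if 25 <= avg_gap <= 35:  # looks roughly monthly
        next_due = items[-1][0] + timedelta(days=round(avg_gap))
        typical = sum(amount for _, amount in items) / len(items)
        if balance + typical < 0:
            print(f"Heads up: {payee} (~£{-typical:.2f}) expected around "
                  f"{next_due}, which your current balance won't cover.")
```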
In financial services, really great service is only available to the wealthy, and it’s delivered by specialists who actively manage their clients’ wealth. However, if the recent history of personal and connected technologies tells us anything, it is that technology has huge potential to empower people and promote information equality (though only for those with access to that technology).
Let’s imagine what it might be like if the banks prioritized the delivery of an AI-based service for ordinary customers, one that gave them access to the quality of guidance and hands-on assistance that only high-net-worth individuals get today. In fact, we are starting to see independent companies like Personetics picking up the torch, mining customer behavioural data on behalf of banks so that the banks can deliver smarter services based on customers’ actual behaviour. FinTech entrants such as Digit are offering plug-ins that use smarts to make small, helpful interventions, and, as Lego has said, “Everything big starts small.”
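Digit’s actual method is proprietary, but a deliberately naive version of that kind of small intervention is easy to sketch: estimate what’s safely spare once upcoming bills and normal spending are accounted for, and sweep a small slice of it into savings. The buffer and fraction below are invented for illustration:

```python
# A deliberately naive "safe to save" rule -- one way a small automated
# intervention could work, not how Digit actually does it.
def safe_to_save(balance, upcoming_bills, typical_weekly_spend,
                 buffer=50.0, fraction=0.25):
    """Return a small amount that can be swept into savings without
    risking the bills or the usual week of spending."""
    spare = balance - sum(upcoming_bills) - typical_weekly_spend - buffer
    return round(max(0.0, spare) * fraction, 2)

# e.g. £600 in the account, £180 of bills due, roughly £120 of weekly spend
print(safe_to_save(600.0, [150.0, 30.0], 120.0))  # -> 62.5
```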
These are the first breaths of the winds of profound change in retail banking. It’s understood that the banks have cumbersome technology legacies that prevent them from moving quickly, but if they don’t give customers smart systems to help them behave smarter with their money, others will get in between the banks and us, and do it.
Imagine you could instruct a smart, vigilant, proactive assistant to make the best of whatever resources you have; to look for opportunities to do things that you may not see, or even think about, or want to think about; to keep checking that it’s optimizing towards your goals and watching for things that may be going wrong; to suggest new behaviours and better ways to do things; to make some decisions for you without getting in your face or nagging you the whole time; in fact, something that behaved like a personal adviser. I suspect that if you trusted such a thing, you would welcome it and use it, and so would a lot of people.
We are on the cusp of a legislative change in Europe in 2016 called PSD2 that will make this possible, and in Britain the Treasury is pushing forward the Open Banking Standard, which aims to deliver the promise of PSD2 early (and a few other things, too). These initiatives will require banks to allow customers to give permission to third parties that can authenticate them (like Google, Apple, PayPal, Facebook and many others) to bypass the banks’ own channels and, through APIs, pull data directly from customers’ accounts and execute transactions (like payments). These third parties will be able to build experiences that aggregate information from all of a customer’s accounts across all their providers, and track and interpret their behaviour. With that wealth of information and intelligence, what do you think they will do?
It’s pretty clear that the prize for these players will be to create smart customer-experience layers that deliver wholly new levels of insight: layers that empower you and enable you to become wealthier, more secure, more stable, more in control, more aware, more whatever it is you need. The goal is to give you the kind of advantage you could only get if you had a few million squirrelled away in a private bank, with a personal banker thinking every day about how to help you get where you want to go, even if, in your case, that’s just to the next pay day.
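To make the mechanics of that aggregation concrete, here is a minimal sketch of a third party pulling balances from every provider a customer has authorised. The endpoints, tokens and response shape are hypothetical, loosely in the spirit of the Open Banking proposals rather than taken from any published specification:

```python
import requests  # third-party HTTP library, used here purely for illustration

# Hypothetical provider endpoints and access tokens -- not a real API.
PROVIDERS = {
    "bank_a": ("https://api.bank-a.example/accounts", "token-a"),
    "bank_b": ("https://api.bank-b.example/accounts", "token-b"),
}

def aggregate_balances(providers):
    """Pull account balances from each provider the customer has authorised
    and return a single consolidated view."""
    consolidated = []
    for name, (url, token) in providers.items():
        resp = requests.get(url, headers={"Authorization": f"Bearer {token}"},
                            timeout=10)
        resp.raise_for_status()
        for account in resp.json()["accounts"]:  # assumed response shape
            consolidated.append({
                "provider": name,
                "account_id": account["id"],
                "balance": account["balance"],
            })
    return consolidated

# One consolidated list like this is the raw material for tracking and
# interpreting behaviour across every account the customer holds.
```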
Starting now, we’ll see these players work out how such systems should behave, look, feel and communicate. Defining the nature of the dialogue between them and us (don’t think of dashboards!), and how that discourse will be managed, is a topic that is starting to engage the minds of experience-design specialists across all sectors, not just financial services.
We have a lot to learn from those who’ve already thought these thoughts in other sectors such as aviation, where the focus on regulated safety, not profit, has allowed a great deal more to be achieved in rather less time.
It’s time to get started on the exciting, complex and certainly intricate task of designing the next step in experience design: the smart UI.