CX vs. Privacy: Big Brother is watching You(tube)

Privacy is becoming a major challenge for customer experience designers. We walk a fine line between personalizing services and inadvertently misusing personal data. This complex problem is easy to understand, in theory. Homes and pockets full of connected devices provide data that enable better product and service design because we have greater insights into the customer journey, both inside the home and outside it. That data-supported insight, combined with ethnography and traditional user research, helps us design ambient, frictionless customer experiences.

Photo credit: Sebastian Scholz

Except, let’s face it, user privacy is an ethical minefield, and in the world of frictionless experiences, privacy issues are fast becoming a major source of user friction. For example, I recently watched a video on a YouTube channel featuring hilariously bad products bought on Amazon; those products are now popping up for sale in my Instagram feed. Do I think that’s a coincidence? No. Can I prove it? No. And how do I rate the CX of the multiple platforms and devices involved in that? Well, I’m writing a blog post about it, and it’s going to argue there is an urgent need for human-centric journeys that make it feel less like Big Brother is watching YouTube…

The privacy problem for CX designers

No matter how concerned people are about privacy, most experience designers could use more data in their research work. A healthy, granular set of user behavior metrics never did any harm to a good customer journey map. However, let’s be honest: when we see a privacy agreement on a device, piece of software or service, most of us click ‘agree’ without reading it in detail. We assume the smart lightbulbs won’t try to track our GPS signal, or sell that data to a mobile ad company. Right? And that’s a big problem, because some products do exactly that. In fact, some products do a lot more. So, as an industry, the CX community has to consider carefully the implications of privacy agreements and their effects – both positive and negative – on the customer experience, and that raises some very complex issues.

The ‘You don’t know what you don’t know’ issue

Back in 2016, when Samsung advised customers not to discuss ‘sensitive topics’ near its smart TVs because the devices were ‘always on’ and might record private conversations, what happened? Samsung changed its processes, fast, to avoid crippling legal action. However, this was nothing new. With other brands of smart TV, a number of court cases have shown that customer data (viewing habits, browser, social media and streaming app usage metrics) was being collected and used in breach of the customer’s privacy agreement with the TV provider.

The providers have usually explained this by saying they didn’t even know they were collecting that data; it was automatic. No, really. If it weren’t for the test cases in which customers proved unusual data was flowing from their TVs out onto the web, nobody would have been any the wiser. What that shows is less about individual cases and more about the fact that, as a consumer, you have no easy way of checking whether the privacy agreement you signed is being honoured.

This challenge makes the privacy agreement unlike any other contract in the history of contracts. Think about employment, rent, product warranties, life insurance, holiday terms and conditions and so on. Those kinds of contracts are easy to understand, and it’s easy to check that both sides are honouring them. Monitoring whether your TV streaming usage metrics off your device breaches a 1,000-clause agreement you clicked through without reading? Not so easy. There’s a serious lack of human-centric design in privacy agreements, which means there is a growing lack of consumer trust in tech brands. The lack of transparency in most privacy agreements is bad CX just waiting to happen: sooner or later, someone discovers by chance that their data is being used in a way that comes as a nasty surprise (because it was buried in hundreds of pages of legalese).

The ‘Why does it need to do that and how does it do it?’ issue

In 2017 Amazon’s Alexa was famously dragged into a murder investigation, with Amazon being ordered by an Arkansas court to hand over recordings related to an Echo device in the accused’s home. In that case, the accused also had a smart water meter, which was analysed to see if an unusual amount of water had been used (the victim drowned in a hot tub). Neither device held any material relevant to the case, and the case was dismissed as a tragic drunken accident, not murder. However, people naturally became concerned that Alexa was spying on them (not just would-be murderers, but ordinary people too).

Now, obviously, a voice assistant has to listen for its wake word, or it couldn’t work. Additionally, Amazon’s Alexa doesn’t record what users say and keep it, although you don’t know what you don’t know. It does convert your words into text, run them through a machine learning system, parse them into a JSON payload that is passed to all kinds of APIs and third-party scripts, then process the response back into speech and send it to your device. Tracking your private data through that kind of complex process is hard.
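
To make that chain of hand-offs concrete, here’s a rough sketch of the kind of journey a spoken request takes. All names, payloads and steps are hypothetical – this is not Amazon’s actual architecture – but it shows how many systems end up holding a structured copy of what you said:

```typescript
// Hypothetical voice-assistant pipeline — a sketch, not Amazon's real API.
// The point: your words become structured JSON that several systems handle.

interface Transcript {
  userId: string;    // who spoke
  text: string;      // what the speech-to-text model thinks was said
  timestamp: string;
}

interface Intent {
  name: string;                   // e.g. "SetThermostat"
  slots: Record<string, string>;  // e.g. { temperature: "21" }
}

// Step 1: audio leaves the device; a cloud ML service returns a transcript.
function speechToText(audio: ArrayBuffer, userId: string): Transcript {
  // Stubbed result standing in for a real speech-recognition call.
  return {
    userId,
    text: "set the heating to 21 degrees",
    timestamp: new Date().toISOString(),
  };
}

// Step 2: the transcript is parsed into a machine-readable intent.
function parseIntent(t: Transcript): Intent {
  return { name: "SetThermostat", slots: { temperature: "21" } };
}

// Step 3: the intent is serialised as JSON and handed to a third-party
// "skill" — the hop where your data leaves the platform owner's control.
function callThirdPartySkill(intent: Intent): string {
  const payload = JSON.stringify(intent);
  console.log("POST to skill endpoint:", payload); // stand-in for a real HTTPS call
  return `OK, setting the temperature to ${intent.slots.temperature} degrees.`;
}

// End to end: audio in, speech back out, with JSON copies made in between.
const transcript = speechToText(new ArrayBuffer(0), "user-123");
const reply = callThirdPartySkill(parseIntent(transcript));
console.log(reply); // this string would be converted back into speech
```

Every one of those hops is a place where a transcript of your voice can be logged, cached or shared – which is exactly why ‘tracking your private data’ through the pipeline is so hard.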

Harder still, though, is justifying why thermostats or smart bulbs need to share data about your usage. For a CX designer, that kind of data could be very useful. Imagine a home that turned up the heating automatically on cold days, or dimmed the lights when you watched a movie, without having to be asked. It’s an end-to-end use case that makes a compelling argument for ambient CX. However, if that same data is also sold to an ad-retargeting company so they can slather adverts for winter jumpers or smart bulbs all over your phone without you actually wanting them to? Same data, but very different CX outcomes.
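
To see how benign that data use can look from the design side, here’s a toy ambient-home rule engine (a sketch with entirely hypothetical names). The rules themselves are harmless; the privacy question is purely about where the sensor readings are allowed to travel afterwards:

```typescript
// A toy ambient-home rule engine. The same sensor data that powers this
// could equally be sold on for ad retargeting — the CX outcome depends on
// where the data goes, not on the rule itself. All names are hypothetical.

interface HomeState {
  outdoorTempC: number;      // from a weather sensor or forecast service
  tvIsPlayingMovie: boolean; // from the smart TV's playback state
}

interface Actions {
  setHeatingTargetC(target: number): void;
  dimLights(level: number): void; // 0 = off, 1 = full brightness
}

function applyAmbientRules(state: HomeState, act: Actions): void {
  // Turn up the heating automatically on cold days.
  if (state.outdoorTempC < 5) {
    act.setHeatingTargetC(21);
  }
  // Dim the lights when a movie is playing, without being asked.
  if (state.tvIsPlayingMovie) {
    act.dimLights(0.2);
  }
}

// Example run with stubbed actions.
applyAmbientRules(
  { outdoorTempC: 2, tvIsPlayingMovie: true },
  {
    setHeatingTargetC: (t) => console.log(`Heating set to ${t}°C`),
    dimLights: (l) => console.log(`Lights dimmed to ${l * 100}%`),
  }
);
```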

The ‘Privacy is a community issue’ problem

In January this year, one of the new generation of video doorbells recorded a man licking a family’s front door for three hours. That’s disturbing, clearly. However, for all the other people who might have legitimate business in that doorway, what’s also disturbing is that they are now on film without their knowledge or consent. That’s a whole new privacy problem waiting to happen, because your visitor’s face could end up on a facial recognition database somewhere, being used for everything from anti-terrorist covert ops through to firms that analyse facial expressions to target behavioural advertising at people on social media video calls. So that’s the CIA and Cambridge Analytica, maybe? Oh, and… er… you installed and signed a privacy agreement for the machine that’s invading their privacy? Ouch.

If you think that’s not really a CX issue, ask yourself how customers might feel when they learn that, technically, capturing images outside their own property and storing them electronically without consent could constitute a breach of both the EU GDPR and the UK Data Protection Act, depending on what they do with that data. For example, if they capture video via a service whose privacy agreement lets it use that footage to tune its face recognition algorithms, the unwitting customer could be handing over someone else’s data without that person’s consent. Maybe. It’s the greyest of grey areas, but if the camera can see the street or a property next door, there’s potential for a privacy conflict with people who appear regularly in front of that camera. And that’s just for starters. The police, or the council, could become involved (and have done in many UK cases) if the unauthorised video could be considered harassment or voyeurism. Plus, if the authorities discover you have shared those funny vids of the postman falling over in the ice on YouTube or whatever… yeah.

There’s arguably a lack of human-centric planning, design research and customer journey mapping in a product that can do that to a customer who thought they were innocently buying a smart home gizmo, not a potential civil lawsuit or criminal prosecution. More importantly, there’s the community to consider in that scenario. This is a new dimension for CX designers, but a hugely important one as users take devices out into the world around them and, in doing so, capture other people’s data. The community now represents a form of super-user, and as such, we need to research the collective use of technology in public spaces.

Resolving privacy issues and CX

A lot of the problems CX designers face when navigating the complexity of privacy issues stem from the origins of digital privacy concerns – the ‘digital exhaust’, as Google called it back in the day. The term refers to a time when online services first realized they were collecting all kinds of information about users that they weren’t expecting to collect, like their location, age and demographics. There was a concerted effort to take all the digital exhaust data that was costing money to host on servers, and monetize it.

Fast forward twenty years, and we have a world where your health insurance company can reward you if you complete 10,000 steps per day on your Fitbit (which has a great privacy policy, btw) – that’s a great example of opt-in, ethical data usage – but also a world where your smart gadget controller app is recording your GPS location, and your doorbell is recording your private conversations, just in case that data could be useful somewhere down the line. That’s less great.

Clearly, the answer is simple – as all CX professionals know – human-centric design is the key to fixing privacy issues. This approach sits at the heart of projects like Solid, Tim Berners-Lee’s latest endeavour with MIT, which aims to create a privacy model where the user grants permission to every service, controlling precisely what it can – and can’t – access, with no privacy agreements in sight. It’s a model that’s been discussed in various forms for years, sometimes called a ‘data escrow’ or ‘data vault’ model. It revolves around the idea that rather than harvesting your data and selling it to advertising companies – often justified as a fair means of financing a free service like a social network – tech providers should reward people for letting them access their private data.
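
As a rough illustration of the idea – a simplified sketch in the spirit of those proposals, not Solid’s actual API – the model inverts the default: nothing flows unless the user has granted a specific service access to a specific category of data, and that grant can be inspected or revoked at any time.

```typescript
// Sketch of a user-controlled permission model in the spirit of Solid /
// "data vault" proposals. Simplified and hypothetical — not Solid's real API.

type DataCategory = "location" | "viewingHistory" | "stepCount" | "contacts";

interface Grant {
  serviceId: string;       // e.g. "insurance-rewards-app"
  category: DataCategory;  // what this service may read
  expires?: Date;          // grants can be time-limited
}

class PersonalDataVault {
  private grants: Grant[] = [];

  grant(serviceId: string, category: DataCategory, expires?: Date): void {
    this.grants.push({ serviceId, category, expires });
  }

  revoke(serviceId: string, category: DataCategory): void {
    this.grants = this.grants.filter(
      (g) => !(g.serviceId === serviceId && g.category === category)
    );
  }

  // The default answer is "no": a service only reads what it was granted.
  canRead(serviceId: string, category: DataCategory): boolean {
    const now = new Date();
    return this.grants.some(
      (g) =>
        g.serviceId === serviceId &&
        g.category === category &&
        (!g.expires || g.expires > now)
    );
  }
}

// Example: the step-count grant exists, the location grant never did.
const vault = new PersonalDataVault();
vault.grant("insurance-rewards-app", "stepCount");
console.log(vault.canRead("insurance-rewards-app", "stepCount")); // true
console.log(vault.canRead("insurance-rewards-app", "location"));  // false
```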

Privacy is an area where CX has a huge role to play, not just through human-centric design and the ethical use of data: as an industry driven by design research and user testing, CX professionals are also experts in handling data, GDPR and all things confidential. Privacy used to be a legal issue only, but now it’s a design issue as well. As tech gets smarter, more wearable, and increasingly invisible within the home and workplace, the advice to the tech world from CX designers is simple: put the customer at the centre of everything you do. That can only mean good news for their privacy… and your bottom line.

Mark Brady
Snr Research Director, Sutherland Labs