Data transparency and trust … understanding your customers 

March 2023 - Shoosmiths LLP

Cisco has recently released its 2023 Privacy Benchmark Study, which explores the impact of data privacy on organisations around the world and why data protection compliance remains mission critical for businesses.

The study finds that 95% of companies understand privacy to be a ‘business imperative’, but Harvey Jang, Cisco’s privacy chief, highlights that ‘when it comes to earning and building trust, compliance is not enough’. 

Privacy spend: a good deal?

In most cases, yes. The study reports that, despite uncertain economic conditions, the amount businesses spent on privacy compliance did not decrease in 2022: the average spend of $2.7 million is more than double the 2019 figure.

Why are they still spending? Over 70% of businesses report significant or very significant benefits from their investment in data privacy compliance. The list is long and includes a boost in customer trust, increased business attractiveness, greater innovation, fewer sales delays, better operational efficiency and (of course) less damage and loss from data breaches. Overall, organisations report a 1.8 times economic return on privacy investment.

Transparency – the disconnect between businesses and consumers

Let’s take another look at that point about customer trust. 

It is one of the key benefits of investment in privacy protection: previous surveys revealed that 76% of customers would not buy from a business they did not trust to handle their data, and 81% said that the way a business handles data ‘is indicative of how it views and respects its customers’. 

However, Cisco highlights that there continues to be a disconnect between what businesses think is important and what consumers think. Most businesses maintain that compliance is the most important way to gain trust. Customers, frankly, take this as a given: for them, transparency about how their data is used is the highest priority.

The biggest topic of concern: artificial intelligence (AI)

The issue of trust becomes more challenging as technology develops, particularly around the use of AI. The 2023 study reveals that 60% of consumers are concerned about how businesses use AI, with a shocking 65% stating they have ‘already lost trust’ in the approach of businesses that use it. In contrast, 96% of businesses feel they already have the necessary processes and standards in place to protect consumers, leaving a potentially huge gap in understanding between businesses and consumers.

Taking on the AI trust problem

This lack of trust means that customers currently, and overwhelmingly, want to be able to opt out of AI systems. But is that realistic? It is certainly a red flag for the four out of five businesses who think it is not. Businesses understand that they have to tackle the problem through increased transparency which, if the data protection survey results play out for AI, should indeed solve the trust problem (even if customers are not yet convinced).

But, as with the ongoing regulatory wrangles over transparency in AdTech, there’s a central difficulty: how exactly do you explain the complex mathematics of algorithmic decision-making to the people affected by it? The first stage is establishing a level playing field for measuring and describing these processes: something quietly happening around the world in universities and the public sector, such as the recent UK government Algorithmic Transparency Recording Standard. Yawns all round, but it’s these quiet pieces of work which will help build a usable AI ecosystem for all.

AI Regulation

One thing the Cisco survey didn’t ask: will having highly regulated business use of AI increase trust? 

Possibly not, but it’s coming anyway. From around 2025, the EU Artificial Intelligence Act will put a legal framework around certain AI systems used in Europe, banning those that pose an ‘unacceptable risk’. There will also be new liability rules allowing claims for damages where an AI system has caused loss. The UK is some way behind, but a white paper is expected from the Office for Artificial Intelligence very soon.

The US situation is more difficult to read. While even Joe Biden knows that algorithmic transparency is top of the digital agenda – he said so in his recent State of the Union address – the chequered history of data protection regulation and sometimes jaw-dropping disregard for basic privacy rights in some US states means that even if the US gets an effective federal law, trust may take a long time to repair. And if the survey’s conclusions on data protection hold true for AI, it may be a thankless (though necessary) task for businesses – as customers will come to take it as a given that legal compliance is baked into systems. 

The takeaways

The 2023 study conclusions in a nutshell: investment in compliance is a no-brainer – even if customers don’t see it that way. Key new investments must be in transparency and trust, both for data protection and AI. Quite how we get there may be a harder nut to crack.
