
The Joy of GDPR

Happy GDPR week everyone!

This Friday, the General Data Protection Regulation comes into force – the most ambitious piece of data privacy legislation since, well, ever. I’m not going to go through the specifics – there are plenty of vastly overpriced seminars for that – but it’s basically about giving EU citizens more rights and control over their personal information: what’s being collected, how and why it’s being used, and the chance to opt out if they want. Some of this was already in law – but now there are monster fines if companies don’t comply.

You will probably know it from those endless emails. Hundreds of them, from places and people you’d completely forgotten, all cajoling, begging or pleading with you to ‘stay in touch’ or ‘never let go’. They’re being sent because GDPR is about ensuring companies can’t just sweet-talk you into ticking one vaguely worded checkbox that puts you on their spam list forever. I can’t be the only one taking great pleasure in ignoring them all. So far, this has been the most enjoyable piece of EU legislation I can remember.

Of course, it’s much less enjoyable if you’re a business. Hard-won data sets are being vaporised: Wetherspoons simply decided to delete all of its email data. There will be more paperwork and documentation. And all organisations processing personal information – which is practically everyone these days, given how loose the definition of personal information is – have to designate a ‘data protection officer’ who is responsible for making sure the correct procedures are followed.

Yes, this is a nuisance, especially if you’re a small company. Yes, there will be clever workarounds. Yes, the lawyers and bureaucrats will be rubbing their hands with glee. However, on this occasion, that’s a price worth paying.

The future economy, as everyone keeps saying, runs on data. We’ve already seen what can happen when the rules aren’t clear – Cambridge Analytica is one example of many. And we’re only just getting started. The volume and intrusiveness of data collection is exploding. Soon enough, almost everything you own will be spewing out data about you. By 2020 there will be around 50 billion internet-enabled devices – four times as many as there are now – each one hoovering up data: cars, fridges, clothes, road signs and books. Your precious daughter playing with her doll: data point! Your partner adding some sugar to her tea: data point! Your running schedule: data point! Twinned with smart algorithms, which keep getting better, the potential for misuse becomes enormous. Think about it: within a decade your fridge will know what time you eat, your car will know where you’ve been, and your home assistant will work out your approximate anger levels from the tone of your voice. I guarantee this data will be gobbled up by analysts to target you. Are we happy with this? It cannot go on unregulated.

Take one overlooked aspect of the GDPR: algorithms. The regulation restricts significant decisions from being made solely on the basis of automated processing, without human involvement. (“The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.”) In other words, companies shouldn’t just throw you into a bucket and let machines make important decisions about you – whether to offer you a loan, say, or how you scored on an employment aptitude test. (Unless, of course, you consent.) And if a company does any of that, you can ask it to explain how the decision was made, and challenge it.

These are not theoretical problems of the techno-future. Cathy O’Neil, in her recent book Weapons of Math Destruction, documents dozens of instances where important decisions – relating to hiring policy, teacher evaluations, police officer distribution and more – are effectively outsourced to the cold and unquestionable efficiency of proprietary data and algorithms, even though those decisions have important moral dimensions and consequences. They might look and sound very objective, but algorithms always start with a question framed by whoever is in charge, and as a result they tend to reproduce the biases of their creators. For example, some police forces rely on data models to decide where to put officers. But recorded crime tends to be concentrated in poor neighbourhoods, so the model sends more cops to those areas; more cops means more people in those neighbourhoods getting arrested, which feeds back into the model – a self-perpetuating loop of growing inequality and algorithm-driven injustice.
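To make that loop concrete, here is a deliberately simple toy simulation – all the numbers and the arrest model are invented for illustration, and no real force’s system works exactly like this. Two neighbourhoods have identical true crime rates, but one starts with slightly more recorded crime; because officers are sent wherever recorded crime is highest, and arrests feed back into the record, the initial skew never corrects itself and the cumulative arrest gap widens every single year:

```python
# Toy sketch of the predictive-policing feedback loop (hypothetical numbers):
# officers are allocated by recorded crime, arrests scale with police
# presence, and those arrests are written back into the training data.

TOTAL_OFFICERS = 100
ARRESTS_PER_OFFICER = 0.2  # assumed detection rate, identical in both areas

# Two neighbourhoods with the SAME true crime rate; 'A' merely starts
# with more *recorded* crime (e.g. a history of heavier policing).
recorded = {"A": 120.0, "B": 100.0}
cumulative_arrests = {"A": 0.0, "B": 0.0}

for year in range(1, 11):
    total = sum(recorded.values())
    # The model sends officers wherever recorded crime is highest.
    officers = {h: TOTAL_OFFICERS * recorded[h] / total for h in recorded}
    for h in recorded:
        # More officers mean more arrests, regardless of true crime...
        arrests = officers[h] * ARRESTS_PER_OFFICER
        cumulative_arrests[h] += arrests
        # ...and those arrests feed straight back into the record.
        recorded[h] += arrests
    gap = cumulative_arrests["A"] - cumulative_arrests["B"]
    print(f"year {year:2d}: officers in A = {officers['A']:.1f}, "
          f"cumulative arrest gap A-B = {gap:.1f}")
```

Run it and neighbourhood A keeps its inflated share of officers forever – the model never discovers that the two areas are identical, because it only ever looks where it already expects to find crime.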

As more decisions are taken by machines in the future – which they certainly will be – there needs to be a way of making sure they are fair and publicly accountable. I can’t wait to see technologists trying to explain to lay people how and why their algorithms work, in plain and clear language. I think they might find it a helpful experience – and if you’re a business that shows it takes this seriously, you might gain a competitive edge too.

The GDPR might be a slightly blunt instrument, but in the end it should be democratic governments, not unaccountable international tech firms, who run the great data show. In the struggle over who controls the internet, governments and regulators have been getting utterly trounced. Finally, they’ve started to kick back. Good.

Originally published by Jamie Bartlett

