Professor Eerke Boiten of the Cyber Technology Institute, De Montfort University, was recently asked by BBC3 to comment on the security of our Facebook data after the Cambridge Analytica scandal.
His comments were quoted in the article:
‘I downloaded all my Facebook data and it was a nightmare’
Ever wondered what your data actually looks like? by Radhika Sanghani
Here you can read his full responses to the questions raised in the interview:
- Even after the Cambridge Analytica scandal, how safe is our Facebook data? For instance, how do we know our info isn’t used again and again when it comes to FB custom audience/profiling?
EB: Facebook haven’t changed anything substantive since the Cambridge Analytica scandal. They still profile their customers on all kinds of criteria, including sensitive ones. This means that companies can still market via FB on the basis of race, or on the basis of mental stability. Even when such routes are not directly available, “lookalike” audiences can be created to market to people with similar views and interests. Facebook are trying to stop “political” advertising around particular elections and referenda, but the stories coming out of that suggest they don’t yet really know how even to detect political advertising. A lot of what FB have said around the CA scandal has proved to be incorrect, for example the claim that they stopped the sharing of friends’ info via apps as soon as they found out it was being abused.
- What steps, in your opinion, would actually make our data safer?
EB: Now this is where GDPR should make a difference. Companies have to give insight into what they do with people’s data, and show that they can justify what they are doing with it. Experiments relating to mental health, like those Facebook have run in the past, would need very explicit permission from the guinea pigs – which they probably wouldn’t give. The problem is that Facebook, Google, and the like have become so large that it is very hard for anyone to properly inspect all of what they are doing. At the moment, we can only look at what creeps out at the seams, along the lines of: “if it turns out they’re able to do this, internally they must be applying an algorithm which does profiling for that”. So a significant increase in budget for organisations like the ICO would be essential to keep the internet giants in line.
- Should we – digital natives – just resign ourselves to giving over all of this information about ourselves? It’s become so accepted but does it have to be this way?
EB: The problem isn’t even with the information that we knowingly give away. Most of us know how to apply the privacy settings that make sure it doesn’t get any further than we want it to go. The CA story was a scandal for many people because it violated their expectations about such control of their data: apps on someone’s Facebook account leaking information about their friends without permission.
The main problem is with information that is not knowingly given away, such as Facebook “Like” buttons and cookies tracking our web browsing, or Google Maps recording our every movement – and with the information that can be deduced from such tracking on the internet or in the real world. It’s hard even to be aware of how much such tracking exists, and you certainly don’t get many privacy controls on how it is used or passed on. For this, the GDPR should help too, but again it’s hard to enforce a law against such large-scale processing by large companies that mostly sit outside the UK and the EU.
For the full article on BBC3, please visit: https://www.bbc.co.uk/bbcthree/article/93d1393a-1c12-485f-b7fe-5146cd48c12c