UK Labour's "closer scrutiny of algorithms" will be useless. If you are lucky
It looks like the UK Labour party will soon call for closer scrutiny of tech firms and their algorithms. If all goes well, it just won’t work, and that will be the end of it. Otherwise, it will be really bad.
After the 2016 Christmas break, a Labour industrial paper will call for suggestions on “how tech firms could be more closely supervised by government”. The algorithms, that is the formulas and rules those firms use to run their services, are closely guarded trade secrets. It is those algorithms that, by processing personal data, shape how the users of those firms interact with each other, how they see the world in general, and consequently the decisions of all sorts that they take.
Services this crucial should not be run, configured and controlled by private, for-profit organizations without any effective and independent scrutiny, and on this I agree. For the same reason, and of course in the public interest, governments would really like to regulate those algorithms. The problem is, “it’s difficult to regulate effectively” something you know nothing about. Therefore, the authors of the paper suggest that UK government agencies should get access to those algorithms. And this is where the problems start.
The arguments for getting access to the algorithms include:
- “The outcomes of algorithms (e.g. what news you see or not, whether you are discriminated against by being denied access to certain jobs or commercial offers, which of your data are shared with third parties…) are regulated [but] how do we make that regulation effective when we can’t see the algorithm?”
- “If people were falling very ill after drinking Coca-Cola, the company would have some duty to share what could be causing that”
Sorry, but no. The comparison with Coca-Cola is misleading at best, and algorithms alone are useless anyway. If people fell ill after drinking Coca-Cola, the police would have to inspect Coca-Cola factories, because reading the formula without checking how it is actually applied would be useless. But that would not be a problem, because it would only involve the police, Coca-Cola and its employees. In other words, if you fall ill by drinking Coca-Cola, the police need not see your private diary or your whole phonebook.
In the same way, having only the algorithms of Facebook (and the same is true for Google Plus, Weibo, WhatsApp, Telegram, Gmail, AirBnB, Uber…) would be completely useless if the goal is to make sure that the company complies with all regulations about hate speech, equal opportunities and so on.
The only way government agencies (in the UK or anywhere else!) could actually check something like that would be to have not just full, but continuous, real-time access to:
- the computers on which Facebook actually runs
- all the data on which they run. That is, all your contacts, posts, “private” messages, pictures, people tagged in them… everything.
Can you see what is really at stake now? If government agencies get the algorithms, and nothing else, they’ll have nothing. Should they get, instead, what they would actually need to accomplish the stated goal, they would make Big Brother look like your auntie eavesdropping on your phone calls.
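To make this concrete, here is a minimal, purely hypothetical sketch in Python. Nothing here is any real platform’s code: the ranking rule, the `rank_feed` function and the sample data are invented for illustration. It only shows why reading an algorithm without the data it runs on tells an auditor almost nothing.

```python
# Toy, invented example: a fully "transparent" feed-ranking rule whose
# real-world effect cannot be audited without the (secret) per-user data.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    engagement: int  # likes, shares, comments...

def rank_feed(posts, user_interests):
    """Hypothetical ranking rule: boost posts whose topic matches the user's interests."""
    def score(post):
        bonus = 10 if post.topic in user_interests else 0
        return post.engagement + bonus
    # Higher score first
    return sorted(posts, key=score, reverse=True)

posts = [Post("acme_jobs", "jobs", 5), Post("friend", "cats", 8)]

# Same algorithm, different profile data, completely different outcomes:
# an auditor holding only rank_feed() cannot tell whether, say, job offers
# end up hidden from some users.
print(rank_feed(posts, user_interests={"jobs"}))  # job ad ranked first
print(rank_feed(posts, user_interests={"cats"}))  # job ad buried below cat pictures
```

The formula above is trivial to read, yet whether it discriminates depends entirely on the data fed into it: exactly the data the auditor would not have.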
The solution? In the long term, it includes never having everybody’s data in the same place, that is, moving to decentralized services like the percloud. In the short and medium term, I don’t have a complete answer, but I am sure that such an answer includes:
- NOT believing that access to algorithms alone accomplishes anything
- NOT giving governments full access to the inner workings of certain services
- placing less and less of your personal data inside Facebook and all other similar services from now on
Who writes this, why, and how to help
I am Marco Fioretti, tech writer and aspiring polymath doing human-digital research and popularization.
I do it because YOUR civil rights and the quality of YOUR life depend more every year on how software is used AROUND you.
To this end, I have already shared more than a million words on this blog, without any paywall or user tracking, and am sharing the next million through a newsletter, also without any paywall.
The more direct support I get, the more I can continue to inform, for free, parents, teachers, decision makers, and everybody else who should know more about stuff like this. You can support me with paid subscriptions to my newsletter, donations via PayPal (mfioretti@nexaima.net) or LiberaPay, or in any of the other ways listed here. THANKS for your support!