I attended an event on the Right to be Forgotten (RtbF) at City University yesterday, organised to launch journalism prof George Brock‘s book on the topic. For those of you who don’t know, RtbF refers to the process following the 2014 Google Spain judgement (Costeja), which involves the removal of specific results from Google searches for specific people’s names, in limited circumstances. Hence the term “delisting” is actually more appropriate than “RtbF”. I felt I needed to explain that, as George Brock, despite having written the book, still talked of “taking down URLs” several times.
I missed Brock’s presentation at the start of the event, arriving during the panel discussion. Missing from the panel was Julia Powles, who was sick – but she did provide her RtbF canon on Twitter.
Peter Barron, Google’s EMEA head of communications, represented the company. He made two major points that I feel need further rebuttal – in addition to the very valid points already made, in particular by Paul Bernal and some of the audience.
First, Barron claimed it was wrong for Google to be put in the position of having to balance the various rights on so many RtbF requests. On this point I agree with Google – having the company act as “judge”, with under-resourced data protection agencies as the appeal mechanism, is highly unsatisfactory. A significant response from Bernal to this was that RtbF evaluation is yet another algorithm – actually operating in tandem with Google’s other “delisting” algorithms (e.g. for IWF content and copyright) and indeed with PageRank itself – and so really nothing extraordinary for Google to be doing. Hold on to this thought for my second point, too.
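To make Bernal’s point concrete, here is a minimal sketch in Python – with invented URLs, names and scores, nothing like Google’s actual implementation – of how a query-specific RtbF filter can simply sit alongside other delisting filters and the ranking itself:

```python
# Purely illustrative sketch (invented names and scores - nothing here
# reflects Google's actual systems): query-specific RtbF delisting as one
# more filter layered on top of ranking, alongside other delisting filters.

# Hypothetical index: URL -> PageRank-style relevance score
RANKED_RESULTS = {
    "news-site.example/old-story":   0.91,
    "blog.example/profile":          0.72,
    "pirate.example/copied-film":    0.70,
    "archive.example/court-report":  0.65,
}

# Query-independent delistings (e.g. IWF blocklist, copyright takedowns)
GLOBAL_DELISTINGS = {"pirate.example/copied-film"}

# Query-specific delistings (RtbF): applied only when the query is the
# data subject's name
RTBF_DELISTINGS = {
    "joe bloggs": {"news-site.example/old-story"},
}

def search(query: str):
    """Return ranked URLs for a query, minus any delisted ones."""
    delisted = set(GLOBAL_DELISTINGS) | RTBF_DELISTINGS.get(query.lower(), set())
    hits = [(url, score) for url, score in RANKED_RESULTS.items()
            if url not in delisted]
    return [url for url, _ in sorted(hits, key=lambda h: h[1], reverse=True)]

print(search("joe bloggs"))      # old story filtered out for this name only
print(search("old story 2009"))  # the same page still appears for other queries
```

The point is not the detail but the shape: delisting is a filtering step in a pipeline Google already runs many times over.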
On the train to London, I had been reading Andrew Murray‘s IT Law book – I’d made it to the chapter on Governance and in particular his approach of “network communitarianism”. This train of thought leads to a solution for Google’s heavy judging burden: they should be sharing their responsibility with the community! Does anybody have a plan for that? Actually, yes …
Barron claimed Google were keen to be as transparent as they could be on RtbF. In response, Brock rightly dismissed their Transparency Reports as typical big-company PR. The reports provide a drip feed of information about Google’s RtbF decision processes, much as Barron did that evening: “57% denied”; “spent convictions play an important role”; “I’ve given Julia Powles info in conversations”.
Over a year ago, 80 academics, led by Ellen P. Goodman and Powles, asked Google for much more, and more specific, information on its RtbF decision processes. (Bernal, Brock, Murray, and I were among the 80.) So far, I am not aware of any significant response from Google; it’s a pity nobody asked Barron about it yesterday. He did hint at the privacy sensitivity of the data involved, but it’s a bit rich for Google to be reluctant to share sensitive data with academics when they expect academics to generously share sensitive data with them.
By offering to analyse the RtbF decision processes, the academics provide a way for Google to offload some of its unwanted responsibility onto the community. Google’s refusal to engage shows they would actually rather have responsibility than accountability. If any ethical discussion of this takes place at Google, it must be within their elusive ethics committee (which is not to be confused with the DeepMind Health independent reviewers or the RtbF advisory council).
Accountability is even more central to my second point. Barron talked at some length about notifications – i.e., when something has been delisted, the publisher of the information is informed of this by Google. I have argued before that this was done in the first place to stir up a censorship storm. I take Barron’s point that these storms have subsided a little now (though several newspapers and the BBC still publish delistings – you will understand why I won’t link to those).
Barron’s justification for these notifications sounded quite credible: delistings interfere with the publishers’ rights, so they deserve to be told. However, we need to examine that very closely. If Google does something, on behalf of an “injured” third party, that removes a publication from certain search results, Google wishes to be accountable to the publisher for the interference with their rights. So what if Google does something on its own behalf that removes a publication from certain search results? Or just moves it so far down the search results that it is effectively deleted? Would Google admit that the outcome of PageRank creates an accountability to web page publishers for how highly ranked their pages are? And, given that there are no third parties involved, would Google seek to accommodate challenges to ranking outcomes on the basis that publishers’ listing rights had been infringed? Of course not.
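To see why that question matters, here is another small, purely illustrative sketch (invented names and numbers again, not Google’s actual systems): from a publisher’s point of view, an explicit delisting and a steep ranking demotion are indistinguishable, because either way the page drops out of the handful of results users actually look at.

```python
# Illustrative only: invented URLs and scores, not Google's actual systems.
# Both an explicit delisting and a steep demotion leave the page outside
# the top-N results that users ever see.

TOP_N = 10  # roughly the first results page

ranked = [("page-%02d.example" % i, 1.0 - i * 0.01) for i in range(100)]
target = "publisher.example/article"
ranked.insert(3, (target, 0.97))          # originally a prominent result

def visible(results, url, top_n=TOP_N):
    """Is the URL among the results a typical user will actually see?"""
    return url in [u for u, _ in results[:top_n]]

# Case 1: explicit delisting (RtbF-style removal for a given query)
delisted = [(u, s) for u, s in ranked if u != target]

# Case 2: no removal, but the ranking algorithm scores the page very low
demoted = sorted(
    [(u, (0.001 if u == target else s)) for u, s in ranked],
    key=lambda x: x[1], reverse=True,
)

print(visible(ranked, target))    # True  - the page used to be prominent
print(visible(delisted, target))  # False - removed outright
print(visible(demoted, target))   # False - still indexed, never seen
```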
So Google’s “accountability” through notifications is extremely selective. Google chooses to be “accountable” for something it doesn’t want to be doing and for which it can lay the blame elsewhere. It reinforces naive journalists’ view that Google search is a public good that presents an objective view of the world, which gets distorted by RtbF. It wants accountability to the world for a minor modification of its search outcomes, while shielding the central algorithm that competes with Facebook’s newsfeed algorithm for the role of least transparent and most powerful gatekeeper of our information society. Transparency and accountability? It’s a sham.