London, March 22nd 2010
This event, held yesterday morning, took place in response to drives from the European Union to consider a ‘right to be forgotten’ in relation to personal information online. It comprised three keynote addresses and three panel sessions with shorter speeches. The twitter backchannel for the event was #wmfevents, which I think included about seven or eight of us actively. It was an interesting event, although a bit too brief in places and of course coloured by the media slant – which does produce some odd artifacts in discussions of privacy. Direct quotes are from memory and scribbled notes.
Richard Allen – Director of Policy EU, Facebook
Richard spoke about three main things: 1) what a social network is for, 2) what FB already does in relation to privacy, and 3) the right to be forgotten. Social networks exist to enable people to connect and share. Things on them are by definition shared (things you don’t want online, you shouldn’t put on FB). He distinguished between created content (putting your soul on show) and found content (often advertising driven), but both were used by people to define who they are. Allen believes we are experiencing a massive democratisation of public speech. Previously the ability to have a significant voice was limited to actors with substantial resources. The early days of the digital environment mirrored this exclusivity, with computers being massive and expensive. The problem is that much of our regulatory environment is built for this (data protection, libel etc). The 1990s saw a democratisation of this, but blogging was still limited to a ‘technorati’ and geeks. Now technical expertise and cost are no longer barriers to having a voice online, with the capacity to speak to the world. A mass of data is published by individuals and this is increasing.
Allen also pointed out the deletion function, for individual items of content and full accounts; however, he cautioned that deletion of this data was limited to Facebook servers and not third parties (and that putting information online with no privacy settings was equivalent to publishing it fully on the internet). He also identified the download function, which allows users to back up their data but also to move to alternate services.
In regard to the right to be forgotten, Allen cautioned that hard cases make bad laws, and that he was concerned that the extreme tail might be wagging the mainstream dog. The basic idea of user control over their own data (post, amend, review, delete) was supported, but he was concerned about an ‘overly prescriptive’ right. For most people, he argued, their main concern was that their data would remain accessible over time; far from fearing youthful indiscretions reappearing online, most people looked forward to being able to look back on their youth through social network records. Users wanted guarantees about data availability more than privacy/deletion. He was also concerned about the tendency to shoot the messenger, in which individuals with concerns about content went after the site where the content was shared rather than the source of the content.
Allen concluded by stating that we had explicitly designed networks to share stuff, and that most people were happy with this, and most people can resolve conflicts in this environment. There were however a few cases in which this was not possible, but that these were exceptional cases and should be handled as such, rather than being the basis for policy.
Tessa Mayes largely reiterated her Guardian Comment is Free article from a couple of days ago (which I’d read this morning), in which she critiqued the right to be forgotten on the basis that a claim for disengagement was antithetical to a democratic society in which engagement was necessary and valuable. Personally I think this misses the point, and creates a straw man version of the legal claim being made, which is not really about ‘oblivion’ but much more about revocation of consent (Caspar Bowden made much the same point in the following questions).
Jim Killock from Open Rights Group spoke about the difference between our sense of ourselves as public people and our private selves. There are differences online, but individuals should still have the right to choose which one they are being. There are expectations of control over private spaces. To demonstrate this he contrasted twitter (public) and facebook (created with an expectation of privacy which it has since moved away from). He also highlighted the need to retain control over and determine the use of our creative work from a copyright perspective, and some very unequal bargains between services and users, with a tendency for the terms of those bargains to be changed arbitrarily. The right to be forgotten was not to be seen as a right to disengage entirely from society, but was based on a perception of an extreme imbalance between powerful corporations and individuals; individuals should have more rights in this relationship, including the right to take back their information. For Killock the issue is one of power balance and the loss of control over how our identities work. He was concerned that we are getting more fragmented regulation, with two more commissioners outlined in the freedom bill.
Chris Pounder suggested that the right to be forgotten is not a privacy issue, but a publishing one, and that mistaking it for the former is misguided. He argued that what is in the public domain cannot be considered private and that we are dealing with data protection rather than privacy. He stated that a right to be forgotten would not work unless it could be made international (and that he didn’t expect this to happen).
Georgie Nelson from Which? spoke about a study the organisation had recently conducted on consumer attitudes towards targeted behavioural advertising, comparing this with social networking as one of two ways of monetising personal information online. She pointed out the need for choice and transparency, but noted that consumers generally didn’t have enough information to be able to answer key questions. She believed this was eroding trust online and keeping some people from using social networks.
Questions were asked about the balance between education and legislation (linked to personal information and, for example, job applications). The Earl of Erroll suggested there was a need for people to be more tolerant in judging others. Chris Pounder suggested that transparency and rights to be informed of decision-making processes (for example in job applications) were better protections than rights to object on DP grounds. There were also questions about terminology regarding the right to be forgotten, and its balance with freedom of speech. One questioner suggested that naivety online was simply no defence. Tessa Mayes suggested that social networks were akin to private members’ clubs, or Chatham House rules, in which participants agreed to abide by some informational norms. This didn’t go down well with elements of the twitter conversation, but it has some potential merit. It’s reminiscent of Nissenbaum’s contextual and normative account of privacy and information. Caspar Bowden suggested that facebook was not the substantial problem, but rather the multitude of other, less well known data processing which had no mechanisms for redress. The Earl of Erroll suggested that, rather than a right to be forgotten, a relaxation of the right to object to publication – to a criterion of distress rather than having to prove substantial harm – would allow a balancing of interests in privacy/publication under law and some sort of tribunal system. Tessa Mayes’s distinction between being a data controller and commenting on the data of others wasn’t particularly clear and didn’t seem to have much of a basis in information theory. In response to a question about educating publics, Chris Pounder suggested this was difficult primarily because people did not experience privacy, but only the lack of privacy, and when this happened it was too late.
Keynote – Baroness Buscombe, from the Press Complaints Commission, argued that there was always a public need for a system of speedy and hopefully cost-free redress. The PCC, she said, has stood the test of time because it serves everybody. Independently enforced self-regulation is a workable model, with its success measured by the invisible, and reflected in the articles which do not appear. The public apparently prefers swift apologies to heavy fines, which require lengthier processes. This process requires a significant amount of buy-in, with rules agreed with the media, not imposed from above. Regulating online conduct is problematic. The idea that privacy is not a social norm is not true, and the online environment possibly provides an opportunity to better define (through settings and the like) exactly what we want public and what we need to keep private. Journalists now have the added resource of non-journalists. The Baroness advocated a voluntary code and guidance developed from case law. The PCC test for the use of internet-published material contains five points:
1) quality of information – how private is it in itself
2) context of the information (in what way was it published?)
3) who uploaded the information or consented to its upload
4) how widely available is it already? (what privacy settings is it behind?)
5) what is the public interest?
It is permissible to use information behind privacy settings in some circumstances, however, only if the public interest overrides the individual interest in privacy. The law cannot be expected to keep up with both the market and technology developments.
Peter Murray from the National Union of Journalists spoke about the privacy tensions between uncovering misconduct on the part of the powerful and protecting the work of journalists. This included the danger to the journalistic defence of sources presented by routine monitoring by the state through surveillance technologies. He also argued that the Freedom bill does nothing to remove much of the routine surveillance, and that rights to privacy are therefore still under serious threat.
Clarence Mitchell spoke about his experience as legal counsel for the McCann family. His perspective was that it was easy for people to become journalists online without knowing the law, but that this wasn’t a protection. The target for legal action was when online comment was taken up by more structured journalists. People generally needed to be made more aware of their responsibilities in an era of mass publishing.
John Naughton, professor of the public understanding of science at the Open University, argued that the idea of a right to oblivion was completely inoperable the way it was currently framed, and that it was not a productive line of thought. The bigger problem is state surveillance. He suggested that the net sat somewhere between Orwell (we will be destroyed by what we fear) and Huxley (we will be destroyed by what we love), and that it could do both. He drew attention to the implications of a media ecosystem in which more computing is conducted in the cloud, with Faustian bargains for free services. Secondly, a mass of ordinary people can act as publishers without understanding what this means, and can all become public figures when on Facebook. This was an architectural problem, but there was an option to hold our own data on our own servers and set rules to manage it. This might be too geeky for everybody, however.
David Allen Green (Jack of Kent) spoke about his own method of journalism, which he felt was only possible with the advent of social media. He put forward a typology of three privacy situations: 1) public statements, 2) situations in which an individual may or may not have a subjective or objective expectation of privacy, 3) situations in which an individual has a substantive legal right to privacy enforceable in a court (these are rare). There is no tort of privacy invasion: an actual act of invasion of privacy is not actionable, only the misuse of personal information, not its acquisition. The question is therefore what to do in the middle situation, with a potential right that should be respected by others. He spoke about the Sarah Baskerville case and the ‘character assassination job’ by the Daily Mail, and argued that just because a single tweet might be retweetable, it did not justify a paper digging through all previous tweets. He felt newspapers could not automatically use online information, even if it was not protected by privacy settings. Republication must still show respect for individual privacy. He concluded that, on the basis of this case and its handling by the PCC, he would no longer advise people with a complaint to bother with the PCC, but rather to hire lawyers and threaten a privacy suit.
The Q&A session at this point mainly focused on different interpretations of the Baskerville case and the extent to which twitter should be understood as a public medium. Because tweets can be retweeted without consent, it was argued, it cannot be considered private. There was also some substantial critique of permission creep and data retention creep under RIPA by Professor Naughton, in response to a question (I think from an ACPO representative) about the protections supposedly built into RIPA. The costs and benefits of RIPA were never publicly discussed. There was also a discussion about what it means to be a journalist (if everybody can publish). This included some observations that a lot of the European model is based upon ‘certified’ journalists, a model which does not apply in the UK. The Earl of Erroll concluded this session with an observation that any approach that involved criminalising many more people was probably the wrong way to go.
Keynote – Christopher Graham, Information Commissioner, gave a talk about data protection challenges online which was fairly consistent with ICO publications and communications of recent months and years. What was interesting was quite how closely it maps to a responsibilisation agenda in which individuals are placed at the heart of managing their own personal information, with a responsibility to ‘take the steps they need to keep themselves safe and secure in the online world’. What I hadn’t picked up before was how wedded the Information Commissioner is to the idea that the market will regulate, and provide a competitive advantage to online firms that provide better privacy protection. I’m honestly sceptical about this – I just don’t think online service users get feedback on the privacy costs and harms clearly enough to be making economic decisions about them that are sufficiently reflected in the market (this is also why privacy is a hard discussion topic). He stated that he doesn’t think that personal information management is too geeky for most people, and believes it is ‘coming’. He did identify opportunities in those technologies that allow greater control and specification of personal information handling. Graham suggested that not all privacy is data protection, and not all data protection is privacy; that what the law is sometimes differs from what we think it should be, and from what people want; and that good practice is more than strict compliance with the Data Protection Act. The Commissioner provided notice of new electronic communications regulations on data breach disclosure and tracking cookies coming in about 40 days’ time. EU institutions expect changes in the law to lead to changes in behaviour around informed consent, and believe it very desirable to give more control and transparency to the consumer. The ICO recognises the difficulties.
The ‘Personal Information Online code of practice’ is not a self-regulatory document but guidance written with data controllers in mind. It attempts to clearly address questions such as ‘is an IP address personal information?’. The Data Protection Act is seen as showing its age, and Christopher Graham made the point that he should not be attempting to be King Canute (he had a picture of him on his office wall) or King Lear.
Philip James from legal firm Lewis Silkin had many more points to make than the time for his speech allowed. In brief, he suggested that at the moment the ICO’s statements were ‘guidance in the form of a code’, and that they might be better placed as one or the other. He also suggested the possibility of a privacy association, promoting privacy ‘leaders’.
Bob Warner from the Communications Consumer Panel spoke about a research report they are currently finalising, which might be worth looking at when it’s published. The report looks at how consumers feel about making their data accessible. Six out of ten were concerned about their personal data, a figure which increased when using mobile technology. The Panel doesn’t want to see this undermining consumer confidence and preventing some people from using the benefits of the internet. So far, so Digital Britain. The interesting bit, which fits with the ICO statements on responsibility (and the whole identity management thing), is that 70% of consumers in the study believed that they carried the primary responsibility for securing their own data, with companies second, and government and regulators third and fourth. Bob Warner felt this was a sign of a relatively mature understanding of the difficulty of the area. One might be tempted to see it as a result of several years of people being told they needed to manage and protect their personal data. The effect though, according to Warner, is that companies need to do more than keep the regulators happy – they need to be more open with customers, who see themselves as the decision makers with primary responsibility. Customers understand that they are making a deal when they provide information for services. Warner identified three conditions for consumer empowerment: firstly, informed judgement about data disclosure; secondly, an easy means of controlling this; and finally, confidence that companies will respect data in the way agreed. He argued that we would need all three of these for a mature e-commerce environment. Companies need to promote a greater understanding of why personal data is collected and the benefits it will bring.
Caspar Bowden, speaking in a personal capacity rather than as a Microsoft representative, was critical of the belief that market forces would correct privacy abuses. He was sceptical of this, and suggested that he’d seen data indicating that the average impact on share price from a privacy scandal/abuse issue lasted about two weeks (I don’t have this written down, and it might have been two months – still relatively short). He was also sceptical about the capacity of increased consumer awareness to prevent privacy issues. The speed and development of the issue puts individuals in a very difficult position in regard to assessing privacy risk, meaning they can rarely have enough information to make sensible decisions and to genuinely choose the level of privacy risk they are happy with. This should place more weight on the regulator. However, there is a problem where a regulator (the ICO) does not engage with an issue because it does not believe it is of concern to the public. Caspar’s belief is that the chief weakness of the current Data Protection Act is the meaning of personal data: it is much narrower than the European definition, including only personally identifiable information identifiable by the data controller. It ignores the potential for collusion to re-identify. The ICO has been silent on this issue, and Caspar believes that the position of the UK government will be influential in the EU renegotiations of updated data protection. Caspar spoke about how the computer science of re-identification has moved on in the past five years, so that what might once have counted as ‘satisfactory anonymisation’ no longer does, given the relative ease with which a rich data set with inter-personal connections in it can be re-identified. He spoke about a data set of all UK telephone calls exported to the US with only this cursory level of anonymisation, which had not been reported on.
The final Q&A session included a response by the Information Commissioner highlighting an upcoming anonymisation seminar to address the issues Caspar Bowden had brought up, but also noting that the ICO’s role is to enforce the law as it currently is, rather than as they might wish it to be. There was a discussion about the level of consumer awareness of privacy issues, and how this might be rising (although perhaps not among young people). There was also discussion about the relative weight and importance (or lip service) given to data minimisation. The ICO identified child protection as a very important area for them. Finally, there was discussion about the difference between EU directives (which require interpretation into national law) and EU regulations (which become applicable in UK law directly).
The Earl of Erroll concluded the session with a suggestion that there was probably no solution to these issues, but that we had to work at them anyway, and that there was a dynamic tension rather than a balance. The speed of change was just starting, and we were always going to have both too few and too many rules. He again repeated his call for increased tolerance as we increasingly learn more about one another than we might otherwise have liked.