Tuesday, 23 February 2010

spy laptops in schools

This is in the news a fair bit, and here's some good investigative work on it.


yet more surveillance of kids (from Boing Boing) on 'how Google saved a school' - egregious surveillance at about 4-5 min into this clip


Monday, 22 February 2010

Westminster Legal Policy Forum - comment

[after the conference last Wednesday, every delegate was given the opportunity to contribute a short article to the forthcoming publication. I wrote and submitted the following]

Privacy is a social good. Whilst privacy is important for the individual in terms of self-development, autonomy, personal dignity, and the exercise of fundamental human rights, this is not the extent of the concept. We are increasingly becoming aware that a lack of privacy, increasing surveillance, and vulnerability to a wide set of social harms arising from poor handling of personal information have negative social impacts. These can include chilling effects in political debate and social life, a loss of the presumption of innocence, and a reduction in trust towards government and institutions, including the private sector.

As Charles Raab argued, understanding privacy as a social good means that we can then look to evaluate one set of social goods against another, rather than weighing individual rights against social good and inevitably finding the individual interest lacking in weight.

However, mechanisms for mitigating harms focus upon the individual. As James Backhouse showed, responsibility for protecting personal information is increasingly being placed upon the individual, who is exhorted to conduct themselves appropriately so as to mitigate the worst of a wide set of information harms.

Ongoing academic research suggests that many ordinary people are not well placed to secure or manage their own personal information. They are positioned in an information architecture determined by more powerful actors, over which they have little control. The actions they are encouraged to take, such as shredding personal information, or protecting their PIN, are of little impact when information about them is lost from large databases or traded around the world.

Engaging the public is important, but it must be done in terms that make sense to the public. Anna Fielder correctly suggested that language is of fundamental importance here, as is identifying how people think about their privacy and their personal information in the real world, outside of policy and technological circles.

Understanding personal information as a property of the individual, as seemed to be at the heart of Conservative party proposals to roll back the surveillance state, is also problematic. This continues the individualisation of privacy and ignores the social harms. It also misses the increasingly relational nature of privacy, as is made most visible in online social networks. We are more and more digitally interconnected, and this leaves traces. What I may choose to reveal about myself can expose those to whom I am connected to information harms.

Furthermore, the proposals do little to address surveillance and information insecurities arising from the private sector, a significant actor in contemporary surveillance. Privacy harms are currently often an externality that can be ignored by business.

It is possible to design information systems that are privacy and personal information protecting by design, or that allow individuals the ability to choose the degree of personal information they reveal. However, there is currently relatively little incentive for the private sector to invest in research and development of these technologies. There is a role here for government and legislation to encourage the development and implementation of privacy protecting information systems and practices.

Update - this was published, and part of the above was quoted by the LeftCentral blog

Thursday, 18 February 2010

Westminster Legal Policy Forum

Westminster Legal Policy Forum
Surveillance: use, effectiveness and enforcing data protection
Wednesday 17th February

I was at this event yesterday.
A transcript of all sessions will be released in a week, so these are preliminary notes and thoughts; excuse the slightly scrappy nature in places.

Philip Virgo – EURIM – Chair
Only a 1/3 of population care about privacy (but they do care)
A simple policy area, if you understand that the technology doesn’t work, nobody reads policy, most will accept defaults, and nobody is telling the truth even if they knew what it was.

Francis Aldhouse – [ICO has largely accepted the Lyon definition of surveillance, which is a sign of a fair bit of academic impact. In terms of purposes this covers pretty much any deliberate purpose for information gathering.
Fairly standard account of what the surveillance society is.]
Technology plays a facilitating role, and allows changes of scale to the extent that they become changes in kind. Technology is new, but the social psychology is probably not.
Desire to generate trust, by knowing something about our neighbours
Behavioural targeting in advertising is the equivalent of 1990s one-to-one advertising.
Not a wicked conspiracy
Intention is to improve the life of community (law enforcement, anti-terror)
Problem is the public sector – ‘risk paranoia’ which has become ‘today’s normality’ (quoting McNulty) – encouraged to approach government through risk, but this is confused with likelihood. We should assess our response to risks, and be proportionate.
Bads – inherent invasion of privacy, reduction of autonomy (control of self) of individual, chilling effect and reduction of trust, presumption of guilt, social fatalism, and threat to constitutional order (based on executive self restraint).

Charles Raab – 5 minute provocation.
1) To what extent are surveillance methods assisting with the prevention of crime? – we don’t know, and we don’t agree on how to find out. There are studies, but assessment methods are rarely comparable or consistent. Questions about samples and statistics, uncertainty and arguments about interpretation (there are no disinterested parties). There is a lack of counter-factual examples, and it is difficult to isolate the effects of particular technologies. Not easy to vouch for the independence of studies; sponsors have stakes, and questions can be biased and self-serving.
[making an implicit argument for an academic role in this space]
Too often a mixture of myth and salesmanship
No ready answers on how to change evidence base. Question - what is a success in the prevention of crime? Should we settle for something pretty good, if it preserves other values we care about?
2) The issue of individual privacy versus public security in balance (and the language of balancing) places the citizen against the state. It is too easy to fall into balancing and reconciliation of interests in this manner, which doesn’t say what the process of balancing is, how it is calculated, who pronounces the balance, who shifts from a previous balance, or what happens in a disagreement (which is bound to happen). Highly political and full of conflict. Concern that nobody is working on/talking about this. No coherent answer on how to have practitioners arrive at this, or what the institutional and constitutional framework for this should be.
[this is a great question that should always be asked in ‘balancing debates’]

Nick Gargan – National Policing Improvement Agency, but speaking as ACPO lead on Intelligence. ‘Surveillance society’ is played out as an alarming narrative, and a gift to the media, and the government needs to rebut and correct much of this for clarity. Need numbers, because the commonly quoted ones are wrong. E.g. CCTV from two streets. Made an attack on academic work on surveillance as poorly researched, frivolous and alarmist.
[however, in quoting a more accurate figure for CCTV he only included local authority CCTV systems, which is quite disingenuous]
ACPO favours a consistent narrative that sees the joins between technologies (rather than the Home Office approach, which works on a technology-by-technology basis) and develops an evidence base. Police should be responsive, rather than determining policy. Up to the community to decide if they want the police to follow footsteps online. The police service should provide dispassionate analysis of risk.

Isabella Sankey (Liberty) – not ‘for liberty, against security’ and recognise that they’re often hand in hand. [does this sublimate liberty to its propaganda value to security?] criticism directed at non-purpose databases (NIR). Two factors – war on terror and explosion of technological capacity. An unprecedented focus on security, and nothing to hide, nothing to fear
Privacy is valued culturally, but with little legal protection. There is an inherent link between privacy and human dignity. Human rights are meaningless without autonomy; link with freedom of expression and non-discrimination. Surveillance is the issue over which the public most often contact Liberty. Data loss scandals bring home the risks.
DNA database, S & Marper case
Need for regulation of CCTV
Practical and principled case for proportionate response.

Peter Mahy (Howells Solicitors)
Represented S & Marper
In 2008, the UK was found to have overstepped its legal obligations regarding the retention of the DNA of non-convicted individuals on the national DNA database, in that this is not necessary in a democratic society. The Government has apparently not so far complied with the judgement.
Doesn’t trust government to safeguard this info, they have an appalling record, and are considering sharing this information beyond the UK.
People want to be respected by the state, trusted as individuals. 1 million or so innocent people on the database – imagine the resentment. Argued there was a perception of collection as secretive, covert, unjust (and arbitrary) – policing by coercion rather than policing by consent. In the medical field, retention without consent is abhorrent; why not in policing?

Dominic Connor – P&D Quantitative Recruitment.
Argued that data protection guidelines set too low a standard, but that couldn’t/wouldn’t delete information at the request of individuals. Argued it is not physically possible (which prompts the question of why the system was/could be designed in that way). People still believe they are hard to find, and are surprised when not. Critiqued Gargan’s 60,000 CCTV figure by pointing out the importance of CCTV that police have access to. Public opinion is the big driver to good info practices in the private sector [but this misses the fact that abuses have to be revealed for this to work this way]. A government can do insane things without losing votes.

Virgo asked about RIPA – investigative powers and processes
Gargan – positive, most useful when intervention actions are planned, seen by senior people, and fairly visible.
Sankey – judicial authorisation rather than the Home Secretary for interception warrants; RIPA should be re-evaluated: who has access, to which powers, and with what safeguards.

Caspar Bowden got into an argument with Dominic Connor, and said Gargan’s point about not lobbying was disingenuous given that ACPO and the NPIA had been lobbying ministers. Gargan responded that perhaps in the past, the relationship had been too close.

David Blackwell – Detica – asked Sankey if collection of data was troubling, or the potential misuse of the data. Sankey’s concern was about what was collected and how, collection without clear purpose (just in case) and that context was important for justice.

David McNesh (University of Leeds) said he was working on new balancing ideas (Raab’s response was that he was calling for more awareness on the part of those who assess balance and how this is argued). He also asked what frameworks of ethics panellists used to distinguish just use from abuse.
Gargan – individual inspectors’ training, control against internal culture, exposure to external thinking, oversight and insight. Applying the will of parliament in a sensible way.
Connor – wondered if firms should actively assist the police above and beyond being presented with a warrant for information, and at what level of suspected criminality this should occur.
Mahy – different forces have different approaches, and he is suspicious about ‘inspector judgement’ – UK courts have got it wrong (the EU says so), so it is not good enough to leave it to judgement.
Francis – courts do not balance – they decide who has won. It is not a question of personal ethics, but appropriate social rules and depends upon social, legal, political contexts of decisions.

James Backhouse – LSE
Information risks faced by citizens in the rise of the digital state are increasing, and responsibility for these risks is being moved onto citizens.
Joining up across state to an integrated identity profile
Citizens cannot change providers if unhappy with services
Personal data and identity at the heart of the digital state.
Assumption, if data there, why not use it?
FIDIS study – 2007 survey
Systems and technology, incompetence, public info management failures, internal failures (oversharing), control, accuracy, transparency, risk of an overpowering state.
Used international examples of E-ID cards
Information security has become information assurance (from safe, to ‘we’ve done our best’)
New individual responsibility (spelled out by get safe online)
But citizens have little control
Can’t make informed decisions about risk, set limits, choose options/exposure to address risk.
Not possible to address identity without authentication + security – especially in the digital age.

Jonathan Bamford – information commissioner’s office.
Power of information technology is much greater than when early data protection law was drafted
Risks to individuals, vulnerable, info used in unwarranted ways
Surveillance beyond simple abuses - personal information, and dataveillance
Footprints build up to details, an intrusive picture of how we live our lives.
Most of state data about normal everyday people
Tech advance allows more profiles than ever thought possible when legislation developed (when he was a ‘boy data-protector’).
This will get bigger, quicker and linked.
Most of ICO complaints not about surveillance, but use of personal information
Now, more hardwiring of surveillance into society – e.g. licensing conditions; authorities given powers and overstepping them.
Can’t be Luddite – IT revolutionises and can provide better services, better safety.
Risks – hidden, unacceptable, detrimental, mistakes, discrimination (social sorting, exclusion), suspicion and greater intrusion.
Is data protection still relevant?
Talked about ICO’s annual tracking research
People expect safeguards to be built in by those building the system, as in cars with airbags, brakes. People are ok with use of data, but believe there is/should be a guardian angel to look after them.

Friday, 12 February 2010

The Phishing flow chart (phlow chart?)

Found on the website Login Helper, this is the phishing email flowchart, a companion to their how to identify phishing attacks guide. It's handy in that it's visual, but I'm not sure it's the most elegantly designed version it could be. Having this material in a flow chart (or even a decision support framework?) might be one useful way of visualising online privacy protecting behaviour, and general advice/guidance.
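To illustrate the decision-support idea, a flow chart like this can be expressed as a simple decision function. The sketch below is hypothetical, not a transcription of the Login Helper chart; the indicators (mismatched link domains, credential requests, urgent language) are just common phishing signs chosen for illustration.

```python
def looks_like_phishing(sender_domain, link_domains,
                        asks_for_credentials, urgent_language):
    """Walk a simple decision tree over common phishing indicators.

    The checks below are illustrative, not taken from any particular
    published flow chart.
    """
    # Links pointing somewhere other than the claimed sender are suspect.
    if any(domain != sender_domain for domain in link_domains):
        return True
    # Legitimate services rarely ask for passwords or PINs by email.
    if asks_for_credentials:
        return True
    # Urgency ("act now or your account is closed") is a pressure tactic.
    if urgent_language:
        return True
    return False


# Example: an email claiming to be from bank.example but linking elsewhere.
print(looks_like_phishing("bank.example", ["evil.example"], False, False))
```

Each branch of the chart becomes one early-return check, which keeps the logic as readable as the visual version.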

(I found it on LifeHacker.com, highly recommended)

Wednesday, 10 February 2010

labour party hit by ICO

The Information Commissioner's Office has served the Labour Party with an enforcement notice, after an investigation revealed that it had used automated marketing phone messages to contact people without their consent. Press release from the ICO here. The ICO considers political communication as marketing, and covered under the same law. Interestingly, it looks like the Labour Party used lists of phone numbers from commercial sources.

(not half-inched them off facebook then? missed a trick)

Tuesday, 9 February 2010

Safer Internet Day and PAT research project

Following on the heels of the globally celebrated Data Privacy Day (see below) comes the European Safer Internet Day.

Also, I've just come across an interesting research project (funded under the EU's Framework 7 funding arrangement): the PATS project - privacy awareness through security organisation branding.