Friday 28 August 2009

Identity in the Information Society (1)

Very late, but better than never: some notes from a workshop I attended earlier in the year.

2nd Multidisciplinary workshop on Identity in the Information Society, London School of Economics, 05/06/09

Kevin Bowyer – Notre Dame – what happens when accepted truths about iris biometrics are false?

Bowyer showed how some of the accepted truths about iris biometrics are false, but argued that this will not hugely change the field. Post-9/11 there was a rapid expansion of biometrics. He showed a video of the UNHCR using iris biometrics in Afghanistan to prevent aid recipients from making multiple claims.

He gave a good account of how iris biometric technology works, including taking the circular iris image and unrolling it into a straight strip that can be digitised and turned into a code of 1s and 0s – it should be the case that everybody’s code is statistically different from everybody else’s. These images use near-IR illumination and therefore look quite different from normal-light photos (this is because some people have dark irises that are hard to distinguish from the pupil).
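
To make that unrolling step concrete, here is a minimal Python sketch of that kind of pipeline: sampling the iris annulus onto a rectangular grid (the Daugman-style 'rubber sheet' model), binarising it into a code, and comparing two codes by their fractional Hamming distance. The thresholding step is a toy stand-in for the Gabor phase encoding real systems use, and none of the names or numbers come from Bowyer's talk.

```python
import numpy as np

def unwrap_iris(image, centre, pupil_r, iris_r, n_radii=8, n_angles=256):
    """Sample the annulus between the pupil and iris boundaries onto a
    rectangular grid (the 'rubber sheet' model): one row per radius,
    one column per angle."""
    cy, cx = centre
    radii = np.linspace(pupil_r, iris_r, n_radii)
    angles = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
    rows = []
    for r in radii:
        ys = np.clip((cy + r * np.sin(angles)).astype(int), 0, image.shape[0] - 1)
        xs = np.clip((cx + r * np.cos(angles)).astype(int), 0, image.shape[1] - 1)
        rows.append(image[ys, xs])
    return np.stack(rows)                      # shape: (n_radii, n_angles)

def iris_code(strip):
    """Turn the unwrapped strip into a binary code. Real systems keep the
    phase of quadrature Gabor filter responses; thresholding each row
    against its mean is just a toy stand-in."""
    return (strip > strip.mean(axis=1, keepdims=True)).astype(np.uint8).ravel()

def hamming_distance(code_a, code_b):
    """Fraction of disagreeing bits - the quantity Daugman-style matchers
    threshold on (same eye: low, different eyes: around 0.5)."""
    return np.count_nonzero(code_a != code_b) / code_a.size

# toy usage on a random stand-in for a near-IR capture
img = np.random.randint(0, 256, (240, 320))
code = iris_code(unwrap_iris(img, centre=(120, 160), pupil_r=30, iris_r=80))
```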

Governments are interested because of claims of massive accuracy for this technique, with small error rates, and these extreme claims about performance appear to have a good theoretical background. However, there is often a confusion between the types of error (which may be the result of advertising copy written by marketers rather than scientists or engineers). To be able to tell how accurate a biometric is, you need both the match and non-match score distributions.

(How often do you match when you shouldn’t, or don’t match when you should)

A comparison of two images of the same eye never gives zero difference; there is always some residual difference, depending on the engineering and on how well the capture environment is controlled. Engineering decisions place a threshold between the two score distributions, and where that threshold sits determines the two types of error. The equal error rate is the threshold at which the two error rates are the same. So you essentially have to trade off the two types of error against each other.
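
Here is a rough sketch of how that trade-off plays out in practice, assuming you have a set of genuine (same-eye) and impostor (different-eye) comparison scores to hand; the distributions below are invented for illustration and are not Bowyer's data.

```python
import numpy as np

def error_rates(genuine, impostor, threshold):
    """For a distance-based matcher: a false non-match is a genuine pair
    scoring above the threshold, a false match is an impostor pair
    scoring at or below it."""
    false_non_match = np.mean(genuine > threshold)
    false_match = np.mean(impostor <= threshold)
    return false_match, false_non_match

def equal_error_rate(genuine, impostor, thresholds=np.linspace(0, 1, 1001)):
    """Sweep the threshold and return the operating point where the two
    error rates are closest to equal."""
    best = min(thresholds,
               key=lambda t: abs(np.subtract(*error_rates(genuine, impostor, t))))
    return best, error_rates(genuine, impostor, best)

# invented score distributions: same-eye comparisons cluster at low
# Hamming distance, different-eye comparisons cluster near 0.5
rng = np.random.default_rng(0)
genuine = rng.normal(0.11, 0.03, 10_000)
impostor = rng.normal(0.46, 0.02, 10_000)
threshold, (fmr, fnmr) = equal_error_rate(genuine, impostor)
print(f"threshold={threshold:.3f}  FMR={fmr:.4f}  FNMR={fnmr:.4f}")
```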

Then Kevin got stuck into several accepted truths in iris biometrics.

· Pupil dilation doesn’t matter (in fact, greater differences in dilation between the enrolled and presented images degrade matching, increasing the false non-match rate)

· Contact lenses don’t matter – you can wear them or not (in fact, contact lens wearers are about 20x more likely to produce a false non-match, i.e. the system doesn’t recognise you as you)

· Templates don’t age – you can have one enrolment for life (in fact, matching becomes much less accurate the longer ago the enrolment took place, with a measurable increase in false non-match frequency)

· It’s not a problem when you upgrade your sensors

Kevin identified that there was approximately a 1 in 1.2 million chance of a false match. However, this figure is for a zero-effort impostor, chosen at random; it does not include anybody making any effort to try and beat the system.
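
To put that figure in context (my own back-of-the-envelope sum, not one from the talk), the expected number of zero-effort false matches in a one-to-many search scales with the size of the database being searched:

```python
false_match_rate = 1 / 1_200_000       # Bowyer's zero-effort figure
gallery_size = 1_000_000               # hypothetical national-scale database
expected_false_matches = false_match_rate * gallery_size
print(expected_false_matches)          # roughly 0.83 false matches per one-to-many search
```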

He also asked why biometric systems do not automatically update with every successful access. For example, having enrolled in a building access system, if the system recognises me as me (and lets me through the door) then, if you have confidence in the system’s accuracy, why shouldn’t the image captured at that point replace the one in the database?
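
As a purely hypothetical sketch of what that re-enrolment-on-access idea might look like – the class, threshold and helper below are my own invention, not anything Bowyer presented:

```python
import numpy as np

def hamming_distance(a, b):
    """Fraction of disagreeing bits (as in the earlier sketch)."""
    return np.count_nonzero(a != b) / a.size

UPDATE_THRESHOLD = 0.25   # invented: only refresh the template on a confident match

class AccessSystem:
    def __init__(self):
        self.templates = {}            # user id -> stored iris code

    def enrol(self, user_id, code):
        self.templates[user_id] = code

    def verify(self, user_id, fresh_code, refresh=True):
        """Open the door if the fresh capture matches the stored template;
        optionally overwrite the template with the fresh capture, so the
        enrolment tracks the ageing iris - Bowyer's 'why not?'."""
        stored = self.templates.get(user_id)
        if stored is None:
            return False
        if hamming_distance(stored, fresh_code) > UPDATE_THRESHOLD:
            return False
        if refresh:
            self.templates[user_id] = fresh_code
        return True
```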

He also asked which problems the government plans to solve with biometrics – ease, access or security – as this will affect the design of any future systems.

Roger Clarke – a sufficiently rich model of (id)entity, authentication and authorisation

Identity expert Roger Clarke presented his systematic attempt to rework the vocabulary used in identity and identification issues. This was an attempt to be constructive rather than simply critical. There is a need for a deep technical discourse, and our ordinary language terms are not sufficient for this because of the baggage and the multiple, unfixed meanings that they carry. He was frustrated by the language, and wanted something internally consistent and useful for analysis purposes.

It includes 50 concepts (my personal favourites are 'entifier' and 'nym').

I was sceptical about the assumption here that there was/is a ‘real’ identity that language confusingly covers over. I’m also wary that this would result in a highly technical jargon so distanced from ordinary usage that it becomes elitist and inaccessible to ordinary folk. The way that people think and talk about identity is important. It is also, as Clarke points out, messy and confused. This confusion causes some problems, but I don’t think that the response is to create an entirely new language. That is opting out, rather than engaging with the discursive environment.

Seda Gurses (Leuven) – Privacy Enhancing Technologies and their Users

(research page)

Seda set out the story of privacy in computer science since the 1970s and 80s, retelling a story from surveillance studies from the perspective of a software engineer. She claimed to have been nervous beforehand, which didn’t come across at all. For software engineers, security is confidentiality, integrity and anonymity.

She argued that PETs (privacy-enhancing technologies) are, in general, poorly named. They are mainly anonymity systems, aimed at making the individual indistinguishable within a set, using a probabilistic model. PETs are based upon technocentric assumptions and do not solve all privacy problems, but they are still essential. They are technocentric in that the technology leverages a human act and performs an instrumental function; the technology is thought to be exogenous, homogeneous (assumed to work everywhere), predictable, stable, and to perform as designed.
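
One common way of making ‘indistinguishable within a set’ precise is the entropy-based degree of anonymity used in the anonymity literature: how evenly an attacker’s suspicion is spread across the candidates in the set. A minimal sketch with invented numbers – this example is mine, not Seda’s:

```python
import math

def degree_of_anonymity(probabilities):
    """Entropy of the attacker's probability distribution over the
    anonymity set, normalised by the entropy of a uniform distribution
    over the same set: 1.0 means perfectly hidden in the crowd, 0.0
    means fully identified."""
    entropy = -sum(p * math.log2(p) for p in probabilities if p > 0)
    max_entropy = math.log2(len(probabilities))
    return entropy / max_entropy if max_entropy > 0 else 0.0

# ten candidate senders, but the attacker considers two of them far more
# likely than the rest (figures invented for illustration)
suspicion = [0.35, 0.35] + [0.0375] * 8
print(degree_of_anonymity(suspicion))   # well below 1.0: the set has shrunk in practice
```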

PET assumptions are that 1) there is no trust on the internet, 2) users are individually responsible for minimising collection/dissemination of their data, 3) if they know your data, they know you, 4) collection and processing of personal data have a chilling effect, 5) technical solutions are preferred to a reliance upon legal solutions.

Seda countered this by drawing on a surveillance studies perspective of networks, categorisation and construction, and feminism. Data receives meaning through its relation to other data. It is a creation of knowledge about a population. Statistical data reveals things about individuals who don’t participate in data revelation (see Wills and Reeves, 2009 for more on exactly this). Social network structures are more difficult to anonymise, so the very idea of individual responsibility is problematic (data is not private property?), but it is difficult to create collective counters. She drew attention to the idea of a digital commons.

Thursday 27 August 2009

Talk at Royal Holloway

I have been invited (and have accepted) to give a talk at a workshop being held at Royal Holloway, University of London on the 15th of September. The workshop is the conclusion of a two-year, ESRC-funded study of radicalisation and the media and should be rather interesting.

I'll be talking about terrorism and its role in driving surveillance, primarily drawing upon issues of identity and identification. This will include identity cards, but also identification in the banking and financial sector, linked to anti-terrorist financing measures.

The threat of terrorism was one of the early stated reasons for the introduction of the identity card; this has largely been dropped, but not entirely. The focus is now more upon ‘terrorist use’ of multiple identities, and on blurring the lines between terrorism and organised crime. This comes out of my discourse analysis research, which studied government publications as well as news media accounts of surveillance, identifying particular frames through which surveillance technologies and practices are portrayed. The figure of the terrorist looms large in these frames.

Wednesday 26 August 2009

watchtower surveillance


At the top of a tower at Warwick Castle. Technically, it's always been a defensive, surveillant piece of architecture. You get rather good views over the surrounding county.

Tuesday 25 August 2009

Recasting Power

I've just been invited to the above: a debate organised by Channel 4 about how the internet and various other digital things have influenced, and are influencing, politics and the distribution of political power. It's taking place in Birmingham on the 17th of September. September's filling up with interesting activities, including a talk in London about David Lyon's new book, an identities conference in Oxford, a workshop on security studies at POLSIS and possibly a workshop on radicalisation at Royal Holloway.

US military Cyber Force Activated

Register article

The US air force held an activation ceremony in Texas yesterday for its new cyberspace combat unit, the 24th Air Force, which will "provide combat-ready forces trained and equipped to conduct sustained cyber operations".

The 24th will be commanded by former Minuteman missile and satellite-jamming specialist Major-General Richard Webber. Under his command are two cyber "wings", the 688th Information Operations Wing and the 67th Network Warfare Wing, plus combat communications units.

Ces Moore and I have been writing a bit about cyber-warfare of late, and I'll be doing more of it to cover the 'securing virtual spaces' theme of the Space and Culture special issue. I don't actually like the term 'cyber-warfare' as it sounds like something Tom Clancy made up.

We had a look at Russian information operations and network warfare in relation to their operations in Chechnya for a contribution to a forthcoming book edited by Asaf Siniver here at UoB. It's a tricky field to get a handle on, given that a lot of it is fairly arcane and deliberately hidden. That said, there are some contributions to political science and IR (especially security studies) that can be drawn out of it. In that chapter, we basically wanted to highlight the use of information warfare techniques as part of a counter-insurgency campaign against groups that were themselves fairly technically literate, and the combination of information attacks with physical attacks of the traditional lethal kind.

ESRC / Surveillance Studies Network Seminar Series: The Everyday Life of Surveillance

(I can't make this, but it is free to attend. Contact Kim McCartney - kim.mccartney@ncl.ac.uk)

Seminar 5: Architectures, Spaces, Territories September 1st, 2009 @ Culture Lab, Newcastle University, Newcastle upon Tyne, UK.
http://www.ncl.ac.uk/culturelab/

9.00am registration for a 9.30am start; finish by 4.30pm.


Outline:

The fifth seminar in the series will bring together both the ‘virtual’ (computers, telephones and the Internet) and the ‘material’ (buildings, neighbourhoods and cities). It will think about how these are increasingly merging and being subject to surveillance in the same or similar ways, as computing is built into everything, including, potentially, ourselves. The day will concentrate on the spatial and territorial aspects of surveillance in a world of global flows of people, things and information, and of pervasive computing technologies. This will bring together both virtual and material ordering in consideration of ideas of speed, post-territoriality, code, protocol and so on. It will cover forms of monitoring and control as ways of shaping the physical and virtual architecture and landscape (or flowscape) of private and public realms at multiple scales.

The seminar will consist of 3 dialogues between an exciting line-up of 5 invited speakers and the host, Martyn Dade-Robertson, Lecturer in Architecture at the School of Architecture, Planning & Landscape, Newcastle University, UK.

Speakers:

Malcolm McCullough. Associate Professor in the Taubman College of Architecture and Planning, University of Michigan, USA. Malcolm is an architect and author of Digital Ground: Architecture, Pervasive Computing and Environmental Knowing (2004, MIT Press). http://www-personal.umich.edu/~mmmc/

Jaime Allen. Lecturer in Digital Media and Deputy Director of Culture Lab, Newcastle University, UK. Jaime is a new media artist and developer whose work can be seen at http://www.heavyside.net/

Martin Dodge. Lecturer in Human Geography in the School of Environment and Development, Manchester University, UK. Martin is the author of several books on the mapping of virtual spaces, including (with Rob Kitchin) The Atlas of Cyberspace (Addison-Wesley, 2001) and is now interested in mapping data shadows.

Nikki Green. Senior Lecturer in the Sociology of New Media and New Technologies in the Department of Sociology, University of Surrey, UK. Nikki is a sociologist of communications technologies, has worked on projects with BT and Intel, and is author (with Leslie Haddon) of Mobile Communications (Berg, 2008). http://www.soc.surrey.ac.uk/staff/ngreen/index.html

Marc Langheinrich. Assistant Professor in Computer Science at the Università della Svizzera Italiana (USI) in Lugano, Switzerland. Marc is a former developer who was involved in the Disappearing Computer initiative and the EU's Safeguards in a World of Ambient Intelligence (SWAMI).

Lessons from the Identity Trail

A fantastic little discovery this morning - I found that the book 'Lessons from the Identity Trail: Anonymity, Privacy and Identity in a Networked Society' is downloadable, chapter by chapter, under a Creative Commons licence.

Alternatively, you could buy the tome. This is a pretty substantial research project.

During the past decade, rapid developments in information and communications technology have transformed key social, commercial, and political realities. Within that same time period, working at something less than Internet speed, much of the academic and policy debate arising from these new and emerging technologies has been fragmented. There have been few examples of interdisciplinary dialogue about the importance and impact of anonymity and privacy in a networked society. Lessons from the Identity Trail: Anonymity, Privacy and Identity in a Networked Society fills that gap, and examines key questions about anonymity, privacy, and identity in an environment that increasingly automates the collection of personal information and relies upon surveillance to promote private and public sector goals.

This book has been informed by the results of a multi-million dollar research project that has brought together a distinguished array of philosophers, ethicists, feminists, cognitive scientists, lawyers, cryptographers, engineers, policy analysts, government policy makers, and privacy experts. Working collaboratively over a four-year period and participating in an iterative process designed to maximize the potential for interdisciplinary discussion and feedback through a series of workshops and peer review, the authors have integrated crucial public policy themes with the most recent research outcomes.

Monday 3 August 2009

(pre)Iraq cyberwar plans

http://www.nytimes.com/2009/08/02/us/politics/02cyber.html?ref=technology