Friday, October 17, 2003

UbiComp Reflections I - context awareness and privacy

A recurring claim is that security and privacy are central to viable ubicomp. Since security requires the collection of huge amounts of data - think what is necessary for user authentication alone - we are faced with questions of how best to collect, store and administer these data. For example, who will have access to this information? How long will it be stored?

While these are certainly questions of social importance, I don't think it is reasonable - or responsible - to continue developing and deploying these technologies with the attitude that these are purely "social issues" or "policy problems" that have nothing to do with the research and development of ubiquitous computing.

Technology does not exist separately from these issues and problems, and neither do those issues and problems arise only in the use of these technologies. We need to find ways to integrate these concerns into the research and design processes ...

Some interesting technological takes on context-awareness and privacy discussed at the conference:

Tim Kindberg and Kan Zhang (HP Labs) are working on using the physical properties of lasers to constrain communication channels, allowing a validated binding of encryption keys to devices for spontaneous and selective secure interactions. Neat-o.
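The general idea behind such physically constrained channels can be sketched in a few lines. This is not Kindberg and Zhang's actual protocol - just a toy illustration, with made-up function names, of how a short commitment sent over a location-limited channel (the laser) can validate a key received over an open wireless channel:

```python
import hashlib
import os

# Toy model: a physically constrained channel (e.g. a laser aimed at one
# specific device) can only be read by the intended receiver, so a short
# commitment sent over it can validate a key that arrives over an open
# wireless channel.

def make_key_and_commitment():
    """Sender: generate key material (stand-in for a real public key)
    and a short digest to send over the constrained channel."""
    public_key = os.urandom(32)                       # placeholder key material
    commitment = hashlib.sha256(public_key).digest()  # short, hard to forge
    return public_key, commitment

def validate(key_from_wireless, commitment_from_laser):
    """Receiver: accept the wireless key only if it matches the digest
    received over the laser channel."""
    return hashlib.sha256(key_from_wireless).digest() == commitment_from_laser

key, commit = make_key_and_commitment()
assert validate(key, commit)                  # honest exchange: accepted
assert not validate(os.urandom(32), commit)   # substituted key: rejected
```

Because the laser beam physically selects the receiver, a nearby eavesdropper on the wireless channel cannot substitute her own key without also forging the commitment.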

Jeffrey Heer and colleagues in the Group for User Interface Research at Berkeley have developed liquid, a system for context-aware distributed queries. They started from the premise that context data (people, places, objects, activities) are highly distributed and dynamic, and chose to push functionality from applications into the infrastructure. Borrowing from the database community, liquid relies on distributed streaming queries and dynamic query re-routing in response to changing contexts. IMHO, this is a really interesting example of social computing: not only did they refuse to model context as if it were immutable, they collaborated with other researchers. Well done.
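To see what "dynamic query re-routing" means in practice, here is a toy sketch - not liquid's actual API, and with hypothetical place names and feeds - of a standing query whose results follow a person as their context changes:

```python
# Toy sketch (not liquid's implementation): a standing context query that
# is re-routed to a different data source when the queried entity's
# context changes, instead of assuming context is fixed at query time.

class StandingQuery:
    def __init__(self, person, sources):
        self.person = person
        self.sources = sources   # sensor feeds keyed by place (hypothetical)
        self.route = None        # current source the query is routed to

    def on_context_change(self, new_place):
        """Re-route the query: subsequent results come from the new place."""
        self.route = new_place

    def poll(self):
        """Return the latest result from whichever source we're routed to."""
        if self.route is None:
            return None
        return self.sources[self.route](self.person)

# Hypothetical sensor feeds for two places.
sources = {
    "office": lambda p: f"{p}: at desk",
    "cafe":   lambda p: f"{p}: away from office",
}

q = StandingQuery("alice", sources)
q.on_context_change("office")
print(q.poll())   # alice: at desk
q.on_context_change("cafe")
print(q.poll())   # alice: away from office
```

The point of the sketch is the design choice liquid makes: the routing decision lives in the infrastructure, so every application issuing such a query gets the re-routing behaviour for free.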

But my favourite example of responsible computer science came from Carman Neustaedter and Saul Greenberg at the GroupLab, University of Calgary. They developed a privacy-aware camera system for home office workers by working with existing everyday privacy practices. This may seem obvious or overly simple, but these guys really respect people: their technology built on - instead of interfering with or seeking to replace - verbal and non-verbal behaviours, and environmental and cultural mechanisms. And the icing on the cake came during the question period:

Q: Why didn't you use two cameras?
A: Because people don't like lots of cameras pointed at them.
Q: Why didn't you hide them then?
A: Because we have a moral obligation not to do things like that.


Looking more closely along these lines:
Workshop on Ubicomp communities: privacy as boundary negotiation
Towards a Deconstruction of the Privacy Space (pdf), Scott Lederer, Jen Mankoff, Anind Dey
When Trust Does Not Compute: The Role of Trust in Ubiquitous Computing (pdf), Marc Langheinrich

Update: Karen Lancel's Agora Phobia (digitalis) art project is also very cool: it "invites the audience in a halftransparent, inflatable ISOLATION PILLAR / Free Zone. The Isolation Pillar is placed in crowded, public places like square, museum, theatre and is big enough for 1 computer and 1 person. Agora Phobia (digitalis) invites you in the Isolation Pillar / Free Zone to participate in an internet-dialogue with someone who lives isolated somewhere else, like: someone living in prison, someone who lives in a cloister, a digipersona, a pilgrim, a 'prisoner of war' (POW), somebody dealing with agora phobia."




CC Copyright 2001-2009 by Anne Galloway. Some rights reserved. Powered by Blogger and hosted by Dreamhost.