The Application and Management of Personal Electronic Information

Posted: November 10th, 2009 | No Comments »

Recently, the First International Forum on the Application and Management of Personal Electronic Information, organized by the MIT SENSEable City Lab, gathered stakeholders from multiple disciplines to share perspectives on the issues surrounding the application and management of personal electronic information:

The goal of this forum is to explore the novel applications for electronic data and address the risks, concerns, and consumer opinions associated with the use of this data. In addition, it will include discussions on techniques and standards for both protecting and extracting value from this information from several points of view: what techniques and standards currently exist, and what are their strengths and limitations? What holistic approaches to protecting and extracting value from data would we take if we were given a blank slate?

Luckily, many of the position papers and presentations are now online.

Several contributions look at other fields, such as health care, to draw best practices for the storage and mining of personal logs. In particular, in Engineering a Common Good: Fair Use of Aggregated, Anonymized Behavioral Data, Nathan Eagle argues for the necessity of a set of standardized protocols for behavioral data acquisition and usage, to preserve both individual privacy and the value to the community. Nathan has been analyzing behavioral data from mobile phone operators to help epidemiologists model human movement and support the allocation of malaria eradication resources in Kenya. With similar data, he supported planners in Kigali in quantifying the dynamics of slums and the social impact of previous policy decisions, ranging from road construction to the placement of latrines (see Artificial Intelligence for Development). Still, two major issues remain in the use of these data, even when anonymized and aggregated:

  • Deductive disclosure: the nature of behavioral data is such that very few observations are required to deduce the identity of an individual. This issue is overcome to some extent by strict data-sharing protocols that ensure the data cannot be released to the general public. Other strategies may apply to some extent as well (see On Locational Privacy, and How to Avoid Losing it Forever and Jon Reades’ Using Finite State Machines to preserve privacy while data mining the cellular phone network)
  • Data retention and erasure: the inability of individuals to remove their data from these aggregate datasets. Good practices can be drawn from the medical community, which pushes for legislation enabling individuals to own their personal health records to prevent this type of exploitation. Similarly, there is also pressure for legislation on the ownership of personal behavioral data, providing individuals with the right to access and remove their data from corporate databases, enabling them to ‘opt out’ of any type of analysis. This leaves me wondering to what extent opting out impairs the quality of the data.
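To give a rough sense of the deductive disclosure problem, here is a minimal sketch (with entirely made-up traces and tower names) showing how knowing just a couple of places a person visited can already single out one individual in an “anonymized” dataset:

```python
# Hypothetical anonymized dataset: pseudonym -> set of visited cell towers.
traces = {
    "u1": {"A", "B", "C", "D"},
    "u2": {"A", "B", "E", "F"},
    "u3": {"A", "C", "E", "G"},
}

def users_matching(observed):
    """Return the pseudonyms whose trace contains every observed tower."""
    return [u for u, towers in traces.items() if observed <= towers]

# One shared tower still leaves ambiguity...
print(users_matching({"A"}))        # all three users visit tower A
# ...but two towers already pin down a single individual.
print(users_matching({"B", "C"}))   # only u1 visits both B and C
```

If an adversary knows two locations their target frequents, the pseudonym is effectively de-anonymized, which is why aggregation alone is a weak protection.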

Despite the necessity of rigorous data-sharing protocols, Eagle also considers that behavioral data can be treated as a form of intellectual property.

The behavioral IP of an individual should be owned by that individual, and licensed to third-parties for a fee if desired. The behavioral IP of a society should be considered as a valuable public good.

This certainly raises new questions about the applicability of this proposal (e.g. who determines the fees? who has access for free and who does not? how do we finance the efforts that transform data into a valuable public good? are the developed algorithms also a public good?). In addition to discussing the IP of the data, I often argue for the necessity of transparent processes in which everybody is aware of the mechanisms used to generate the information (see my World Information City Doggie Bag).

On that very aspect of data processing (and its transparency), I was intrigued by the intervention of Trevor Hughes (Executive Director, International Association of Privacy Professionals) on “Data Environmentalism”, arguing that we should focus less on “notice and choice” (fair information practices) and actually put our efforts into securing data, data flows, and legitimate use, to the point of developing indicators of trust and transparency.

Another aspect of the exploitation of personal electronic information lies in the dream of the perfect technology and the myth of the perfect power (see Stephen Graham at World Information City). It is one of the themes that Aguiton et al. cover in their contribution Living Maps: New Data, New Uses, New Problems, quoting Bruno Latour in Paris: Invisible City:

Megalomaniacs confuse the map and the territory and think they can dominate all of Paris just because they do, indeed, have all of Paris before their eyes. Paranoiacs confuse the territory and the map and think they are dominated, observed, watched, just because a blind person absent-mindedly looks at some obscure signs in a four-by-eight meter room in a secret place.

On the application front per se, it is well worth checking the recent research of Skyhook Wireless on their own data (Aggregated Location Requests) to perform time/space-based analysis, frequency/phase-domain extraction and baseline/anomaly detection.
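Baseline/anomaly detection on this kind of data can be sketched very simply (this is not Skyhook’s actual method, just an illustration with made-up hourly counts): flag any point that deviates strongly from a trailing baseline of recent observations.

```python
import statistics

# Hypothetical hourly counts of location requests for one area.
counts = [100, 110, 95, 105, 98, 102, 400, 99, 101]

def anomalies(series, window=5, threshold=3.0):
    """Indices of points deviating from the trailing baseline by
    more than `threshold` standard deviations."""
    flagged = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mean = statistics.mean(baseline)
        sd = statistics.stdev(baseline) or 1.0  # guard against flat baselines
        if abs(series[i] - mean) / sd > threshold:
            flagged.append(i)
    return flagged

print(anomalies(counts))  # [6] — the spike to 400 requests
```

Real deployments would use longer windows, seasonality-aware baselines (time of day, day of week) and the frequency/phase-domain features mentioned above, but the baseline-plus-deviation idea is the same.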

Talk at Lift@Home in Geneva

Posted: November 10th, 2009 | No Comments »

Yesterday, I delivered my last formal talk of the year at the Lift@Home session on “Urban informatics / Les nouveaux paysages numériques” (“the new digital landscapes”), organized by Nicolas Nova in the Lift Conference premises in Geneva. This event was part of the urban informatics workshop series Nicolas and I have been running. I played the role of the utilitarian to engage the audience on the potential benefits of exploiting the logs of digital activities in our contemporary cities. My established spiel was enhanced with some insights from a recent study of crowd dynamics in the Puerta del Angel/Rambla area in Barcelona. As usual, the slides of “L’analyse urbaine à partir des activités numériques” (“Urban analysis from digital activities”) are online for your downloading pleasure.

It was a pleasure to finally tag-team with Boris Beaude from EPFL, who brought his geographer’s reading of the notion of digital spaces and the maps they entail (read “Internet, un lieu du Monde” in the book L’invention du Monde, and see his courses Enjeux politiques de la géographie at Sciences Po and Théorie de l’espace at EPFL). His insights help raise the kind of reflexive awareness needed to reduce the effect of map designers’ personality and background on what is finally produced (see his recent paper Crime Mapping, ou le réductionnisme bien intentionné). He delivered a compelling argument on the reductionism of crime map visualizations, highlighting the classic misleading error of calculating the density of a phenomenon without accounting for the density of residents. Furthermore, these representations rely on citizens’ declarations, while it is well known that the most dangerous areas of a city are those where people fear reporting crimes. Among other issues, this calls attention to the lack of critical thinking about what this information actually tells us, and about who is responsible for the mishandling and misrepresentation of the data.
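Boris’ point about density can be illustrated in a few lines (the district figures are entirely hypothetical): a raw-count map would highlight the Centre, while normalizing by residents shows the Suburb’s rate is actually twice as high.

```python
# Hypothetical districts: raw crime counts vs. resident population.
districts = {
    "Centre": {"incidents": 300, "residents": 60000},
    "Suburb": {"incidents": 120, "residents": 12000},
}

def rate_per_1000(d):
    """Incidents per 1,000 residents — the normalization a raw-count
    crime map omits."""
    return d["incidents"] / d["residents"] * 1000

for name, d in districts.items():
    print(name, d["incidents"], "incidents,", rate_per_1000(d), "per 1,000")
# Centre: 300 incidents but 5.0 per 1,000; Suburb: 120 incidents but 10.0 per 1,000.
```

Even this normalization is crude: resident counts ignore commuters and visitors, so a daytime-population denominator would change the picture again.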

Lift Workshop @ Lift office
Boris Beaude at the improvised cabaret in the Lift Conference premises

The third speaker, Pascal Wattiaux, discussed the role of technologies in the production of the Olympic Games. Each project runs for at least 10 years, with each candidacy strongly embedded into the city planning, compressing 30 years of development into roughly 7 years with no escape and a constant acceleration and organizational ramp-up (growing from 350 to 150,000 people in a few years). The games experience goes from the preparation of the games, through their production, to their legacy. It must be in sync with the expectations of the various stakeholders (public, athletes, workforce, sponsors, municipal, regional and state governments, etc.).

In that unique context, technologies constantly offer both revenue opportunities and cost savings. However, with the constant evolution of technologies, it is hard to build “best practices”; organizers therefore report on “best experiences” instead.

Nowadays, there are opportunities in the analysis of the spatial dynamics of the organization: it could improve spectator management (the stadium needs to be full; it is a question of image), reduce the number of volunteers needed, or help organize emergency operations with staff with specific language competences.

How do we Avoid a Digital Dump in our Backseat?

Posted: November 4th, 2009 | No Comments »

So yes, cities are all about difficulty. Most of the urban system designs and scenarios out there aim to reduce their complexity by adding layers of technologies and information (see The “Quants”, their Normalizations and their Abstractions). But we will never design complication out of the world, and certainly not with all the kinds of instruments, practices and objects we develop. I tried to make that case in Embracing the Real World’s Messiness and in Sliding Friction: The Harmonious Jungle of Contemporary Cities. In their recent Situated Technologies pamphlet, Julian Bleecker and Nicolas Nova make this argument quite nicely:

The idea of a ubiquitously computing urban setting where everything functions perfectly won’t work. We don’t even have to give the technical reasons why, we can rely on the history of failures as one often does, the things that are too often forgotten about but provide the richest set of materials for design and, despite this, are almost never considered.

At best, the difficulties will shift. So is the design of urban systems about reducing complexity or about making cities less intimidating? In an inspiring “melt up”, Adam Greenfield went off script and argued for the latter. Urban systems are about giving more visibility to engage citizens, to have them a little better prepared to understand the complexity and, at best, to participate in the conversation that is the city. The design goals are both very humble and yet extremely challenging to reach. They will require us to rethink the way we do design. That is, user-centered design is not enough and we have to go out and get dirty (e.g. practicing urban scouting, confronting practices, or, as Adam would say, “Go beyond the safety nets of the practices we use”).

But even with perfect designs, the city has no guarantee of perfect outcomes: citizens do not appropriate the resources in an equal manner, and some take advantage of the systems and have the ability to break the rules. So, as an attendee asked at Adam’s talk, “Can we avoid a digital dump in our backseat?”. I have no real answer to that (great!) question, but it seriously questions the way that we contextualize, design and plan the integration of urban systems into contemporary urban environments. In the series of workshops on urban informatics that Lift Lab leads, we often ask participants to “criticize” their scenarios/interventions with considerations of the (basic) implications for the different stakeholders (who “wins” and who “loses”?). One outcome of our recent workshop in Cornellà proved that it takes a long effort to go beyond the pretty and inclusive designs of urban systems, or the scenarios that discard the nasty elements that are an integral part of urban life. It was not until the very end of the workshop that conflicting debates emerged.

Why do I blog this: near-future networked and digital cities are also about: Brussels’ digital garbage collectors going on strike, an alarming rate of digital syllogomania among the registered citizens of São Paulo, Google fined by the EU for their open data spillage of Amsterdam, and Tokyo’s mayor having to resign for sensor data smuggling. The contemporary Paris ideal of bicycle-sharing meeting reality is as yet only a weak signal.

Even with a perfect design, the city offers no guarantee of perfect outcomes

The "Quants", their Normalizations and their Abstractions

Posted: November 3rd, 2009 | No Comments »

In the latest Situated Technologies pamphlet, “A synchronicity: Design Fictions for Asynchronous Urban Computing”, Julian Bleecker and Nicolas Nova discuss the notion of “real-time cities”, shifting the discussion away from the hygienist model of efficiency towards unscripting the unexpected and cultivating the unusual. In a world of “open data initiatives” and “smart cities”, I have a lot of sympathy for their discourse, which considers that computing in an urban setting should not be about data and algorithms, but about people and their activities. They critique the hold of quants on the representation of the city, questioning whether “our relationship to the spatial environment should only be based on statistical analysis or mediated by computations”:

One characteristic of these sorts of mass city visualizations is that they operate at an abstract level and normalize the individual, averaging out all the atomic units—the people—of contemporary cities. Another dimension that is lost is the history and culture, which are not part of these representations.

Of course, the “quant” failure in the financial markets makes this idea of our reliance on spreadsheets, quantification and computation even more poignant.

And these numbers guys on Wall Street—the “quants”—were going berserk with their numbers. They were creating such byzantine computational number-crunching algorithms that no one knew how it all worked. The quants, with their theoretical mathematics PhDs, had so divorced themselves with their abstracting tier of calculation that it all was destined to collapse.

In my thesis, I intended to downplay the role of data and the sole reliance on data scientists, arguing for mixed quantitative and qualitative approaches to capture urban dynamics and support the design of urban services (see The Other Point of View). It also implies integrating other practices to question the hold of engineers, accountants and architects on the design of our cities. Julian and Nicolas use the following terms:

I suppose this is where designers could participate if they sat at the same table as the engineers and accountants and brought additional sensibilities that can vector interpretations and semantics differently, away from the up-and-to-the right graphs of instrumental progression to bigger, faster and cheaper.

Analysis of Visitors from their Digital Activities

Posted: November 3rd, 2009 | 1 Comment »

Last week I was in Donosti-San Sebastian to give a short presentation of my research at the First International Conference on the Measurement and Economic Analysis of Regional Tourism. In the session “New Instruments for Measuring and Modelling Tourism Flows”, I delivered my classic spiel “Analysis of Visitors from their Digital Activities”, which covered:

  1. the ability to reveal aspects of visitors experience of a city/region from their digital activities
  2. the opportunities to evaluate urban strategies

I have added some notes and references to augment my slides.

Prior to my talk, Carlos Arce provided a complete overview of the new instruments and techniques to measure travel behaviors, mentioning the battles in persuading people and organizations to participate in surveys and the necessity to “sell better” the value of these kinds of analyses (impact, opportunity and efficiency; for special-needs populations or special areas such as eco-tourism).

Judging from the other presentations, it seems there are not many innovations that can surpass the power of paper and pencil to measure travel behaviors. Back in the Simpliquity days, we took inspiration from this traditional technique to develop a very simple technological solution for Detecting air travel to survey passengers on a worldwide scale. This approach contrasted with the quest for perfect data in which some statisticians seem to lose themselves, some requesting a mandatory Galileo reporting system for each vehicle in operation in the EU. I mean, Europe can be more creative than that! Fortunately, those statisticians do not seem well-armed with consistent arguments to obtain what they do not have, considering the barriers they already face (privacy, propriety, silos, data quality, evaluation of their models). I had particularly expected more discussion of the transformation of measures and analyses into policies and strategies (and their evaluation).

Last week, Nicolas was also invited to give a keynote address on the near future of tourism services based on digital traces.


Thanks to CICtourGUNE and particularly to Ibon for the invitation!

It seems our work has very recently inspired others: Explorando otras fuentes de datos: Flickr y el turismo (“Exploring other data sources: Flickr and tourism”) and Redes sociales y turismo: flickr + Canarias (“Social networks and tourism: Flickr + the Canary Islands”).