The Opaque Smart Grid

Posted: January 3rd, 2010 | No Comments »

The recent fiasco of PG&E's smart grid installation in California provides some valuable insights on the integration of real-time metering of urban activities and people's appropriation of the information fed back to them. Indeed, one of the advantages of a smart grid is its two-way flow of information: the meter communicates home energy consumption and, in return, utilities can alert customers to real-time electricity pricing (see ‘Smart meters’: some thoughts from a design point of view). Yet, it seems that this mechanism failed in Bakersfield, CA, where residents complain that the meters are logging far more kilowatt-hours than they believe they are using. This is potentially due to technical reasons and certainly to the lack of transparency in the design of the system.

First, there is a lag between the installation of smart meters and the deployment of the in-home network that provides value to the consumer. The lag is so great that consumers start to feel disenfranchised. Indeed, “currently there are no in-home energy management displays or dashboards accompanying the new smart meters. Customers have no way to know how much their energy usage is costing in real time and… the utility does have plans to install these in the future” (see PG&E smart meter problem a PR nightmare).

Then, the system does not communicate its new rules. If customers do not shift demand to off-peak times when rates are lower, as argued by PG&E (see PG&E smart meter communication failure – lessons for the rest of us), then it means that the system fails to communicate the value of shifting demand or the times when rates are lower.
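To make the missed message concrete, here is a minimal sketch with purely hypothetical tariffs and usage figures (not PG&E's actual rates): the same daily consumption reads as a price hike under time-of-use rates until part of it is shifted off-peak.

```python
# Hypothetical rates and usage: why shifting demand matters, and why an unshifted
# customer experiences the new tariff as "the meter charges me more".

FLAT_RATE = 0.18      # $/kWh, hypothetical flat tariff
PEAK_RATE = 0.30      # $/kWh, hypothetical peak tariff (e.g. 12:00-18:00)
OFF_PEAK_RATE = 0.10  # $/kWh, hypothetical off-peak tariff

def daily_cost(peak_kwh: float, off_peak_kwh: float, time_of_use: bool) -> float:
    """Cost of one day of consumption under a flat or a time-of-use tariff."""
    if not time_of_use:
        return (peak_kwh + off_peak_kwh) * FLAT_RATE
    return peak_kwh * PEAK_RATE + off_peak_kwh * OFF_PEAK_RATE

# 20 kWh per day, first consumed mostly on-peak, then partly shifted off-peak.
flat = daily_cost(peak_kwh=14, off_peak_kwh=6, time_of_use=False)
before = daily_cost(peak_kwh=14, off_peak_kwh=6, time_of_use=True)
after = daily_cost(peak_kwh=6, off_peak_kwh=14, time_of_use=True)

print(f"flat tariff:           ${flat:.2f}")    # $3.60
print(f"time-of-use, no shift: ${before:.2f}")  # $4.80 -- feels like a price hike
print(f"time-of-use, shifted:  ${after:.2f}")   # $3.20 -- the value left uncommunicated
```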

Finally, the rolled-out system is opaque in communicating its state. As exemplified in this tweet, the design does not account for failures and user inquiries: “I’m waited for PG&E to put up the daily usage numbers, I won’t get those until next month for some unexplained reason“.

A near-future evolution of smart grids, in a perfect internet of things world, is that consumers can “set and forget”, delegating the constant monitoring to the devices themselves. Appliance makers such as General Electric and Whirlpool are developing smart appliances capable of doing the monitoring on behalf of their owners. At what level will this extra layer of automation disenfranchise or empower us? How will the practice of organizations controlling hard infrastructure integrate the specificities of soft infrastructures?

Why do I blog this: Fascinated by the fact that the roll-out of a “smart grid” system, meant to empower both energy consumers and producers, left people feeling disenfranchised. The organizations that have controlled hard infrastructure for decades still have a lot to learn about designing their new, internet-of-things-powered soft infrastructures. Beyond communicating accurate data, they are expected to make their overall process (e.g. data collection and handling) more transparent. The Bakersfield fiasco is an example of the new frictions PG&E and the like will learn from as they get closer to people, metering their consumption on an hourly basis and feeding information back to them in real time. What kind of friction will occur when a governmental institution gets similarly more deeply involved in soft infrastructures, such as the vehicle-tracking initiative in Holland?

-173C
A harmless erroneous temperature reading


The Cityscape as a Spectacle

Posted: December 28th, 2009 | No Comments »

The Cityscape as a Spectacle (@ Mirablau)
Mirablau, at the bottom of Tibidabo, offers a spectacular view over Barcelona. At sunset, the lights are softened so one can contemplate the changing colors of the city.

Why do I blog this: Working on a text on data cities and visualization. In my work, I use visualization to play with this fascination for macro views of city dynamics. Most cities offer observation decks, be they natural or man-made. They complement the citizens’ mundane micro-observations of atomic-level city dynamics (e.g. planes, road traffic, construction sites, …)

Plane spotting truck spotting the art of spotting


The Role of Architecture at the Time of Urban Informatics

Posted: December 28th, 2009 | 4 Comments »

The increasing presence of soft infrastructures embedded within the urban fabric is altering our experience of the city. Simultaneously, the manipulation and processing of the underlying data generated by networked and sentient systems offers new possibilities for architecture. Yet, there seems to be a gap in the practice of designing buildings and spaces in concert with their informational membrane; a practice that understands what makes good cities tick and knows the roles informatics can (and cannot) play; a practice between the architecture of construction, the architecture of information and industrial/experience/interaction design. The gap seems so wide that large software and hardware corporations are the sole actors attempting to project their visions and deploy their case studies of smarter/sentient/responsive cities. This year’s Toward the Sentient City exhibition, organized by the Architectural League of New York, has been an attempt to fill the void and look at possible future trajectories for architecture at the time of urban informatics. In his constructive criticism of the show, Dan Hill highlighted how far architecture has to go to stay relevant in the development of ’sentient cities’:

Architecture and urban design should be in this debate, no doubt, but its entire practice, sensibility and economic model may need redressing (as with many other fields, of course.) Given their previous predilections, the lack of technical and conceptual understanding – never mind an apparently congenital inability to design a decent website – the profession has a long way to go before it can demand a seat at the table. An admittedly fading tradition of thinking of itself as the ‘master builder’ needs to be entirely excoriated once and for all.

In a recent keynote address entitled “How can architects relate to digital media?”, Mobile City’s Michiel de Lange and Martijn de Waal urged a parterre of young architects to “relate to digital media in a new way, beyond merely using them as instruments, to represent their spatial logic in design, or to design for virtual worlds“. They lay out a couple of new directions in the evolution of architecture as a practice:

First, we already witness that the profession is flexibly adapting itself to new circumstances. Architecture is moving in the direction of what has been called ‘service design’. This means that a client hires a ‘designer’ not to just build him a beautiful building, but to shape a particular process or ‘customer (or ‘citizen’) experience’ from start to end. The question is how can these two structures – physical situations and media practices – be combined to design for urban experiences in meaningful ways? Surely this question cannot be solved by architects alone.

Second, architects harness spatial expertise that can steer future directions of new media. Digital media developments are increasingly being integrated with geographical space, physical context, and the material world (labelled geo-spatial web, locative media, the internet of things, and so on). We think it is important that architects play a role in the debate about the values that are implied in such media designs.

Architects can contribute crucial insights, particularly on the non-digital modes of design for human experience (see Responsive Environments), the kind of insights grounded in historical context that other practices fail to grasp. For instance, Adam Greenfield recently discussed the ahistoricity of interaction design:

Let’s face it: brighter and more sensitive people than us have been thinking about issues like public versus private realms, or which elements of a system are hard to reconfigure and which more open to user specification, for many hundreds of years. Medieval Islamic urbanism, for example, had some notions about how to demarcate transitional spaces between public and fully private that might still usefully inform the design of digital applications and services. By contrast, the level of sophistication with which those of us engaged in such design generally handle these issues is risible (and here I’m pointing a finger at just about the entire UX “community” and the technology industry that supports it).

Why do I blog this: Currently helping to set up an event that looks at the new roles of architecture and urbanism in the networked cities landscape. I am particularly fascinated by the gap formed by the lack of technical and conceptual understanding of many architects on the one hand, and the scant presence of historical context and acquaintance with non-digital modes of design in designers’ and engineers’ practices on the other. A serious need for more T-shaped people?


Barcelona, Reframing its Model as a Networked City

Posted: December 23rd, 2009 | No Comments »

Barcelona has a long and unique urban tradition, often described as the “Modelo Barcelona” (see El Modelo Barcelona: un examen crítico), that features a capacity to treat and revitalize central urban space with interventions at the scale of streets and squares, mixed with large urban projects that favor the density and compactness of urban form (see Barcelona: The Urban Evolution of a Compact City and the Barcelona Regeneration Model) and affect the city as a whole. The most apparent traces of its application are the garden-city expansion of 520 street blocks planned as long ago as 1859 by Ildefons Cerdà (Plan Cerdà), now forming the Eixample district, and the use of the Olympic Games as a vehicle for city-wide reforms over four neglected urban areas. In the past decade, Barcelona has undergone a new wave of major transformation with the 22@ innovation area and the Diagonal Mar hyper-community constructed on coastal brownfield and reclaimed land. These evolutions have faced major criticisms, to the point of wondering whether Barcelona was losing touch with its model, which beyond strict urban planning is also associated with a focus on civil society as the leading dynamic in the city’s evolution. For instance, Josep Lluis Mateo of the newly created Barcelona Institute of Architecture (BIArch) claims that the history of the practice of architecture in Barcelona is more advanced than the “Modelo Barcelona” and its recent use of iconic architecture, suggesting that forms matter less than space, material, light, sensations and logic (see “Reinventaremos el modelo Barcelona“).

While in the past architects and city planners might have argued over and developed the model of Barcelona, new actors are emerging as driving forces in the evolution of the city, its infrastructure and quality of life. For instance, the Innovation and eGovernment Department at Barcelona City Council has unveiled its “Smart City” model for using information and communication technologies to improve residents’ quality of life and ensure more efficient and sustainable maintenance and management of big cities. Even if their “formula” is still fresh and goes through constant evolution, it has reached enough maturity to contrast with the very developed discourses on networked cities in Asia (with New Songdo as showcase) and the recent fascination of North America with open data initiatives (some carrying a similar tone of naiveté as the first wave of “Muni WiFi” projects).

In his presentation “Barcelona Smart City” (slides), Joan Batlle highlights the articulation of the four major connected elements of their model:

  • Ubiquitous infrastructures: city network, citizens’ access network (to communicate and create information). Examples: the Muni WiFi mesh network (680 nodes, 20 services, 500 free hotspots).
  • Information: sensors, digital footprints, citizens’ information (the raw material of the innovation factories). Examples: BitCarrier’s real-time traffic monitoring system, my visualizations of Bicing and Flickr data, aggregated mobile network traffic data. Joan rightfully asks: “Can we use Directive 2006/24/EC to devolve citizens’ information to citizens? …and allow them to make services for the citizens (from the citizens)?”
  • Living labs, open innovation: citizens, companies, universities, city council. Examples: the Urban Labs at 22@ as a testing space for innovative enterprises, the Living Labs with the Media-TIC building.
  • Smart services: advanced municipal services, services for citizens from citizens. Examples: iBicing for iPhone, the Urban Mediator, one-“click” services.

It will certainly be interesting to participate in the integration and interplay of this kind of Networked City model (extended by other initiatives such as Citilab) within the urban planning history of “Modelo Barcelona” that shows how the city has overcome major contradictions. Barcelona has the opportunity to be a leading city in that domain, with a sensitivity for its citizens, civil society and networks (a legacy of Manuel Castells?) over the concrete and the forms. It contrasts with many other European cities such as Paris and its famous Greater Paris design competition and Christian de Portzamparc’s project on efficiency and speed featuring some sort of paleo-futuristic Monorail.

Why do I blog this: My presence in Barcelona is partially the fruit of the 22@ urban project, because I preferred to pursue a PhD within the living-lab, messy aspects of Barcelona rather than in the green alleys of a traditional university campus. In return, my research projects are now presented as ground for potential futures of the city. This is utterly rewarding and, of course, it encourages me to intensify my investigations in Barcelona.

I consider cities to be “smart” by default (isn’t the city humanity’s greatest invention?), so I do not support the contemporary discourse that information technologies and infrastructures will make them any smarter, also considering that IQ tests for cities still need to be developed. However, I do believe they can help, with the support of proper models and processes, in making a city an even better place to inhabit.


Lift lab Now Also on the WWW

Posted: December 11th, 2009 | No Comments »

Earlier this year, Laurent Haug, Nicolas Nova and I co-founded Lift lab, an independent research agency that helps companies and institutions understand, foresee and prepare for changes triggered by technological and social evolutions. Since then, we have been very active developing our areas of action, from exploratory field studies to foresight research, application prototyping and event-building, in the domains of the Web and Internet, video games, mobile and location-based services, urban informatics and robotics/networked objects. Our freshly launched website exemplifies our services with case studies:

UNDERSTAND: We explore how people behave and interact with technologies in their environment, and use these insights to design better experiences. We rely on field research methodologies that enable clients to better understand their users. Case studies: McKinsey and ENSCI.

ASSESS: We assess innovation through product audits, reviews and testing, as well as field and desk research. We then develop a detailed assessment of the project at hand based on our expertise and the targeted needs. We finally suggest improvements and alternative solutions. Case studies: Swisscom and BitCarrier.

SHARE: Acquiring the right knowledge is the first step towards change, followed by spreading the word. We give lectures and run workshops on technology, innovation, design and social change. We also use our conference experience to organize private and public events for our clients. Case studies: TechnoArk and Alp ICT.

FORESEE: We map possible future changes to highlight new opportunities and prepare for them. We use futures research and tools to map emerging social and technological shifts. Case studies: Phoenix Studio, UBS and the French Ministry of Industry.

CREATE: We create instantiations of possible near future applications. Based on prototyping methods, we make product ideas or insights coming from field studies materialize. Case studies: Swisscom and BitCarrier.

Designer Maja Denzer did a perfect job conceptualizing and designing the site. It combines our own photos, focusing on the details of how technological instruments and people integrate in intriguing situations, with short sentences on “today and the future” that spark surprise, concern or curiosity.

lift lab's hompy. Dec 11. 2009

Our friends from Bread and Butter designed the logo with the great Akkurat typeface

Logo-Liftlab-Noir-Sur-Blanc-1


Upcoming Lift 2010 in Geneva

Posted: December 7th, 2009 | No Comments »


Lift lab’s partners Laurent Haug and Nicolas Nova have launched the new website of the upcoming Lift 2010 edition in Geneva. The event will revisit the myths about connected people:

Lift10 will explore the most overlooked aspect of innovation: people. Known in the techno-parlance as users, consumers, clients, participants, prosumers, citizens or activists, people ultimately define the success of all technological and entrepreneurial projects. They adopt or refute, promote or demote; embrace, reject, or re-purpose. Their approaches are unique, influenced by cultural and generational diversity. A decade after the rebirth of user-centered design and innovation, it’s time to explore the myths and uncover the reality behind the “connected people”.

Also check the current speakers roster and the program format/sessions:

Generations and technologies
How to go beyond the usual clichés on generations, with Seniors unable to benefit from technology and Millennials ruining their future careers on social networks?

The redefinition of Privacy
What is privacy in the 21st Century? Is personal security threatened by the massive collection of personal data?

Communities
Since 2006 Web 2.0 has celebrated the so-called “amateur revolution”. What did we learn in the past 5 years? Are we reaching the limits of Web 2.0?

Politics
Beyond the much talked-about political campaigns on Facebook, how to turn users into engaged citizens in public action?

The old new media
Newspapers are struggling, TV is not sure of what the future holds. What is at stake nowadays when informing, reaching and involving people?

For this edition, our good friends from Bread and Butter did a great job instantiating our theme in a proper and original graphic identity. As they explained on their weblog:

We tried to find a new way to represent the fragile balance between connected groups of people. We are all sometimes influencers, sometimes pirates and sometimes just an audience. Therefore the concept of a “mobile” seemed just the right transcription. Without saying that it also fits the Conference’s spirit and is easy to apply on all applications from website to stickers and from Marseille themes to korean’s alphabet.

The different steps from their generative metaphor:



The Application and Management of Personal Electronic Information

Posted: November 10th, 2009 | No Comments »

Recently, the First International Forum on the Application and Management of Personal Electronic Information, organized by the MIT SENSEable City Lab, gathered many stakeholders from multiple disciplines to share perspectives on the issues surrounding the application and management of personal electronic information:

The goal of this forum is to explore the novel applications for electronic data and address the risks, concerns, and consumer opinions associated with the use of this data. In addition, it will include discussions on techniques and standards for both protecting and extracting value from this information from several points of view: what techniques and standards currently exist, and what are their strengths and limitations? What holistic approaches to protecting and extracting value from data would we take if we were given a blank slate?

Luckily, many of the position papers and presentations are now online.

Several contributions look at other fields, such as health care, to draw best practices for the storage and mining of personal logs. In particular, in Engineering a Common Good: Fair Use of Aggregated, Anonymized Behavioral Data, Nathan Eagle argues for the necessity of a set of standardized protocols for behavioral data acquisition and usage that preserve both individual privacy and value for the community. Nathan has been analyzing behavioral data from mobile phone operators to help epidemiologists model human movement and support the allocation of malaria eradication resources in Kenya. With similar data, he supported planners in Kigali in quantifying the dynamics of slums and the social impact of previous policy decisions, ranging from road construction to the placement of latrines (see Artificial Intelligence for Development). Still, there are two major issues in the use of these data, even when anonymized and aggregated:

  • Deductive disclosure: the nature of behavioral data is such that very few observations are required to deduce the identity of an individual. This issue is overcome to some extent by strict data-sharing protocols that ensure the data cannot be released to the general public. Other strategies may apply to some extent as well (see On Locational Privacy, and How to Avoid Losing it Forever and Jon Reades’ Using Finite State Machines to preserve privacy while data mining the cellular phone network).
  • Data retention and erasure: the inability of individuals to remove their data from these aggregate datasets. Good practices can be gained from the medical community, which pushes for legislation enabling individuals to own their personal health records to prevent this type of exploitation. Similarly, there is also pressure for legislation on the ownership of personal behavioral data, providing individuals with the right to access and remove their data from corporate databases, enabling them to ‘opt out’ of any type of analysis. This leaves me wondering to what extent opting out impairs the quality of the data (the sketch after this list toys with both issues).
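As a toy illustration of both issues (invented observations, not Eagle's protocol nor any operator's actual pipeline), the sketch below aggregates visits per cell tower, suppresses any aggregate covering fewer than a minimum number of distinct people to limit deductive disclosure, and shows how an opt-out thins the resulting public good.

```python
# Minimal sketch: small-count aggregates can betray an individual, and opt-outs
# reduce the value of the aggregate. All names and numbers are made up.

from collections import Counter

# (user_id, cell_tower) observations, as a mobile operator might log them.
observations = [
    ("alice", "tower_A"), ("alice", "tower_B"),
    ("bob", "tower_A"), ("bob", "tower_A"),
    ("carol", "tower_C"),                      # carol is alone on tower_C
    ("dave", "tower_B"), ("dave", "tower_A"),
]

K_MIN = 2  # suppress any aggregate covering fewer than K_MIN distinct people

def aggregate(obs, opted_out=frozenset()):
    """Count visits per tower, suppressing cells that could identify someone."""
    obs = [(u, t) for u, t in obs if u not in opted_out]
    users_per_tower = {}
    for user, tower in obs:
        users_per_tower.setdefault(tower, set()).add(user)
    visits = Counter(tower for _, tower in obs)
    # Deductive-disclosure guard: drop towers seen by fewer than K_MIN people.
    return {t: n for t, n in visits.items() if len(users_per_tower[t]) >= K_MIN}

print(aggregate(observations))                     # tower_C suppressed (only carol)
print(aggregate(observations, opted_out={"bob"}))  # opt-out further thins the counts
```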

Despite the necessity of rigorous data-sharing protocols, Eagle also considers that behavioral data can be treated as a form of intellectual property.

The behavioral IP of an individual should be owned by that individual, and licensed to third-parties for a fee if desired. The behavioral IP of a society should be considered as a valuable public good.

This certainly opens new interrogations on the applicability of this proposal (e.g. who determines the fees? who has access for free and who does not? how to finance the efforts that transform data into a valuable public good? are the developed algorithms also a public good?). In addition to discussing the IP of the data, I often argue for the necessity of applying transparent processes in which everybody is aware of the mechanisms that generate the information (see my World Information City Doggie Bag).

On that very aspect of data processes (and their transparency), I was intrigued by the intervention of Trevor Hughes (Executive Director, International Association of Privacy Professionals) on “Data Environmentalism”, which argues that we should focus less on “notice and choice” (fair information practices) and actually put our efforts into securing data, data flows and legitimate use, to the point of developing indicators of trust and transparency.

Another aspect of the exploitation of personal electronic information lies around the dream of the perfect technology and the myth of perfect power (see Stephen Graham at World Information City). It is one of the themes that Aguiton et al. cover in their contribution Living Maps: New Data, New Uses, New Problems, quoting Bruno Latour in Paris: Invisible City:

Megalomaniacs confuse the map and the territory and think they can dominate all of Paris just because they do, indeed, have all of Paris before their eyes. Paranoiacs confuse the territory and the map and think they are dominated, observed, watched, just because a blind person absent-mindedly looks at some obscure signs in a four-by-eight meter room in a secret place.

On the application front per se, it is well worth checking the recent research of Skyhook Wireless on their own data (Aggregated Location Requests) to perform time/space-based analysis, frequency/phase domain extraction and baseline/anomaly detection.
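For a rough idea of what such baseline/anomaly detection can look like, here is a minimal sketch on synthetic counts (the method and the names are my assumptions, not Skyhook's actual pipeline): it builds a per-hour-of-day baseline from the other days and flags the hours that deviate strongly from it.

```python
# Minimal baseline/anomaly detection sketch on hourly location-request counts.

import statistics

# Hypothetical hourly location-request counts for one area over a week.
hourly_counts = {(day, hour): 100 + 40 * (9 <= hour <= 18)
                 for day in range(7) for hour in range(24)}
hourly_counts[(6, 15)] = 420  # injected spike, e.g. a street event

def detect_anomalies(counts, threshold=3.0):
    """Flag (day, hour) cells far from the baseline of the same hour on other days."""
    by_hour = {}
    for (day, hour), n in counts.items():
        by_hour.setdefault(hour, []).append((day, n))
    anomalies = []
    for (day, hour), n in counts.items():
        others = [v for d, v in by_hour[hour] if d != day]   # leave-one-out baseline
        baseline = statistics.mean(others)
        spread = statistics.pstdev(others) or 1.0            # guard against zero spread
        if abs(n - baseline) / spread > threshold:
            anomalies.append(((day, hour), n))
    return anomalies

print(detect_anomalies(hourly_counts))  # -> [((6, 15), 420)]
```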


Talk at Lift@Home in Geneva

Posted: November 10th, 2009 | No Comments »

Yesterday, I delivered my last formal talk of the year at the Lift@home session on “Urban informatics / Les nouveaux paysages numériques” (“the new digital landscapes”), organized by Nicolas Nova in the Lift Conference premises in Geneva. This event was part of the urban informatics workshop series Nicolas and I have been running. I played the role of the utilitarian to engage the audience on the potential benefits of exploiting the logs of digital activities in our contemporary cities. My established spiel was enhanced with some insights from a recent study of crowd dynamics in the Puerta del Angel/Rambla area in Barcelona. As usual, the slides of “L’analyse urbaine à partir des activités numériques” (“Urban analysis from digital activities”) are online for your downloading pleasure.

It was a pleasure to finally tag-team with Boris Beaude from EPFL, who brought his geographer’s reading of the notion of digital spaces and the maps they entail (read “Internet, un lieu du Monde” in the book L’invention du Monde, and see his courses Enjeux politiques de la géographie at Sciences Po and Théorie de l’espace at EPFL). His insights help raise the kind of reflexive awareness needed to reduce the effect of map designers’ personality/background on what is finally produced (see his recent paper Crime Mapping, ou le réductionnisme bien intentionné). He delivered a compelling argument on the reductionism of crime map visualizations, highlighting the classic misleading error of calculating the density of a phenomenon from the density of residents. Furthermore, these representations rely on citizens’ declarations, while it is well known that the most dangerous areas of a city are those where there is a fear of reporting crimes. Among other issues, this calls attention to the lack of critical thinking on “what does this information inform us about?” and on who is responsible for the mishandling and misrepresentation of the data.
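A minimal worked example, with invented numbers, of the normalization error Boris points at: the same incident counts tell opposite stories depending on whether they are divided by the resident population or by the population actually present in the area.

```python
# Invented figures: normalizing crime counts by residents vs by ambient population.

areas = {
    # area: (reported incidents, residents, estimated ambient population)
    "city_centre": (300, 10_000, 80_000),  # few residents, huge daytime/nightlife crowd
    "suburb":      (120, 40_000, 45_000),
}

for name, (incidents, residents, ambient) in areas.items():
    per_resident = 1000 * incidents / residents
    per_ambient = 1000 * incidents / ambient
    print(f"{name:12s} raw={incidents:4d} "
          f"per 1000 residents={per_resident:5.1f} "
          f"per 1000 present={per_ambient:5.2f}")

# Normalized by residents, the centre looks ten times "worse" (30 vs 3 incidents per
# 1000 residents); normalized by the population actually present, the two areas are
# nearly comparable (3.75 vs 2.67) -- and both figures still inherit the reporting
# bias mentioned above.
```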

Lift Workshop @ Lift office
Boris Beaude at the improvised cabaret in the Lift Conference premises

The third speaker, Pascal Wattiaux, discussed the role of technologies in the production of the Olympic Games. Each of these projects runs for at least 10 years, with each candidacy strongly embedded into the city’s planning, compressing 30 years of development into roughly 7 years, with no escape and a constant acceleration and organizational ramp-up (growing from 350 to 150,000 people in a few years). The games experience goes from the preparation of the games, through the production of the games, to the legacy of the games. It must be in sync with the expectations of the various stakeholders (public, athletes, workforce, sponsors, municipal, regional and state governments, etc.).

In that unique context, technologies constantly offer opportunities for both revenue and cost savings. However, with the constant evolution of technologies, it is hard to build “best practices”; therefore organizers report on “best experiences”.

Nowadays, there are opportunities in the analysis of the spatial dynamics of the organization, which could improve spectator management (the stadiums need to be full, it is a question of image), reduce the number of volunteer staff, or organize emergency operations with specific language competences.


How do we Avoid a Digital Dump in our Backseat?

Posted: November 4th, 2009 | No Comments »

So yes, cities are all about difficulty. Most of the urban systems designs and scenarios out there are about reducing their complexity by adding layers of technologies and information (see The “Quants”, their Normalizations and their Abstractions). But we will never design complication out of the world, and certainly not with all the kinds of instruments, practices and objects we develop. I tried to make that case in Embracing the Real World’s Messiness and in Sliding Friction: The Harmonious Jungle of Contemporary Cities. In their recent Situated Technologies pamphlet, Julian Bleecker and Nicolas Nova argue this point quite nicely:

The idea of a ubiquitously computing urban setting where everything functions perfectly won’t work. We don’t even have to give the technical reasons why, we can rely on the history of failures as one often does, the things that are too often forgotten about but provide the richest set of materials for design and, despite this, are almost never considered.

At best, the difficulties will shift. So is the design of urban systems about reducing complexity or about making cities less intimidating? In an inspiring “melt up”, Adam Greenfield went off script and argued for the latter. Urban systems are about giving more visibility to engage, to have citizens a little bit better prepared to understand the complexity and, at best, participate in the conversation that is the city. The design goals are both very humble and yet extremely challenging to reach. They will require us to unlearn the way we do design. That is, user-centered design is not enough and we have to go out and get dirty (e.g. practicing urban scouting, confronting practices or, as Adam would say, “Go beyond the safety nets of the practices we use”).

But even with perfect designs, the city has no guarantee of perfect outcomes: citizens do not appropriate the resources in an equal manner, some take advantage of the systems and have the ability to break the rules. So, as asked by an attendee at Adam’s talk, “Can we avoid a digital dump in our backseat?“. I have no real answer to that (great!) question, but it seriously questions the way we contextualize, design and plan the integration of urban systems into contemporary urban environments. In the series of workshops on urban informatics that Lift lab leads, we often ask participants to “criticize” their scenarios/interventions with considerations of the (basic) implications for the different stakeholders (who “wins” and who “loses”?). One outcome of our recent workshop in Cornellà proved that it takes a long effort to go beyond the pretty and inclusive designs of urban systems, or the scenarios that discard the nasty elements that are an integral part of urban life. It was not until the very end of the workshop that conflicting debates emerged.

Why do I blog this: near-future networked and digital cities are also about: Brussels’ digital garbage collectors going on strike, an alarming rate of digital syllogomania among the registered citizens of São Paulo, Google fined by the EU for their open data spillage of Amsterdam, and Tokyo’s mayor having to resign over sensor-data smuggling. The contemporary Paris Ideal of Bicycle-Sharing Meeting Reality is yet only a weak signal.

ouch!
Even with a perfect design, the city offers no guarantee of perfect outcomes


The "Quants", their Normalizations and their Abstractions

Posted: November 3rd, 2009 | No Comments »

In the latest Situated Technologies pamphlet, “A synchronicity: Design Fictions for Asynchronous Urban Computing“, Julian Bleecker and Nicolas Nova discuss the notion of “real-time cities”, shifting the discussion away from the hygienist model of efficiency towards unscripting the unexpected and cultivating the unusual. In a world of “open data initiatives” and “smart cities”, I have a lot of sympathy for their discourse, which considers that computing in an urban setting should not be about data and algorithms, but about people and their activities. They critique the hold of the quants on the representation of the city, arguing against the assumption that “our relationship to the spatial environment should only be based on statistical analysis or mediated by computations”:

One characteristic of these sorts of mass city visualizations is that they operate at an abstract level and normalize the individual, averaging out all the atomic units—the people—of contemporary cities. Another dimension that is lost is the history and culture, which are not part of these representations.

Of course, the “quant” failure in the financial markets makes this idea of our reliance on spreadsheets, quantification and computation even more poignant.

And these numbers guys on Wall Street—the “quants”—were going berserk with their numbers. They were creating such byzantine computational number-crunching algorithms that no one knew how it all worked. The quants, with their theoretical mathematics PhDs, had so divorced themselves with their abstracting tier of calculation that it all was destined to collapse.

In my thesis, I intended to downplay the role of data and the exclusive reliance on data scientists, arguing for mixed quantitative and qualitative approaches to capture urban dynamics and support the design of urban services (see The Other Point of View). It also implies integrating other practices to question the hold of engineers, accountants and architects on the design of our cities. Julian and Nicolas put it in the following terms:

I suppose this is where designers could participate if they sat at the same table as the engineers and accountants and brought additional sensibilities that can vector interpretations and semantics differently, away from the up-and-to-the right graphs of instrumental progression to bigger, faster and cheaper.