Contributions of Geoinformation and Geovisualization to the Concept of the Digital City

Posted: March 5th, 2009 | No Comments »

On Tuesday, my work was presented in two different venues: in Hamburg at the GeoViz workshop on the Contribution of Geovisualization to the Concept of the Digital City (thanks to Ayman Moghnieh for replacing me at the last minute!) and in Barcelona at Globalgeo in a forum on geoinformation and participation for sustainability (slides of the presentation). I stayed in Barcelona to meet some of the founders of the Vespucci Initiative and understand how my work fits into the current trends and challenges of geographic information science. Despite the acknowledgment of volunteer-generated information, it is still a very techno-driven field, one that starts with the technological capabilities to collect, process and visualize data and, a few years later, ends up holding conference sessions titled “So what?”. No doubt this is one way that innovation happens, but I am always fascinated by the GIS community’s quest for detail (e.g. the perfect 3D models of a city) without much human perspective on “how good is good enough” and “for whom and what for”. Current concerns are rather about better organizing the existing geospatial data (following the EU INSPIRE Directive), migrating them to Internet-based environments, and making them ready for spatial analysis and visualization. In that respect, spatial data should be considered as a medium, not a message. So far the community has overlooked the maintenance of the medium, for instance by failing to archive data (e.g. we lack digital data from 10 years ago, while we still have the analog data). In addition, there is a paradigm shift driving spatial analysis to show what is happening, not what is, implying the development of 4D spatial analysis and visualization systems.

The move towards this Next Generation Digital Earth goes through the embrace of bottom-up spatial data infrastructures. Michael Goodchild compared the differences in quality, trust, timeliness and risks of volunteer-generated information vs. authoritative information. He particularly highlighted the contrast between top-down (authoritative) data, where inaccuracies are guaranteed, and bottom-up (asserted) data, which tend to be more accurate in popular places (similar to Wikipedia). Authoritative information must be verified through a process that can be slow. On the other hand, asserted information is generated in a timely manner, and people are willing to accept the false positives it produces (an angle that I could have explored in CatchBob). Through several examples (e.g. fire management in Santa Barbara), Goodchild showed the necessity of mixing accurate top-down information with timely volunteer-generated information (such as in Zagat). My work in New York certainly goes in that direction.