Exploring Local
Mike Dobson of TeleMapics on Local Search and All Things Geospatial

Gate Keepers, Digital Gazetteers and Folksonomies – Part 6.

March 24th, 2009 by MDob

Last time we ended up looking at Muki Haklay’s analysis of OpenStreetMap and how few people were actually responsible for creating OSM’s map of England. I introduced Dr. Haklay’s work on OSM as a possible indicator of the dynamics behind Google Maps. I have no way of knowing whether or not Haklay’s work is a useful surrogate in this situation, but my intuition tells me that it is something to consider. If you are OK with that, you should continue to read. If you want me to do the research, then call Google and ask them to let me do the research. (By the way, I have not asked Google, as this is just a passing interest.)

One of the arguments often cited by supporters of UGC in map database compilation is a slight twist on Linus’s Law – that, given enough eyeballs, all bugs are shallow. In the case of map compilation, a rephrasing of the presumed intent of the Law would be something like “With more eyes, map data errors are discoverable.”

Based on the results of Muki Haklay’s examination of OSM’s work creating its UK database (see our last blog below), it is difficult to assume that the effort that went into developing the UK database is a good example of the benefits of numerousness associated with Linus Torvalds. It seems that more eyes were not a factor in creating the OSM database and that the expertise in the database creation is held by only a few critical volunteers. Although I have no objective basis for this, I suspect that the number of volunteers who actively contribute to Google’s Map Maker project follows the same trend that Haklay discovered with OSM. There are likely a few who contribute a great deal, and a larger group who contribute a little at the start and then trail off almost immediately.

In turn, this apparent limitation in compilation/sourcing assets raises the question as to whether or not volunteers can be incented over time to create sustainable spatial databases. In fact, I think that this is the biggest problem facing Google’s map making efforts and is why Google may buy TomTom (you really didn’t think that Microsoft was suing TomTom purely over patent infringement, did you?).

Companies that are focused on creating map databases engineer spatial data quality into their data compilation systems. Accuracy of position (resolution), accuracy of attribution (logical consistency), completeness (both of spatial coverage and attributes), temporal relevance and metadata are just some of the quality dimensions mandated by most companies interested in map compilation. (Speaking of metadata – Navteq has metadata, TeleAtlas has metadata, but does Google, and does it need – or perhaps want – it?)
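To make these quality dimensions concrete, here is a minimal sketch of how such checks might be applied to a single crowd-sourced road record before it is accepted into a production database. The field names, thresholds, and categories are illustrative assumptions on my part, not any vendor’s actual schema or rules:

```python
from datetime import date

# Illustrative quality gate for one map record. All field names and
# thresholds below are hypothetical, chosen only to mirror the quality
# dimensions discussed above (position, consistency, completeness,
# temporal relevance, metadata).

REQUIRED_ATTRIBUTES = {"name", "road_class", "one_way"}   # completeness
VALID_ROAD_CLASSES = {"motorway", "arterial", "local"}    # logical consistency
MAX_AGE_DAYS = 365                                        # temporal relevance

def quality_issues(record, today=date(2009, 3, 24)):
    """Return a list of quality problems found in a candidate map record."""
    issues = []

    # Accuracy of position: coordinates must exist and fall in valid ranges.
    lon, lat = record.get("lon"), record.get("lat")
    if lon is None or lat is None or not (-180 <= lon <= 180 and -90 <= lat <= 90):
        issues.append("positional accuracy: coordinates missing or out of range")

    # Completeness: all required attributes must be present.
    missing = REQUIRED_ATTRIBUTES - record.keys()
    if missing:
        issues.append("completeness: missing attributes %s" % sorted(missing))

    # Logical consistency: attribute values must come from a controlled vocabulary.
    if record.get("road_class") not in VALID_ROAD_CLASSES:
        issues.append("logical consistency: unknown road_class")

    # Temporal relevance: the record must have been surveyed recently enough.
    surveyed = record.get("surveyed")
    if surveyed is None or (today - surveyed).days > MAX_AGE_DAYS:
        issues.append("temporal relevance: survey date missing or stale")

    # Metadata: source/lineage must be recorded.
    if not record.get("metadata"):
        issues.append("metadata: source/lineage not recorded")

    return issues
```

A record passes only when the returned list is empty; anything else would be routed back for editorial review rather than published. This is, of course, a toy version of what a professional compilation pipeline does continuously and at scale.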

It is this effort to actively harmonize the spatial data compiled that distinguishes database building efforts. I think this is a crucial issue when comparing professional and crowd-sourced compilation. Who directs crowd-sourced data from an editorial/functionality perspective? Who sets standards for crowd-sourced data? Who quality controls crowd-sourced data (e.g. when is Map Maker data good enough for Google to use it in Google Maps and what are the barriers it must pass)? What external guidance exists in crowd-sourced compilation systems? It is questions like these that Google now faces on a daily basis and that will eventually scream for the company’s attention. What might Google’s strategic responses be?

1. Buy a mapping company. (Did you notice that TomTom’s market cap is now €443.94 million? Yes, this is the company that bought TeleAtlas for €2.9 billion. There just might be a deal here for a savvy buyer. Buy the debt, take the company into bankruptcy, recapitalize, and you own TeleAtlas at a significant discount. Of course, the EU will make doing all of this very difficult.)

2. Professionalize part of its map collection strategy by augmenting Map Maker coverage: adapt and formalize the Street View collection process, transforming it into a map compilation process.

3. Work with national mapping agencies to make authorized data available for free or for a low fee. Google has a history of developing strategies that unlock markets (wireless spectrum, for example) and then dropping its initiative when and if other parties (who are the natural players – like wireless carriers) decide to play in the arena.
a. If Google’s geospatial effort unlocks the stranglehold that many national mapping agencies have on spatial data (it’s hard to charge for it when others are giving it away), then Google may back away from this activity at some future point in time.
b. It will be interesting to see the reactions of national mapping agencies to Google’s efforts, or the efforts of OpenStreetMap to achieve much the same thing. Of course, Google adds the distribution channel that OSM lacks and could, theoretically, make collecting these data a profitable business. Note that the profits will be made on advertising, not on the sale of map data.

4. Provide their UGC data (Map Maker) to their main mapping partner (currently TeleAtlas), who will review it, edit it and use it to the advantage of both parties.

a. There is a coverage problem related to areas of interest to Google but not covered by TeleAtlas. Google could provide some of these data to TeleAtlas and other data to other suppliers (e.g. the ANDs http://www.and.com/ of the world) for some concession.

5. Give the Map Maker data to OSM, while retaining a right to use/license the current and future versions of the databases.

6. Transfer data to an internal Google group whose job it is to maintain these data at a quality level sufficient to meet the needs of Google’s mapping and advertising systems. Yikes – Google becomes a real mapping company!

The likeliest path for Google and the mapping industry is one that combines professional mapping and UGC or crowd-sourced compilation. In fact, this is where the entire industry is moving – if not now, then soon – even though some of the major players seem not to understand this dynamic.

Next time, let’s discuss this hybrid approach to compilation and then weave back to street maps, reference maps and gazetteers – and the idea that started me on this road almost two months ago.

Neogeography and UGC – is this part of the prescription for success in map compilation?


Posted in Authority and mapping, Data Sources, geographical gazetteers, Google Map Maker, map updating, Microsoft, Navteq, TeleAtlas, TomTom, User Generated Content
