Exploring Local
Mike Dobson of TeleMapics on Local Search and All Things Geospatial

Urban Mapping’s Data Products

August 19th, 2009 by MDob

Urban Mapping, founded in 2006, is based in San Francisco, has a staff of approximately 10, and is, according to its CEO Ian White, profitable. Although I had known about Urban Mapping for some time, my interest in the company was heightened when I read about their turnkey solution for pedestrian navigation. Recently, I spoke with Ian about his business and the company’s goals for the future.

For those of you who do not know Urban Mapping, it is a company focused on collecting and distributing high-quality, difficult-to-collect geographic data. Ian is a firm believer that local knowledge does not scale and that this makes the local search problem hard to solve. When the “perishability” of local data is added to the equation, it suggests that new techniques are required for both the collection and distribution of these data if the industry is ever to gain the true advantages of “local search”.

Urban Mapping appears focused on Neighborhood Name databases (containing neighborhood names, relationships, postal codes, etc. for the United States, Canada and Europe), Mass Transit system route and service information, Parking availability, and a product that the Company calls GeoMods, a toolset for producing “geographic modifiers” in the form of spatial keywords that define a locality and can be used for effective geotargeting, particularly in online advertising. As noted above, the Company has recently added a Routeserver product that supports multi-modal routing.
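
To make the idea of a “geographic modifier” concrete, here is a minimal sketch in Python of the kind of neighborhood record and keyword expansion such a product might involve. The field names, the sample postal codes and the expansion logic are my own assumptions for illustration, not Urban Mapping’s actual schema or algorithm.

    # Illustrative sketch only; not Urban Mapping's actual data model.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Neighborhood:
        name: str                                          # e.g. "SoMa"
        city: str                                          # parent locality
        state: str
        postal_codes: List[str] = field(default_factory=list)
        aliases: List[str] = field(default_factory=list)   # alternate/colloquial names

    def geo_modifiers(hood: Neighborhood) -> List[str]:
        # Expand one neighborhood record into spatial keywords that an ad
        # platform could match against queries such as "pizza soma".
        mods = [hood.name.lower()] + [a.lower() for a in hood.aliases]
        mods += [f"{m} {hood.city.lower()}" for m in list(mods)]
        return sorted(set(mods))

    soma = Neighborhood("SoMa", "San Francisco", "CA",
                        postal_codes=["94103", "94107"],
                        aliases=["South of Market"])
    print(geo_modifiers(soma))

The point is simply that a query containing “soma” or “south of market san francisco” can be tied back to a well-defined locality and its parent places.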

While every company in the spatial data collection business is also in the distribution business, it occurs to me that many of these companies do not really understand the dynamic nature and cut-throat realities of the distribution world. Urban Mapping, however, claims that it spent a great deal of time thinking through both the collection and distribution aspects of a business based on local geographical data.

Ian believes that Urban Mapping is very good at a data collection process that he describes as a “Tightly-coupled end-to-end recruiting, sourcing, maintenance, reporting and invoicing process to leverage field-research.” In-field research is a critical component of their sourcing, as is the use of remote sensing sources, crawls and scrapes, agency cooperation (and the occasional Freedom of Information Act lawsuits for those who are not so cooperative with public data), as well as a host of automated tools for change detection that are used to verify and enhance basic data collection.

It is my understanding that Urban Mapping fields ad hoc research teams (many are stringers) who collect data on an as-needed basis. While the company would prefer to work with data files, it realizes that databases for some activities are simply not always available and that the data sometimes must be collected by feet on the street. For example, Urban Mapping’s Parking data includes parking lots, parking structures, entrances, exits, lot geometry, attributes (hours of operation, payment methods, valet, capacity, etc.) as well as rate tables for special event parking, oversized vehicle parking, early bird rates, etc.
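
To illustrate how rich such parking records can be, here is a hypothetical sketch in Python of a facility record with a simple tiered rate lookup. The field names, the example garage and the rate logic are assumptions of mine, not Urban Mapping’s data model.

    # Hypothetical parking facility record; field names and rates are invented.
    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class ParkingFacility:
        name: str
        lat: float
        lon: float
        capacity: int
        payment_methods: List[str]
        hours: str                              # e.g. "24/7" or "Mon-Sun 06:00-24:00"
        rate_table: List[Tuple[float, float]]   # (max hours, price) tiers, ascending

        def price_for(self, hours_parked: float) -> float:
            # Return the price of the first tier that covers the stay;
            # the last tier acts as the daily maximum.
            for max_hours, price in self.rate_table:
                if hours_parked <= max_hours:
                    return price
            return self.rate_table[-1][1]

    garage = ParkingFacility(
        name="Example Downtown Garage", lat=37.78, lon=-122.41,
        capacity=500, payment_methods=["cash", "credit"], hours="24/7",
        rate_table=[(1, 3.0), (2, 6.0), (4, 12.0), (24, 25.0)],
    )
    print(garage.price_for(3.5))   # 12.0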

The Urban Mapping data pipeline is a complex process that uses an impressive array of automated tools for quality assurance, data validation and the timing of updates. These processes integrate a number of activities, leading to the capability of exporting data that the company believes to be of higher quality than can be found elsewhere in the industry. On the distribution side of the equation, the company has developed on-demand, 24/7 hosted geo-services, available via APIs, that access the products described above. In essence, reverse geocoding to neighborhoods, mass transit routing, and search for parking and rate calculations can be supplied directly to clients so that they can take advantage of Urban Mapping’s most up-to-date data. In addition, custom-rendered map tile baking and hosting are provided by the company.
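
As a rough sketch of what consuming such a hosted service might look like from the client side, consider the Python example below. The endpoint URL, parameter names and response shape are placeholders I invented for illustration; they are not Urban Mapping’s actual API.

    # Placeholder client for a hosted reverse-geocoding-to-neighborhood service.
    import json
    import urllib.parse
    import urllib.request

    def neighborhoods_for(lat: float, lon: float, api_key: str) -> list:
        # Ask a hosted geo-service which neighborhood(s) contain a point.
        params = urllib.parse.urlencode({
            "lat": lat, "lon": lon, "format": "json", "apikey": api_key,
        })
        url = "https://geodata.example.com/v1/neighborhoods?" + params  # placeholder host
        with urllib.request.urlopen(url, timeout=10) as resp:
            payload = json.load(resp)
        # Assumed response shape: {"neighborhoods": [{"name": ..., "city": ...}, ...]}
        return payload.get("neighborhoods", [])

    # Example call (works only against a real service):
    # print(neighborhoods_for(37.7599, -122.4148, "MY_KEY"))

The appeal of this model is that the client never touches the underlying database; the provider can keep improving and refreshing the data behind the API.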

In a nutshell, Urban Mapping’s strategy is to amass a high-quality inventory of difficult-to-collect spatial data. In turn, the Company provides data APIs to assist clients in the use of these data, as well as visualization APIs purpose-built to give clients a faster time to market.

I noted above that Urban Mapping is profitable and I have no reason to doubt Ian White’s statement on this topic. However, I suspect that Ian and his team must spend most of their day feeling like salmon attempting to reach their spawning grounds, hundreds of miles upstream and at a much higher elevation than they experienced at the start. Although Urban Mapping has an impressive client list, including such gems as Yahoo, Google, Microsoft, MapQuest and YellowPages.com (to name just a few), I suspect it is unlikely that any of these clients are willing to pay a reasonable price for the data quality that Urban Mapping produces.

From my days as the Chief Cartographer at Rand McNally, followed by a stint as CTO at go2 Systems (an early player in mobile LBS) and now as Principal at TeleMapics, I have found that end-users control the pricing of data through their purchasing preferences. Users of PNDs are unlikely to have a preference for map data (i.e., between NAVTEQ and TeleAtlas), but may have a preference between TomTom and Garmin devices. Most likely, however, they will have a preference for functionality, but be unable to appreciate the differences between the maps from one system to another. Even those who do would likely not be willing to pay an additional $20 for better data, because they have no method to equate spatial data and value. To customers, data is data; it just comes with the thing they are buying. Or, if they are online, the spatial data just comes for free, as does the entire mapping and routing service.

Although product companies can influence the pricing of products based on brand power, data companies have little leeway in markets where there is not a monopoly or duopoly. For instance, NAVTEQ and TeleAtlas, although their market strategies often seemed to suggest that they were unaware they were a duopoly, have had the ability to influence pricing in the PND market, which needed the data they produced in order to have a product that would be useful in the market. PNDs with just a database of Austria or Idaho, for instance, would likely not be big sellers. In North America, a PND manufacturer that hoped to be successful needed navigation data for the complete road network in the United States and Canada. Similarly, PNDs manufactured for markets in Europe needed to cover all of Western Europe, and now coverage of Eastern Europe is in demand. NAVTEQ and TeleAtlas remain the sole sources of comprehensive data for the products described above and, if careful about their bidding, can set the market price for their data at a more realistic value, one that reflects the fact that these data are a scarce resource and difficult to collect for the specific use of navigation.

However, the advances of OpenStreetMap and other ventures, such as Google’s Map Maker and Street View, just might change the complexion of the market by creating new competition. In essence, where there is significant competition among spatial data producers, prices for spatial data will plunge. The problem with this trend is that increased competition will not decrease the cost of collecting data to a high standard of quality. What may happen is that boutique-type data collection firms will find the going difficult and data quality will plummet if they cannot defend the unique value of the data they collect.

While I have an understanding of the lengths to which Urban Mapping and others go to collect parking lot data, neighborhood name data, or connection and ride time data for mass transit systems, I also know that the end user does not know how to value these data. After all, they get it for free online; how can it be very valuable? Indeed, I suspect the Googles and Microsofts of the world also do not know, or perhaps do not want to know, how to value these data, which, in the long run, will make it difficult for Urban Mapping to continue its market success.

I have written extensively about User Generated Content in the form of crowd-sourced mapping, and these developments cloud the future success of collectors and distributors of spatial data. I think it likely that crowd sourcing will eventually change the market for spatial data, especially its economics. Unfortunately, the trends are not yet established. In addition, the “fitness for use” issue and the “permissions” accompanying crowd-sourced data also remain unclear at this time.

However, Ian and his Urban Mapping team are very determined. Perhaps they will discover new ways to further decrease collection costs, increase data quality and leverage their distribution channel in such a way that they can create ever increasing stores of valuable data.



Posted in crowdsourced map data, Data Sources, Google, Local Search, local search advertising, map updating, Navteq, Nokia, TeleAtlas, TomTom

