Exploring Local
Mike Dobson of TeleMapics on Local Search and All Things Geospatial

Better Maps Through Local Thinking Part II

July 25th, 2010 by MDob

I started this topic, “Better Maps Through Local Thinking,” last time and will finish it next time. Bear with me today, as exploring it requires peeling the onion so that we have common ground for what follows.

Recently, while driving to San Diego for the ESRI UC, I was listening to the news when a story caught my attention. It seems that a reporter who had bought one of the new Android-based 4G phones from Sprint was enraged that he was being charged an extra $10 per month for 4G service, even though 4G service wasn’t available in the Los Angeles area and might not be until the end of the year. That led me to think that almost everyone who buys a PND or pays a monthly fee for mobile navigation services has much the same problem.

For example, most PNDs and online navigation services charge you for national map coverage, even if you never use your device anywhere outside of your hometown. I suspect the majority of PNDs are rarely used beyond their home areas, or perhaps an adjacent area encompassing a county or two surrounding the home county. With mobile phones, the use area is larger. Indeed, business users likely use mobile navigation services in a variety of locations, many of them spatially separated from their home base. Still, I doubt that the average user, whether of a PND or of a mobile phone navigation service, creates routes in more than two percent of the total area of the United States over the lifespan of the device or subscription.
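To put that two percent figure in rough perspective, here is a quick back-of-the-envelope calculation of my own, assuming a total US area of roughly 3.8 million square miles:

```python
# Back-of-the-envelope illustration (approximate figures, for scale only):
# how much territory is two percent of the United States?
US_TOTAL_AREA_SQ_MI = 3_800_000   # approximate total area of the US
usage_share = 0.02                # the "two percent" figure above

used_area = US_TOTAL_AREA_SQ_MI * usage_share
print(f"~{used_area:,.0f} square miles")
# ~76,000 sq mi, roughly the area of a single state such as Nebraska,
# set against the coast-to-coast coverage the user is paying for.
```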

The conclusion here is that each of us helps reimburse the companies providing navigation services or devices for offering comprehensive map coverage, even though we rarely use that coverage. However, the market has convinced us that we are not paying for the original map data. Instead, we are paying for the service, and since the service is offered everywhere, we can use it everywhere we happen to be. (Or, as the immortal Buckaroo Banzai put it, “Wherever you go, well, there you are.”) We might conclude that the cost of the product or service reflects its ubiquity and our potential, but perhaps unlikely, use pattern for the service.

In the world of paper maps, the model was slightly different. Yep, there was only one product type, the paper map, and usually there were only two piles of maps in stores. One pile included national highway maps, state maps and national road atlases, while the other consisted of city maps and city map books for various locations around the globe.

If you were going to travel around the US, you bought a US road atlas from a national provider of maps such as Rand McNally or the AAA. If you were going to Los Angeles, you bought an LA street map or an LA-and-vicinity map from a national provider such as Gousha or American Map. If you were a resident of Los Angeles, however, you bought a Thomas Guide for Los Angeles County, or a combo Thomas Guide for LA/Orange counties, that featured large-scale, up-to-date, comprehensive coverage of the local area. In Washington DC, you bought a map book from ADC. In Austin, Texas, it was Mighty Map; in Dallas-Fort Worth, Mapsco; in Houston, Key Maps; and in Boston, Arrow Maps. Do I need to mention that, in the past, the most successful map products in local markets were compiled locally?

Somewhere along the way, somebody convinced us to buy map coverage of places we would never visit. How cool is that? To be fair, the new map databases had attributes that allowed routing and geographical analysis, problems that could not be addressed in any meaningful way with paper maps. On the other hand, local companies could solve this problem without much difficulty today.

Interesting stuff, but it is the issue of data quality that should be our focus. Is it possible that local users of map data may not be well served by national or international suppliers who try to optimize coverage, currentness and accuracy across significant extents of the globe? Is it possible that local users are not well served by data based on “global” perspectives that do not embrace and reflect the true nature of place and localities?

In order to get at this issue, let’s explore how a market in which local maps were produced by local sources changed into one in which most map data used in local markets is produced by national or international sources.

In the mid-1980s, the precursor companies to today’s NAVTEQ and Tele Atlas began building commercial, street-level databases aimed at the then-nascent market for navigation. NAVTEQ’s DCAs are an early example of their approach of segmenting the national market into chunkable areas that reflected markets likely to be of interest to automobile manufacturers. However, from Detroit’s perspective it was clear that the market was not going to take off until there was a national product that would allow the car manufacturers to install and sell a navigation system that could be used everywhere. In other words, as some learned the hard way, it was a mistake to produce an in-car system selling for over $3,000, running on a tape cassette, and then say that it only worked in Detroit now and maybe Chicago next year. National data was a must.

Around the same time, people were playing with networks: ARPANET, TCP/IP and, slightly later, NSFnet. In the 90s, we got to the stage of Prodigy, AOL, Netscape and Internet Explorer, and all of a sudden there was a perceived need for products with a national scope. In essence, the marketing powers in the early days of the Internet functioned as national distributors, and these national distributors wanted national products that would play to their national audience. Three products that were in demand, but which no one was quite sure how to produce, were mapping, routing (navigation) and business listings integrated with mapping and routing.

I doubt many people today realize that, until the rise of the Internet, there was not a compelling need for a national business directory or an integrated, national, yellow-page type of system. It was not until somebody decided they would like to provide a yellow-page type of service online that they realized a comprehensive directory of businesses across the United States simply did not exist. Yep, those local yellow page directories worked just fine, because they were used by local audiences. When you needed to know something about a service in another town or another state, you headed to the library, whose librarians conveniently collected every yellow pages book they could get their hands on. Sounds antiquated, doesn’t it? But the point I am making is that national products built on maps, navigation, yellow pages and business directories had their births in the 90s, when a new distribution channel made it possible to advertise a class of businesses that could not afford national coverage on television or radio.

There were two leading choices for solving this supply problem. First, you could try to integrate disparate sources into a seamless national database. Alternatively, you could simply find a provider who built national databases and license their product. Those who tried the blend approach wound up in a world of headaches. Yes, you could always find ways to conflate data, but you could rarely find any satisfaction while trying to understand how every source you found managed to categorize simple things like road function classes or business names in so many different ways. Needless to say, service suppliers decided that they did not want to try to integrate disparate data. Instead, they sought national providers who could give them a database that was advertised as consistent and ubiquitous, and let them get on with the job of serving data and making money.
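To give a flavor of that headache, here is a minimal sketch, in Python, of the attribute-harmonization side of the blend approach; the source names, class codes and mappings are invented for illustration:

```python
# A minimal sketch of the attribute-harmonization half of conflation.
# The source names, class codes and mappings below are hypothetical;
# real vendor schemas disagree in far messier ways than this.

SOURCE_A_CLASSES = {1: "motorway", 2: "arterial", 3: "collector", 4: "local"}
SOURCE_B_CLASSES = {"FWY": "motorway", "MAJ": "arterial",
                    "MIN": "collector", "RES": "local", "ALY": "local"}

def normalize_road_class(source: str, raw_class) -> str:
    """Map a source-specific road class onto one common schema."""
    if source == "A":
        return SOURCE_A_CLASSES.get(raw_class, "unknown")
    if source == "B":
        return SOURCE_B_CLASSES.get(raw_class, "unknown")
    return "unknown"

# Two records that may describe the same street, classed differently.
record_a = {"source": "A", "name": "MAIN ST", "road_class": 2}
record_b = {"source": "B", "name": "Main Street", "road_class": "MAJ"}

for rec in (record_a, record_b):
    print(rec["name"], "->", normalize_road_class(rec["source"], rec["road_class"]))
# Both land on "arterial", but only because someone hand-built the crosswalk,
# and every additional source means building and maintaining another one.
```

Multiply that hand-built crosswalk by every attribute and every source, and the appeal of licensing a single national database becomes obvious.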

In essence, as distribution changed and markets for new services emerged, national players trumped local players. However, were national players able to provide better quality data?

I am sure that the providers of navigable map databases and navigation services regard each and every street as important and that they work as hard as humanly possible to produce highly accurate and up-to-date maps everywhere. Unfortunately, their model does not scale uniformly, because they do not have a large enough field staff to collect accurate data on a timely basis in all of the localities where they need to do so. Instead, these companies suffer from a problem of contention for attention in data collection: they must spread their field research teams over large areas, and this often results in their missing significant changes in streets, roads, highways and other map data and attributes.

To compensate for the lack of adequate field staff and resources, the big three (NAVTEQ, Tele Atlas and Google) spend a lot of time data mining and conflating data originally created for other purposes. While conflation can be a very useful technique, it is often the tool of the data hungry, and your hunger determines what you eat, not what you should eat. Next time, let us take up the rest of the story, including the three-legged stool of map compilation, with an eye to why a local approach, or the distributed local approach of OSM, might be the best.
