Google and Map Updating – Part 1
I was looking around for a good way to start this series, one that would let me provide a little history on map updating and commercial mapping that would help put a context around the next several blogs I have planned. Luckily, someone saved me the trouble – so here we go.
I received an e-mail this morning from Mike Blumenthal about a blog he had written regarding an error he had reported (an address change), which Google corrected within 30 days. I was reading some of the comments on the article by his readers and one caught my eye. The sentiment expressed was that if people have been living without map updating for years, a few weeks isn’t going to make a big difference. The note concluded that if you don’t like the quality of Google maps, you should tell your readers to use Bing, Yahoo or another provider.
Hope Mike’s reader is not in marketing. Perhaps more importantly, the author of the comment seems oblivious to the fact that the strategy behind Google Maps is not focused on producing map views. Instead, Google switched from its previous suppliers of map data to building its own Google-base in an attempt to enhance its ability to find real-world addresses and map their locations correctly. Finding the “correct” location is a key indicator of success for Google, as is creating viable routes to these locations for its online users, or turn-by-turn navigation directions to these locations for users of Android-based smartphones.
While it is true that paper maps were not updated often (perhaps once a year), these maps were not designed to provide, and did not provide, either the routing function or the more demanding turn-by-turn navigation function. If you wanted to use paper maps for routing, you had to learn how to read and use maps and then preplan your route, often drawing on top of the map. If you wanted to use your paper map for a navigation function, you had to have someone who could read maps sit in the passenger seat, correctly orient the map and call out the correct set of maneuvers to reach the destination (although in point of fact, this situation resulted in more divorces than any other cause).
In fact, based on the uses to which maps were put, users of paper maps simply did not demand higher accuracy and more current updates from the map publishers. In turn, paper map publishers did not update their products in a timely manner because the costs of creating, printing and distributing an inventory meant that the inventory had to be sold through in order to make a profit.
You did notice the word “sold” in the last sentence? Yep, consumers actually had to pay for these maps, even if they were out of date. On the other hand, the paper map did not need batteries and the image was persistent, or at least it lasted until the folds wore out, the staples failed, or it blew out your car’s window. The purpose of all of this reminiscing is to indicate that comments about map updating frequency based on presumably outdated media and distribution methods may not be remotely applicable to today’s environment, and this notion has a bearing on why Google created the Google-base and promised to try to correct errors in it within 30 days.
Modern navigable map database suppliers (NAVTEQ and TeleAtlas) have what can be considered a one-item inventory – their Master Database. At present, most database providers distribute copies (sometimes on physical media) of that database to clients who, then, use it to create derivative products wrapped around a core that is the navigable map database.
When Google was using TeleAtlas and NAVTEQ data, it was forced to import data from its suppliers, integrate these data with its mapping platform and then distribute the data on a demand basis to its users. When the data contained errors, Google had to collect the details of the errors and convey them to the supplier, who would then log each error and research it as part of the company’s normal map updating process. Fixing the errors was at the database producer’s discretion. Sometimes, even when fixed, the change may have entered the queue too late to be included in the quarterly release.
Managers at Google Maps were quite vocal about their unhappiness with the quality of the map coverage that the company received from its suppliers. One concern was that the navigable map databases were not up-to-date, as many newer (and some older) land developments were not represented in the databases provided by NAVTEQ or TeleAtlas on as timely a basis as Google required. Second, the data from the map suppliers did not meet the quality levels required by Google for its intended applications, as address ranges, the locations of addresses, the locations of streets, highways and other roads, and Points Of Interest (POIs) appeared to be erroneous or not in their correct spatial position in either a relative or absolute sense. Next, the data were not uniformly comprehensive, as the quality and number of elements used to describe the mapped data varied across the coverage area. The most problematic issue for Google was the length of time the map providers required to research and correct errors based on the details that were supplied to them by Google.
By producing its own navigable map database, Google has removed the “middleman” from their map creation system. By compiling their own map data from independent sources, Google hopes to improve the accuracy of the data they use for mapping, routing and navigation. In addition, Google hopes to gain an ability to enhance, expand and update their navigable map database in much less time than that taken by their former suppliers.
It is my contention that all suppliers of navigable map databases desire to distribute their data in as close to real-time as possible. Doing so allows them to provide their best quality, most up-to-date map data as soon as is practicable. NAVTEQ, for instance, has long desired to deliver its data over live feeds rather than on physical media. Even better, NAVTEQ and TeleAtlas might prefer to deliver live routes, rather than databases, just as Google is now doing. While there are many efficiencies to be gained from delivering routes and the associated map rather than complete databases, the main reason for moving in this direction is to enhance the ability to provide accurate and reliable routing and navigation services: directions (routing) or maneuvers (navigation) that are both legal and possible, and that allow users to locate and travel to the addresses they seek.
In respect to Google, we should reword the statement above into this strategy – to be successful in Local Search advertising, Google needs to deliver the potential customers of its advertisers to the physical location of the advertiser so that a transaction can result. If the advertiser’s advertising dollars do not result in sales because the potential customer cannot find the location of the advertiser’s business, then Google is the partner who will be blamed for this failure. A related factor is that even when no advertising or advertiser is involved in the local search, if users cannot find businesses that were represented in Google’s index and routed to using Google’s map service, they will have less confidence in Google’s capability as an information provider and may consider using other mapping services. While not everyone might be concerned about this potential mismatch, it is the issue that caused Google to dump TeleAtlas and create the Google-base.
It is important to note that in 2008 Google and TeleAtlas signed a five-year agreement for TA to supply a navigable map database to Google. Even though Google is now providing its own map navigation data in the United States, it is still paying TeleAtlas and apparently intends to honor its five-year commitment while paying for data it does not use.
The truth of the map licensing market is that neither NAVTEQ nor TeleAtlas has ever made a “living” licensing their data to online providers of mapping and routing. The reason for this is that the cost of collecting, compiling and updating the data contained in their navigable map databases has required significant spending. NAVTEQ and TeleAtlas were built to service the needs of the in-car navigation market, the one that pre-dated PNDs. Indeed, PNDs and online mapping/routing were not part of the original sales strategies of either company, although both of these market segments became important over time.
In turn, the amount that Google paid for its data licenses pales in comparison to what it must have cost to build the Google-base. In my opinion, building and updating a map database is like borrowing money from the broken-nosed goon in the bad sports coat – the paybacks just never end.
On the other hand, we all know that Google thought through these issues and still believed that it had found a better way to create and update navigable map databases. So, the question becomes, “Is this a case of new technologies causing great firms to fail (as described in Clayton Christensen’s Innovator’s Dilemma), or has Google missed the mark on this one?” Let’s think about that as we begin to discuss the specifics of map updating.
In my last blog, I wrote about the accuracy of the Google-base in my neighborhood, and it appears that Google Maps clearly missed the mark. The comparable map data from NAVTEQ and TeleAtlas were superior to those provided by Google. I suspect this does not come as a surprise to Google, and I believe they have problems similar to the ones I reported almost everywhere. However, the issue of interest to me is “How is Google going to remediate this problem?”
NAVTEQ clearly feels that data mining is a great way to build a database, but also believes that you need to have research teams in the field if you want to build an accurate and reliable database. TeleAtlas also believes in the efficiency of data mining, but believes that probe data (data from GPS units that record the paths users take while driving motor vehicles) and other User Generated Content can help to increase the accuracy of its database while decreasing the expense of field research. And Google? Well, I’ll spend the next couple of blogs diagramming what I think they are doing in map updating. Hopefully, I’ll have some diagrams of how the process works ready for next time.
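To give a flavor of why probe data is attractive, here is a minimal, purely illustrative sketch of the basic idea: individual GPS fixes are noisy, but averaging many traces of the same road segment smooths that noise into a usable centerline estimate. The coordinates and function below are made up for the example; real pipelines involve map-matching, outlier filtering and far more data.

```python
# Illustrative sketch of probe-data averaging (not any vendor's actual
# pipeline). Each trace is a list of (lat, lon) GPS fixes recorded by a
# different driver traveling the same road segment.
from statistics import mean

traces = [
    [(40.7001, -74.0003), (40.7002, -74.0001), (40.7004, -73.9999)],
    [(40.7000, -74.0002), (40.7003, -74.0000), (40.7005, -73.9998)],
    [(40.7002, -74.0004), (40.7001, -74.0002), (40.7003, -74.0000)],
]

def estimate_centerline(traces):
    """Average the i-th fix across all traces to smooth per-unit GPS noise.
    Assumes the traces are already matched to the same road segment and
    sampled at comparable points along it."""
    n_points = min(len(t) for t in traces)
    return [
        (mean(t[i][0] for t in traces), mean(t[i][1] for t in traces))
        for i in range(n_points)
    ]

centerline = estimate_centerline(traces)
for lat, lon in centerline:
    print(f"{lat:.5f}, {lon:.5f}")
```

The more traces that flow in, the better the estimate – which is exactly why a company with millions of phones and navigation units reporting positions has an edge in keeping road geometry current.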
By the way, it appears that several VCs are thinking about funding companies that could become the next NAVTEQ or TeleAtlas. Better point them at my next few blogs before they invest your hard-earned dollars.