Google, Navteq and Map Compilation
And Then There Was… One?
For this blog I had intended to write an in-depth analysis of the map compilation strategies that Google and Navteq use to produce navigation-quality map databases. However, I have decided to focus on the major differences in their approaches, namely field observation and the use of crowdsourcing, since these should be the deciding factors in determining which company is able to produce maps with the desired coverage, currency and accuracy. Unfortunately, there are a couple of additional factors that may have more bearing on the outcome of this competition than any of the technical issues. Of course, I have learned something from journalists and novelists over the years and will conclude this blog by integrating these mitigating factors into my thoughts on which company will dominate the world of maps, mapping and, perhaps, spatial data.
Fans of OSM might be disappointed to learn that I do not consider OSM a contender for this crown, although I do think the organization has a bright future, but perhaps one that is not as map-centric as it is today. Fans of TomTom will wonder why I have not included Tele Atlas as a participant in the “Map Champion of the World” competition. Well, it is my opinion that TomTom has all of the components to be a great competitor, but lacks the financial ability to implement its map compilation strategy in a comprehensive, robust manner.
Google and Navteq
Those of you who have paged through the PowerPoint in my last blog will have noted that I am convinced that the winners in the world of map compilation will be those who wield a hybrid approach to the subject. The hybrid compilation approach that I envision melds 1) traditional compilation techniques (e.g. field work/field observation, data gathering/data mining, use of imagery, conflation, data editing, data QC/QA) as practiced by staff with a professional training in mapping/GIS, or by staff who have received training in map compilation techniques with 2) crowdsourced gathering of map data. Make no mistake, crowdsourced map data is a required ingredient for success, just as much as is the use of established map compilation techniques.
How do Google and Navteq differ on the two factors of greatest importance?
The field work advantage goes to Navteq.
Field observation can be structured in several ways, but most often the strategies result in field work that is sometimes directed in a top-down or conceptually driven manner and at other times directed using a bottom-up or data-driven strategy. My belief is that the best map compilation programs must mix a spoonful of each approach to create the winning elixir. In other words, map compilers need a conceptually driven database update program that reflects their best guess about areas covered by their map base that are of strategic importance to leading customers, areas of unusually fast growth/economic development, or areas of significant change. In addition, most “players” in the map compilation business have an established program to “refresh” all coverages in their database on a cyclical basis, based on the notion that data change over time, that new sources evolve and that all areas need to be reviewed for changes on a known cycle (although these cycles may differ by geography).
Conversely, data compiled in a previous field canvass can be riddled with errors (either from the original map compilation or from data that have changed since the last compilation). Most often these errors are discovered by map users or by the customers of the map database provider; this is an example of data-driven compilation. If the area (such as the boondoggle on I-195 in Providence, Rhode Island) is important enough to customers and users, a field team will usually be deployed to unravel the ground truth when the data cannot be collected and corrected through the use of surrogate sources.
I think it is an underappreciated fact that having customers with a vested interest in the accuracy of a company’s map database is an important part of the success of map compilation efforts. Unsatisfied customers have leverage in the map compilation process, as they are extremely unlikely to accept blame from their users for a map error that is not their fault. They can and often do wave their contractual agreement at the map compilation company to help the data provider understand the “need” to update maps in areas the customer feels contain particularly egregious errors.
While “most” companies listen to their customers and the users of their data, customers, especially ones who contribute significantly to revenue streams, command attention. The more important the customer, say BMW, the likelier it is that the offending change will be corrected sooner rather than later; and if an offending area contains numerous compilation errors, the more likely it is that a field team will be deployed, or tactical field representatives contacted and dispatched, to determine the actual conditions on the ground.
Smaller customers command less attention, and mere users of products, like you and me, are often even further down the “sensitivity” chain. The reason that I make this distinction is that Navteq has a number of customer accounts that are extremely important to it and that can snap the compilation whip when they are unhappy with the current state of the Navteq database. Google, on the other hand, while it has customers, only indirectly provides support to them (e.g. by the location of their business as a POI or as a map of the location in a Google “local search”). Google’s map compilation system is really a data-driven system led by users rather than customers. In other words, the impetus for Google to make changes to its maps is often public embarrassment about map gaffes highlighted by users, rather than the direct requirements of customers who are deploying a map database in an attempt to solve a specific problem such as in-car navigation.
I think this distinction is important because it is likely that many map errors never get communicated to Google by its users. For instance, how many times have you corrected a Google Map? Of the errors that do get communicated to Google, some users attempt to use Google Map Maker to correct those errors, and the corrections are then vetted by Google’s crack editing team based in India, who, of course, have a complete lack of local knowledge on which to evaluate these proposed changes. The Google error reporting process is one that is unlikely to generate a field response. Yes, the Google Street View vehicles do operate on a top-down deployment, but it is unlikely that they are deployed as a data driven response mechanism.
While electronic and image-sensing activities are conducted by both Google and Navteq, the Navteq field teams and contacts are prepped with specific objectives and provided with task lists before they enter the field, and human observation of spatial characteristics is often a significant part of their activity. If Google cannot sense these data, or find them through data mining or imagery classification, it is unlikely that it will ever discover the richness of data that Navteq operators are able to observe in the field. In essence, if Google cannot find the solution using an algorithm, it needs to be found through crowdsourcing or not at all.
We could spend more time discussing why Navteq’s field operations return better data accuracy than those of Google, but that would be beating a dead horse – something for which I am known, but do not feel like doing today.
The crowdsourcing advantage goes to Google.
Oh stop! Navteq can tell me a million times how much crowdsourced data it now receives and that still will not convince me of the benefits that this is bringing the company.
There are two types of crowdsourced data. Active data is contributed by a user who finds an error or omission on the map and is willing to fix it, or at least to contribute information on what they observed on the ground, as opposed to what they saw in the map database. Passive crowdsourcing is the use of GPS signals from PNDs, car navigation units, smartphones or other devices equipped with GPS or capable of tracking other RF signals (such as Wi-Fi). While a significant amount of data can be extracted from these paths, it is mostly related to geometry, position, data flows (for traffic analysis) and the like. Passive data does not provide attributes such as names, addresses, zip codes, contact numbers, roadside furniture (see, I told you once that you needed to read that GDF manual), and other important information. The vast majority of the crowdsourced data received by Navteq and incorporated into its databases is passive. Navteq has very limited inputs of active crowdsourced data and gets very little benefit from the real-world knowledge held by its users and the users of its customers’ products and services.
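To make the distinction concrete, here is a minimal sketch of what a map compiler can and cannot recover from each data type. The record layouts and function names are my own invention for illustration, not any vendor’s actual schema or pipeline.

```python
from dataclasses import dataclass

@dataclass
class PassiveTrace:
    """A GPS breadcrumb trail from a device: (lat, lon, epoch_seconds) tuples."""
    points: list

@dataclass
class ActiveReport:
    """A user-submitted correction carrying human-observed attributes."""
    lat: float
    lon: float
    attributes: dict  # e.g. {"name": ..., "address": ...}

def compile_from_passive(trace: PassiveTrace) -> dict:
    """Everything recoverable from a passive trace: geometry and timing.
    Note the attribute slot stays empty -- no names, addresses, or signage."""
    polyline = [(lat, lon) for lat, lon, _ in trace.points]
    duration = (trace.points[-1][2] - trace.points[0][2]
                if len(trace.points) > 1 else 0)
    return {"polyline": polyline, "duration_s": duration, "attributes": {}}

def compile_from_active(report: ActiveReport) -> dict:
    """An active report contributes the attributes a trace never can."""
    return {"polyline": [(report.lat, report.lon)],
            "duration_s": None,
            "attributes": dict(report.attributes)}
```

A trace tells you where a road is and how fast traffic moves along it; only an active report (or a field visit) can tell you what the road is called or what its address range is.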
Google, on the other hand, receives significant amounts of both active and passive crowdsourced data. Its active crowdsourcing through map corrections and Google Map Maker edits provides the company an enormous benefit, although it has not yet managed to create a system that delivers all the benefits that active crowdsourcing can supply. However, my purpose here is not to expound on a better system, but merely to point out that the advantage in the use of crowdsourcing clearly goes to Google Maps, while Navteq’s efforts in active crowdsourcing are too limited to provide any significant benefit, other than that these data may be used as change-detection beacons to help target its field activities.
Crowdsourcing operations are relatively inexpensive to create and maintain. The primary factor is whether your map can be distributed to enough users to bring in a beneficial number of map corrections through active crowdsourcing. Google’s distribution channel is massive in terms of size and reach, while Navteq’s distribution channel is modest, being limited to its Map Reporter and similar relatively unknown resources. Yes, users of Navteq data can and do refer their users to Map Reporter or provide other access for contributing error reports, but, in total, these inputs are insignificant.
So Why Are These Things Important?
Field operations are extremely expensive. While the data quality advantage that Navteq now maintains over Google and everyone else is a competitive advantage, it is unclear to me that Navteq has the financial resources to maintain this expensive advantage.
If you remember back to 2008, one of the reasons that Nokia acquired Navteq was that Nokia was contributing mightily to the Navteq revenue stream and was trying to negotiate lower rates, but Navteq would not budge. The negotiation, then, took an unexpected turn, and Nokia told Navteq, in effect, “Your data is too expensive, so we’re going to buy the company.”
Guess what happened next. Yep, those post-acquisition intercompany transfer prices produced the pricing concessions that Navteq would not offer in the license negotiations. Guess what? Yes, Navteq’s revenue stream has suffered because of this process.
Nokia’s latest innovation is that it has decided not to run Navteq like a for-profit, stand-alone business, but to integrate it into some ill-conceived Nokia business unit called “Location and Commerce” (LAC – now they just need to add knowledge and they can call the business LACK) under Michael Halbherr. Larry Kaplan, who as CEO has managed Navteq for the last couple of years and understood the complexities of the business and its business model, will depart the company at the end of the year. Michael Halbherr apparently has limited experience in building map databases or in managing map database companies, but heck, he has familiarity with selling products that use the data, from his stint as CEO of gate5 AG, another of Nokia’s “successful” acquisitions. Good luck, Michael, and please try not to kill off the only substantive alternative to Google Maps.
Hmm. I guess that means that Navteq better get used to hearing “Your field operations cost too much and you need to reduce expenses because the Location and Commerce business isn’t producing any revenue other than the revenue you are generating. Of course, your revenue is what is funding our amazing, Finnish-style, sauna-driven management structure and market-mismanagement structure. And, by the way, why aren’t you using more of that low-cost active crowdsourcing as a replacement for your field operations?” (If this question is asked, those of you at Navteq should take a quick peek at the slideshow mentioned in my last blog.)
You may remember that in 2008 I told you that Nokia wanted to become an advertising company. This statement was based on my assumption that the only way to make money with maps and phones would be to sell advertisements to people who are searching for local attractions. I concluded then that Nokia did not have the wherewithal to make this transition, and I think that opinion is still valid. Nokia’s CEO Stephen Elop indicated that “Focusing on location and commerce is a natural next step in Nokia’s Services journey.” I guess somebody must be familiar with the amazing successes of “Nokia’s Services journey,” but I am having a hard time coming up with any service-related success that the company has achieved.
In my last blog, I mentioned the possibility of the SS Navteq being set afloat, based on rumors that I had heard from contacts at several conferences held over the last few weeks. We all knew that something unpopular was going on at Navteq, but my guess was made before I became aware of the Nokia reorganization of Navteq. As a consequence, a change in ownership might not be in the immediate future. On the other hand, I suspect that within the next two years, Nokia itself will be sold, or will decide that it is unprepared to successfully compete in the location and commerce business and sell that business to someone, say, like NavInfo.
The real problem here is that Nokia is slowly beating the competitiveness out of Navteq and its staff. In addition, I suspect that Google will eventually find the right strategy in crowdsourcing and data mining and begin to create data that can actually be used for navigation, as opposed to routing over a table of points, as it does today. When this happens, Google will be able to compete with Navteq across a wide variety of markets. Navteq, of course, could maintain, or even extend, its current lead over Google, but it appears that malaise has gripped the organization, and it may be that the company’s workforce no longer believes that it can maintain its market-leading position. From a practical point of view, I believe that sentiment to be untrue, but I can understand why the Navteq workforce has doubts about the future.
Let’s see how things have worked out in the world of navigation map databases. TomTom acquired and then killed Tele Atlas through mismanagement brought on by financial difficulties. Nokia acquired and is killing Navteq through general incompetence and financial difficulties.
Who will be left in the world of mapping? Google, that’s who. So Microsoft, if you want to have mapping and routing in the future, either buy Nokia or adapt OSM and do what you can with it. Of course, it might just be cheaper to buy MapQuest, since they already seem to know what to do with OSM.
Happy trails – and Navteq, I’m hoping you will not abandon your market leading position without a fight, although that might involve fighting Google and Nokia.
Now, it’s time for me to fight the final boss in Red Faction Armageddon. Hope I survive.
Next time, unless someone beats me on the head, I plan to explore some new topics. Stay tuned.