More On Google’s User Generated Content Tower Of Power
As I noted last time, User Generated Content could be the data gathering tool that lets Google surpass NAVTEQ and TeleAtlas in quality of data and spatial coverage. The potential "fly in the ointment" is this: how "good" does the map data that Google is collecting need to be? This "fitness for use" question is difficult for an outsider to answer, but we can make some assumptions. Presumably, Google considers the data in the Google-Mapbase to be fit for mapping, routing, navigation and route guidance. If not, why would it have dumped TeleAtlas?
On the other hand, the data in the current version of the Google-Mapbase appears to me to be of lesser quality than that provided by TeleAtlas and a wider gap may exist between Google and NAVTEQ in terms of map accuracy, especially in the currentness of map attributes. How should we think about this issue?
Perhaps this conundrum is an example of the situation that other pundits call "good enough": the belief that the navigation market may be driven to produce map databases of less accuracy than the "high precision map databases" that will be needed to support Advanced Driver Assistance Systems (ADAS) targeted at driver safety and the advanced Energy Management (EM) systems targeted at efficient use of the power/drive train in vehicles.
I guess this recently exposed “good enough” argument means that Google got rid of the data from its supplier of navigable map databases (TeleAtlas) so they could collect and publish inferior data. You know, more people should read this blog.
Google, by its own admission, could not find a way to get its supplier (TeleAtlas) to actively work to enhance the quality of its map database and resolve the inadequacies that Google had been complaining about over a lengthy period. Although Google parted ways with NAVTEQ for strategic reasons after Nokia acquired NAVTEQ, it was quite clear that Google was unimpressed by the quality of NAVTEQ’s data during the period it was a licensee of NAVTEQ.
Clearly, Google took a run at creating a navigable map database in order to improve the accuracy of their maps, navigation and route guidance capabilities. Does anyone honestly think that the company will not industriously endeavor to enhance the Google-Mapbase? Really!
Another group of pundits is claiming that now that they have had an opportunity to really examine the Google Maps Navigation application (a beta) that it is “behind” on several features and that Google will need to update its application to be competitive with features offered by other providers of navigation services. Give me a break. Does anyone honestly think that Google is going to stop its cycle of continuous improvement? Google’s application will improve and Google will continue to deliver innovative products and concepts as part of programs designed to enhance its ability to deliver targeted advertising to its customers wherever they may be on whatever device they may be using, even when they switch between devices.
It is my opinion that the "good enough" argument and the "inferior application" argument reflect a lack of understanding of the potential revolution that Google is attempting in the collection of map data for navigation quality databases. Google's current applications may not have all of the features of PNDs or even other navigation systems on phones. The reason they are lacking these features is that Google does not yet have the attribute data that would allow them to provide posted speed limits, avoid toll routes, take scenic routes, or offer other features that are data dependent. So, the really interesting question is "Does Google have the right approach to maintaining a navigation quality map database?" In essence, "Can Google overcome the 25 year head start enjoyed by NAVTEQ and TeleAtlas?" Of course it can.
We looked at my proposed model of Google data revision activities last time and now will discuss more details on the model.
I noted that User Generated Content (UGC) will become the "quarterback" of Google's data collection efforts. My belief is that UGC, in the form of customer complaints, map corrections, map additions, business listing updates, probe data, Google Map Maker and Street View Advertisements, will help focus Google on the weaknesses of its map database and prescribe where Google needs either to mine its Street View data or send its Street View vehicles to gather additional data. Alternatively, Google could use UGC as a diagnostic indicating where it needs to search for and, then, conflate better attribute data than it used in its original pass through a geographic area.
The potential use of “billboard” space in Street View images appears in a patent issued to Google recently and could benefit Google by enhancing its address database and business listing database through improved communication with business owners and neighborhood interest groups likely to use this advertising service. (Some may not know this, but Google AdWords has a branch that makes online advertising available to charitable institutions in an effort to help these groups publicize their charities. Imagine all of the good local information Google might be able to learn by assisting charities with their “local” advertising activities using Street View as inventory.)
Mining the “collective intelligence of Google users” is a longer term play for the company, but one that will reap substantial benefits when the Google Maps Navigation application succeeds in convincing users of phones equipped with the application to use Google rather than other navigation alternatives. If the price (free) were not enough of an incentive, users will likely find that Google’s database is more up-to-date and more accurate than the databases provided by others in the marketplace. This argument takes us back to the notion that Google doesn’t have better algorithms, it just has more data to mine and more data mining means improved navigation data, over time.
UGC, however, is generally unstructured in a spatial sense, meaning the user controls the changes and selects the area where change data is reported. In essence, when UGC is an "active" process Google responds to the changes reported, but has no ability to direct the geographic tendencies of its map error reporters. This stands in contrast to the field efforts of several mapping companies, in which the company actively directs its field teams to canvass geographic areas based on reports of errors and, also, on the basis of a comprehensive collection process that attempts to re-canvass all map coverage over time.
It is here that we need to remember that UGC works best when it is governed by the law of large numbers – when you reach the tipping point, all bugs become shallow or, in respect to the present topic, all map corrections become shallow. The most important advantage that will accrue to Google in updating its Google-Mapbase is its future use of probe data, based on the potential input of the predicted number of users of a free navigation service available on the Android platform.
Probe data (following the bread-crumb trail of GPS signals registered by and locating your phone in space) can best be thought of as a change detection generator. When roads are closed for construction, probe data will immediately reflect the situation. When new roads are opened, probe data will immediately reflect this change by providing traces of movement in areas previously empty of such traces. If a traffic artery has been converted to one-way, probe data will immediately reveal the absence of the previously normal two-directional traffic. While probe data is not a panacea for improving map revision practices, it is a mechanism that will take much of the guesswork out of where Google should deploy field collection assets, like Street View, to create improved map data.
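The change-detection logic described above is simple enough to sketch. The following is an illustrative toy, with made-up segment names and thresholds, assuming probe traces have already been matched to road segments and counted per period; it flags likely closures (busy before, silent now) and likely new roads (traffic where the baseline had no segment at all).

```python
def detect_changes(baseline, current, min_count=5):
    """Compare probe-trace counts per road segment across two periods.
    baseline/current map segment id -> number of probe traces observed.
    Segments busy before but silent now suggest closures; traffic on
    segments absent from the baseline suggests newly opened roads."""
    closed = [s for s, n in baseline.items()
              if n >= min_count and current.get(s, 0) == 0]
    opened = [s for s, n in current.items()
              if n >= min_count and s not in baseline]
    return closed, opened

# Hypothetical counts: elm_st goes quiet, new_bypass appears from nowhere.
baseline = {"elm_st": 120, "oak_ave": 80}
current = {"oak_ave": 75, "new_bypass": 40}
closed, opened = detect_changes(baseline, current)
print(closed, opened)  # ['elm_st'] ['new_bypass']
```

A real system would also compare per-direction counts to catch the one-way conversion case, and would treat these flags as leads for field verification rather than automatic edits.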
Of course, the law of large numbers may work to Google’s disadvantage, especially if a large number of users object to having their paths tracked and saved in a Google data center. Even today, TomTom/TeleAtlas, whose MapShare program benefits from probe data collected from users who have opted-in to the service, strips the first two and last two minutes of travel from the probe paths they capture, to provide some degree of anonymity to the contributors of their data. While the data is contributed anonymously, it is clear that the only person leaving 6 Sesame Street each morning and returning to 6 Sesame Street each evening is likely a resident of that address. We will have to see how this issue plays out, but if it plays out in Google’s favor, it will be a game changer, especially when added to the other practices they use to gather map data.
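The trimming practice described above is easy to illustrate. This is a minimal sketch of the general idea (my own code, not TomTom's implementation): drop every probe point within two minutes of the start and end of a trace, so the trip's endpoints, and hence the contributor's likely home or workplace, never enter the database.

```python
def anonymize_trace(points, trim_seconds=120):
    """Strip points within trim_seconds of the start and end of a probe
    trace. points is a list of (timestamp_seconds, lat, lon) tuples in
    time order; the returned trace hides where the trip began and ended."""
    if not points:
        return []
    t_start = points[0][0]
    t_end = points[-1][0]
    return [p for p in points
            if p[0] - t_start >= trim_seconds and t_end - p[0] >= trim_seconds]

# A hypothetical 10-minute trip sampled once a minute.
trace = [(t, 41.0 + t * 1e-4, -87.0) for t in range(0, 601, 60)]
trimmed = anonymize_trace(trace)
print(trimmed[0][0], trimmed[-1][0])  # 120 480
```

As the "6 Sesame Street" example shows, this is only partial protection: the interior of a repeated daily route can still be identifying, which is why the opt-in and data-retention questions matter so much.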
I fully expect that Google will soon consider paying a select group of its UGC contributors for the data corrections they provide. Google, by creating an effective and feedback-equipped UGC map correction system, has enlisted the efforts of a large group of people known in the industry as those suffering from Cartosis! These poor folks are map-a-holics who just cannot get enough cartography. They love maps and would like nothing better than to spend their day editing the darn things. (Believe me, as former Chief Cartographer for Rand McNally, I was inundated with their letters demanding corrections, additions, deletions and various insights on our cartographic practices, as well as my personal lineage).
Every article I read on Google's mapping efforts suggests that Google is benefitting from the contributions of these geo-specialists in ways that are eluding others in the map database field. Google is earning the good will of these people by incorporating their comments and making the changes they contribute visible on Google's map displays in a relatively short time. If Google can harness the good will of these folks, perhaps with a modest stipend or simply an acknowledgement, they just might become the company's best mapping-buds. While this may sound humorous to some, these people are very good at knowing their local area and sometimes exhibit significant levels of familiarity with broad geographic areas.
I suspect you are wondering why I am spending so much time on this group. Well, the one big problem for Google is that they do not really have a field collection force to gather data in areas where they know they are weak. Yes, they can send out Street View vehicles, but this is an inefficient way to resolve spatial problems that can be solved by local people. If I were Google, I’d be thinking very hard about how to incentivize their UGC data pros.
And Now, Apple?
How about this for a change in direction? Now that Apple has released its iPad (what an awful name), maybe it will turn to thinking about how to use PlaceBase, the small mapping company it acquired last year. PlaceBase did some nice data integration and was known for its data visualization efforts. However, the company did not support routing directly, although its API provided hooks for integrating routing from other services. It seems to me that Apple needs to work on mapping, routing and geography in general, but no one seems to know what they might be doing with PlaceBase. Any ideas? And don't suggest the PlacePad, but maps on the iPad – now that's a good idea.