MapQuest on Botox, Tele Atlas on Detox
Mike Blumenthal, publisher of the nifty blog Understanding Google Maps & Local Search, recently called my attention to Greg Sterling’s post on Search Engine Land titled “MapQuest Advances ‘Open’ Strategy.” Greg explained that MapQuest was continuing down the “Open” path with OpenStreetMap, announcing an expansion of its use of OSM data to new “open” websites in France, Italy, Germany and Spain, in addition to its existing US and UK open-based efforts. Mike asked for my thoughts on the MapQuest strategy. I emailed Mike a long response (seems to be a trend with me) and thought I would expand it for you. Today’s blog (including the addition of comments on Tele Atlas) is the result.
The nub of the situation is that MapQuest is running parallel websites in Europe (at least in the UK, France, Italy, Germany and Spain), in which the Open sites use OSM data and the regular MapQuest sites continue to use “commercial grade” data licensed from NAVTEQ and others.
So what is MapQuest trying to accomplish?
The author of the MapQuest blog announcing the original “Open” site in the UK states that this effort will ultimately lead to what they “…believe will be the best and most accurate mapping experience for all.” At about the same time, MapQuest announced a $1 million fund to support the growth of open-source mapping in the United States, specifically in the local communities that patch.com covers. As you may know, Patch is a community-specific news and information website and part of AOL. More information on MapQuest’s Open strategy can be found on Slideshare, where you can view the presentation AOL made at the 2010 OSM State of the Map meeting held in Girona, Spain, or the later State of the Map meeting in the U.S.
My opinion is that MapQuest is using the Open tactic primarily to attract or perhaps appeal to a new, younger audience. Many regard MapQuest as the Rand McNally (and I should know) of the online mapping world. Their users are commonly older, relatively unsavvy when it comes to computers, brand conscious and known to be faithful to the brands they support (but unfortunately, dying off). I think MapQuest’s “Open” strategy is an attempt to capture younger users, particularly those attracted to the lure of crowdsourcing, free map data (OSM) and open software. In some sense, MapQuest’s support of OSM is MapQuest using Botox to look younger.
Another reason for MapQuest’s use of OSM is that it acknowledges the company’s need to support Patch and other AOL initiatives that are focused on Local Search. Clearly, it would benefit local Patch sites if people could edit, update or augment maps of their community to reflect the street- and road-level changes that are missing in the map databases provided by NAVTEQ and Tele Atlas.
At present, MQ is running both the sites supported by commercial vendors and the ones supported by OSM. I suspect that cost is not a factor in the decision, as this is a marketing strategy. At some point, OSM may become of high enough quality to use for routing in the US, but I think that will take a very long time to occur.
It has occurred to me that MQ may be wondering what level of accuracy is needed to provide a successful mapping and routing website that is accepted by consumers. After all, they have seen Google provide a sub-par database that does not seem to have negatively influenced the commercial success of its mapping and mapping-related products. Perhaps MQ is testing the waters to see if they, too, could degrade the quality of their product and still be successful in the market. For example, the OSM map database is good where there are lots of people and contributors, and not so good where there are not. Since most users are in urban areas, and OSM can be quite good in urban areas, could its data be used to please most, but not all, of the people who need routes for navigation? I think not, but MQ may be interested in looking at this issue from a very specific commercial viewpoint. After all, I am not sure that you will ever have a Patch page for Stull, Kansas.
I am trying to find out more information on the AOL funding of OSM and have sent for the information on applying for the MapQuest grants – just to see what kind of support they are interested in. I think they are looking for apps, but it will be quite interesting if they are looking at upgrading the data. On the other hand, my email went off over ten days ago, so maybe I don’t have the right stuff anyway.
One of the interesting issues for me is how often MQ will update the OSM data. According to the MQ Developer blog, “We started out, as anyone else would, with the latest dump of the OSM Planet data. We grabbed OSM2PGSQL, a python script used for data conversion, and set ourselves up with PostgreSQL with postGIS extensions (a library that adds a lot of geographical functions and datatypes to PostgreSQL).” MQ then went through a number of complicated gyrations to process these data, and it does not sound to me like they will be running a live version of OSM anytime soon.
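To give a sense of what that conversion step is chewing on, here is a minimal sketch of the kind of raw material in an OSM planet dump: bare nodes, plus ways that reference those nodes by id, which an importer such as osm2pgsql must join together before a road can become a line geometry in PostGIS. The ids, coordinates and tags below are invented for illustration, not taken from the actual planet file.

```python
import xml.etree.ElementTree as ET

# A tiny, hand-made fragment in OSM planet XML format: two nodes (points)
# and one way (a road) that references them by id. All values are
# invented for illustration.
osm_xml = """<osm version="0.6">
  <node id="1" lat="41.82" lon="-71.40"/>
  <node id="2" lat="41.83" lon="-71.41"/>
  <way id="10">
    <nd ref="1"/>
    <nd ref="2"/>
    <tag k="highway" v="motorway"/>
    <tag k="ref" v="I 195"/>
  </way>
</osm>"""

root = ET.fromstring(osm_xml)

# Index node coordinates by id -- the join an importer has to perform
# before a way can be turned into a line geometry.
nodes = {n.get("id"): (float(n.get("lat")), float(n.get("lon")))
         for n in root.iter("node")}

for way in root.iter("way"):
    coords = [nodes[nd.get("ref")] for nd in way.iter("nd")]
    tags = {t.get("k"): t.get("v") for t in way.iter("tag")}
    print(way.get("id"), tags.get("ref"), coords)
```

Multiply this by the hundreds of millions of nodes in the full planet file and you can see why MQ describes the processing as a set of complicated gyrations rather than something done live.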
So, how often will they update their version of the OSM database and take advantage of the contributions of their Patch or MapQuest Open users to OSM? Seems to me that the cycle might be slow and updates done only a few times a year. Will Patch users be satisfied with new updates on a quarterly basis? If not, the whole experiment may blow up in MapQuest’s face and the Botox may melt (unless they get a lot closer to OSM – or Cloudmade, perhaps).
At present, it appears that the OSM content shown on Patch maps cannot be edited. What the user can do is list an event and have it appear as an overlay on the OSM base map. Given this set of circumstances, exactly what process will MapQuest use to show the local changes that people want to make to the base map that appears on the Patch websites? I guess you would need to go to OSM, contribute your update and hope that your change survives their automatic edit-check algorithms and the actions of other map updaters who may think your edit is incorrect. Even if your correction survives, how long, do you suppose, will it take for that correction to get back to AOL for recompilation, be pushed to a live site and then become available to Patch users? Maybe there is something to this association that has not yet been revealed?
So the obvious question here is how often MapQuest will update its Open site to reflect the updates of its users. Of course, this consideration raises the question of how often anybody online, or maybe even offline, updates their website or products with the most recent version of the map database available to them. I think the answer is, at best, once a quarter, and likely much worse than that for most services. (Of course the exception here is Google, since they seem to publish map changes almost daily in an attempt to make up for the fact that they cannot control their data. Even when they do make new changes, the changes go missing from one day to the next anyway. Is it possible that Google Maps has borrowed the “I’m Feeling Lucky” button from their search page and applied it to each new request for a map?)
This problem of updating navigation websites got me thinking about the problem from a different perspective. I had believed that the problems we saw earlier this year, when the realignment of I-195 in Providence, Rhode Island was not displayed in a timely manner by any of the major online mapping providers, were caused more by the delays in compiling the navigation databases into runtime versions than by an inability to capture the change data. As evidence, consider that NAVTEQ had the new alignment in the database it released in January, but it took over six months for the sites that used its data (including NAVTEQ’s own corporate website) to reflect the changes. Of course, Ovi Maps (Nokia) still does not have the new alignment corrected and is merrily routing cars across roads that no longer exist, or, at least, ones that are no longer open to vehicular traffic. So for Ovi, ten months without renewing their navigation data seems to be the cycle that meets their customers’ needs. Really!
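The arithmetic behind that kind of lag is easy to sketch. The dates and cycle lengths below are my own assumptions for the sake of illustration, not any vendor’s actual schedule: a change captured just after a quarterly editing cutoff has to wait for the next release, and then wait again for the consuming website’s own compile-and-publish cycle.

```python
from datetime import date, timedelta

# Hypothetical illustration only: dates and cycle lengths are assumptions,
# not any vendor's actual release schedule.
change_captured = date(2010, 1, 15)      # vendor captures the new road alignment
quarterly_cutoffs = [date(2010, m, 1) for m in (3, 6, 9, 12)]

# The change only makes the first quarterly release whose editing cutoff
# falls after the change was captured.
release = next(c for c in quarterly_cutoffs if c > change_captured)

# The consuming website then needs its own compile/publish cycle on top.
site_build = timedelta(days=90)          # assumed downstream build cycle
live = release + site_build

lag = (live - change_captured).days
print(release, live, lag)                # 2010-03-01 2010-05-30 135
```

Chain a second consumer onto the end, say a PND maker building its own product from the quarterly database, and you are quickly into the six-to-eleven-month territory seen in Providence.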
So How Does This Link To Tele Atlas?
Recently I wrote a complimentary blog about Tele Atlas and the benefits that the company should be accruing due to its use of MapShare and Active and Passive Community Input. I guess I am going to have to eat crow on that, because TA seems unable to apply the data its users are gathering to improve its own products.
I probably did not tell you that a few months ago I ordered a very low cost TomTom unit from Amazon. It was on sale and offered lifetime free maps and lifetime free traffic, all for around a hundred bucks. Well, it appears that it is going to take the rest of my lifetime for me to get the changes that were made to roads months and months ago.
One of the first times I used the new TomTom unit, it tried to take me down a one-way street, but going in the wrong direction. In another case, it told me to turn left to continue my route. Unfortunately, I was already on the correct street, which takes a long, leisurely loop to the left that the TA database thinks is a left turn. By following the turn instruction, I turned off the street that I needed to be on, as no left turn was required to continue on the planned route.
Now, most of you would not have been too upset about this, but I had just updated the TomTom unit to the latest, greatest map version available from TomTom for the U.S., released in September 2010.
Let’s see, September is approximately 11 months after the changes in Providence occurred. Well, even with all that MapShare technology I wrote about, and the examples I showed of how they used time-slicing to find the Providence changes, they actually seem to have missed them. Yep, the changes to I-195 are not in the most recent TomTom database released for its PNDs. How about that?
You know I could not resist inquiring, so I sent Pat McDevitt of TomTom an email asking how this was possible. In addition, I pointed out that the changes to Providence were not in the database used by the TomTom online Route Planner.
Pat responded as follows:
“The geometry updates to I-95 in Providence were not completed in time for the release of our June MultiNet product (highway geometry editing actually stopped in early May) but are included in the current September release. (I can send you a screenshot if you’d like proof!).”
“TomTom Route Planner should be updated with the September release of MultiNet in the course of Q4.”
I replied to Pat with this comment:
“Can I assume that the September North American Database released by TomTom for its PNDs was built on the June Multinet product and that this is why the Providence interchange is missing for the company’s latest PND update? Or is there some other problem?”
Pat answered:
“You are correct – we release the map database each quarter to our industry customers (and our own consumer division) and they, in turn, build their products based on that content according to their release schedule.”
“So the commercial map release you’re looking at was almost certainly built with our June MultiNet product.”
It is hard to imagine that a company would invest in technology as potentially useful as MapShare and then not have the time or money to analyze the data in order to produce a comprehensively updated navigable map database.
TomTom’s market cap is now approximately €1 billion. You do remember that they agreed to buy Tele Atlas in 2007 for approximately €2.9 billion? So today you could buy both companies for approximately one-third of what TomTom planned to pay for TA in 2007.
I presume that TomTom’s “decline in fortunes” has negatively impacted Tele Atlas and its ability to finance a comprehensive update of its database. Will someone just buy Tele Atlas and fix it? Please? It has lots of promise, but not if TomTom’s management is unwilling or, perhaps, unable to finance the needed changes.
Now for a final thought. Since there is a significant lag between the content you find at online map sites and the content that is actually in the providers’ current versions of these databases, who wants to win the online derby by selling transactions from an up-to-date database, avoiding the whole problem of these compilation cycles? I had thought that the answer was NAVTEQ, but maybe not.
Well, off to San Francisco in the morning (Monday) for the GPS Wireless 2010 meeting on Tuesday, followed by CTIA and the Locations and Beyond Summit on Wednesday. If I find out anything interesting, I’ll write. Sorry if any typos popped up in today’s blog, but it is now 1 AM and I need to get some sleep before my flight.
Posted in Authority and mapping, CloudMade, Data Sources, MapQuest, Mapping, Mike Dobson, Navteq, Nokia Ovi Maps, OSM, Tele Atlas, TomTom, crowdsourced map data, map compilation, openstreetmap, routing and navigation