Exploring Local
Mike Dobson of TeleMapics on Local Search and All Things Geospatial

MapQuest on Botox, Tele Atlas on Detox

October 3rd, 2010 by MDob

Mike Blumenthal, publisher of the nifty blog Understanding Google Maps & Local Search, recently called my attention to Greg Sterling’s post on Search Engine Land titled “MapQuest Advances ‘Open’ Strategy.” Greg explained that MapQuest was continuing down the “Open” path with OpenStreetMap, announcing an expansion of its use of OSM data to new “open” websites in France, Italy, Germany and Spain, in addition to its existing US and UK open-based efforts. Mike asked for my thoughts on the MapQuest strategy. I emailed Mike a long response (seems to be a trend with me) and thought I would expand it for you. Today’s blog (including the addition of comments on Tele Atlas) is the result.

The nub of the situation is that MapQuest is running parallel websites in Europe (at least in the UK, France, Italy, Germany and Spain), in which the Open sites use OSM data and the regular MapQuest sites continue to use “commercial grade” data licensed from NAVTEQ and others.

You might want to take a look at the results: MapQuest’s Open website for the UK can be found here and their regular site can be found here, for a quick comparison.

So what is MapQuest trying to accomplish?

The author of the MapQuest blog announcing the original “Open” site in the UK states that this effort will ultimately lead to what they, “…believe will be the best and most accurate mapping experience for all.” At about the same time, MapQuest announced a $1 million fund to support the growth of open-source mapping in the United States, specifically in the local communities that Patch.com covers. As you may know, Patch is a community-specific news and information website and part of AOL. More information on MapQuest’s Open strategy can be found on Slideshare, where you can view the presentation AOL made at the 2010 OSM State of the Map meeting held in Girona, Spain, or at the later State of the Map meeting in the U.S.

My opinion is that MapQuest is using the Open tactic primarily to attract or perhaps appeal to a new, younger audience. Many regard MapQuest as the Rand McNally (and I should know) of the online mapping world. Their users are commonly older, relatively unsavvy when it comes to computers, brand conscious and known to be faithful to the brands they support (but unfortunately, dying off). I think MapQuest’s “Open” strategy is an attempt to capture younger users, particularly those attracted to the lure of crowdsourcing, free map data (OSM) and open software. In some sense, MapQuest’s support of OSM is MapQuest using Botox to look younger.

Another reason for MapQuest’s use of OSM is that it acknowledges the company’s need to support Patch and other AOL initiatives that are focused on Local Search. Clearly, it would benefit local Patch sites if people could edit, update or augment maps of their community to reflect the street- and road-level changes that are missing in the map databases provided by NAVTEQ and Tele Atlas.

At present, MQ is running both the sites supported by commercial vendors and the ones supported by OSM. I suspect that cost is not a factor in the decision, as this is a marketing strategy. At some point, it is possible that OSM will become of high enough quality to use for routing in the US, but I think that this will take a very long time to occur.

It has occurred to me that MQ may be wondering what level of accuracy is needed to provide a successful mapping and routing website that is accepted by consumers. After all, they have seen Google provide a sub-par database that does not seem to have negatively influenced the commercial success of their mapping and mapping related products. Perhaps MQ is testing the waters to see if they, too, could degrade the quality of their product and still be successful in the market. For example, the OSM map database is good where there are lots of people and contributors and not so good where there are not a lot of people or contributors. Since most users are in urban areas and OSM can be quite good in urban areas, could their data be used to please most, but not all, of the people who need routes for navigation? I think not, but MQ may be interested in taking a look at this issue from a very specific commercial viewpoint. After all, I am not sure that you will ever have a Patch page for Stull, Kansas.

I am trying to find out more information on the AOL funding of OSM and have sent for the information on applying for the MapQuest grants – just to see what kind of support they are interested in. I think they are looking for apps, but it will be quite interesting if they are looking at upgrading the data. On the other hand, my email went off over ten days ago, so maybe I don’t have the right stuff anyway.

One of the interesting issues for me is how often will MQ update the OSM data? According to the MQ Developer blog “We started out, as anyone else would, with the latest dump of the OSM Planet data. We grabbed OSM2PGSQL, a python script used for data conversion, and set ourselves up with PostgreSQL with postGIS extensions (a library that adds a lot of geographical functions and datatypes to PostgreSQL).” MQ, then, went through a number of complicated gyrations to process these data and it does not sound to me like they will be running a live version of OSM anytime soon.
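For readers curious what that import step looks like mechanically, here is a minimal sketch in Python. It parses a tiny inline fragment of OSM XML and loads it into SQLite, which stands in for the PostgreSQL/PostGIS setup MQ describes; the table layout and names are illustrative only, not MQ’s actual schema.

```python
# Miniature sketch of an OSM "planet" import. SQLite stands in for
# PostgreSQL/PostGIS; schema and names are illustrative, not MapQuest's.
import sqlite3
import xml.etree.ElementTree as ET

OSM_FRAGMENT = """<osm version="0.6">
  <node id="1" lat="41.8240" lon="-71.4128"/>
  <node id="2" lat="41.8251" lon="-71.4100"/>
  <way id="10">
    <nd ref="1"/>
    <nd ref="2"/>
    <tag k="highway" v="motorway"/>
    <tag k="ref" v="I 195"/>
  </way>
</osm>"""

def import_fragment(xml_text):
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE nodes (id INTEGER PRIMARY KEY, lat REAL, lon REAL)")
    db.execute("CREATE TABLE way_tags (way_id INTEGER, k TEXT, v TEXT)")
    root = ET.fromstring(xml_text)
    # Nodes carry the geometry; ways reference nodes and carry the tags.
    for node in root.iter("node"):
        db.execute("INSERT INTO nodes VALUES (?, ?, ?)",
                   (int(node.get("id")), float(node.get("lat")), float(node.get("lon"))))
    for way in root.iter("way"):
        wid = int(way.get("id"))
        for tag in way.iter("tag"):
            db.execute("INSERT INTO way_tags VALUES (?, ?, ?)",
                       (wid, tag.get("k"), tag.get("v")))
    return db

db = import_fragment(OSM_FRAGMENT)
print(db.execute("SELECT COUNT(*) FROM nodes").fetchone()[0])            # 2
print(db.execute("SELECT v FROM way_tags WHERE k='ref'").fetchone()[0])  # I 195
```

The real pipeline does this for the full planet file (hundreds of millions of elements), which is why recompiling rather than streaming updates matters so much for freshness.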

So, how often will they update their version of the OSM database and take advantage of the contributions of their Patch or MapQuest Open users to OSM? Seems to me that the cycle might be slow and updates done only a few times a year. Will Patch users be satisfied with new updates on a quarterly basis? If not, the whole experiment may blow up in MapQuest’s face and the Botox may melt (unless they get a lot closer to OSM – or Cloudmade, perhaps).

At present, it appears that the OSM content shown on Patch maps cannot be edited. What the user can do is list an event and have the event appear as an overlay on the OSM base map. Given this set of circumstances, exactly what is the process that MapQuest will use to show the local changes that people want to make to the base map that appears on the Patch websites? I guess you would need to go to OSM, contribute your update and hope that your change survives their automatic edit-check algorithms and the actions of other map updaters who may think your edit is incorrect. Even if your correction survives, how long, do you suppose, will it take for that correction to get back to AOL for recompilation, be pushed to a live site and then become available to Patch users? Maybe there is something to this association that has not yet been revealed?

So the obvious question here is how often will MapQuest update its Open site to reflect the updates of its users? Of course, this consideration raises the question of how often anybody online, or maybe even offline, updates their website or products with the most recent version of the map database available to them. I think the number is, at best, once a quarter and likely much worse than that for most services. (Of course the exception here is Google, since they seem to publish map changes almost daily in an attempt to make up for the fact that they cannot control their data. Even when they do make new changes, the new changes go missing from one day to the next anyway. Is it possible that Google Maps has borrowed the “I’m Feeling Lucky” button from their search page and applied it to each new request for a map?)

This problem of updating navigation websites got me to thinking about the problem from a different perspective. I had believed that the problems we saw earlier this year, with the realignment of I-195 in Providence, Rhode Island not being displayed in a timely manner by any of the major online mapping providers, was caused more by the delays in compiling the navigation databases into runtime versions, than by an inability to capture the change data. As evidence of this, consider that NAVTEQ had the new alignment in their database released in January, but it took over six months for the sites that used their data (including their corporate website) to reflect these changes. Of course, OVI MAPS (NOKIA) still does not have the new alignment corrected and is merrily routing cars across the roads that no longer exist, or, at least, ones that are no longer open to vehicular traffic. So for OVI, ten months without renewing their navigation data seems to be the cycle that meets their customers’ needs. Really!

So How Does This Link To Tele Atlas?

Recently I wrote a complimentary blog about Tele Atlas and the benefits that the company should be accruing due to its use of MapShare and Active and Passive Community Input. I guess I am going to have to eat crow on that, because TA seems unable to apply the data its users are gathering to improve their own company’s products.

I probably did not tell you that a few months ago I ordered a very low cost TomTom unit from Amazon. It was on sale and offered lifetime free maps and lifetime free traffic, all for around a hundred bucks. Well, it appears that it is going to take the rest of my lifetime for me to get the changes that were made to roads months and months ago.

One of the first times I used this new TomTom unit, it tried to take me down a one-way street – but going in the wrong direction. In another case, it told me to turn left to continue my route. Unfortunately, I was on the correct street, but it takes a long, leisurely loop to the left, which the TA database thinks is a left turn. By following the turn instruction, I turned off of the street that I needed to be on, as no left turn was required to continue on the planned route.

Now, most of you would not have been too upset about this, but I had just updated the TomTom unit to the latest, greatest map version available from TomTom for the U.S., which was released in September 2010.

Let’s see, September is approximately 11 months after the changes in Providence occurred. Well, even with all that MapShare stuff that I wrote about and the examples I showed illustrating how they used time-slicing to find the Providence changes, they actually seem to have missed them. Yep, the changes to I-195 are not in the most recent TomTom database released for its PNDs. How about that?

You know I could not resist inquiring, so I sent Pat McDevitt of TomTom an email asking how this was possible. In addition, I pointed out that the changes to Providence were not in the database used by the TomTom online Route Planner.

Pat responded as follows:

“The geometry updates to I-95 in Providence were not completed in time for the release of our June MultiNet product (highway geometry editing actually stopped in early May) but are included in the current September release. (I can send you a screenshot if you’d like proof!).”

“TomTom Route Planner should be updated with the September release of MultiNet in the course of Q4.”

I replied to Pat with this comment:

“Can I assume that the September North American Database released by TomTom for its PNDs was built on the June Multinet product and that this is why the Providence interchange is missing for the company’s latest PND update? Or is there some other problem?”

Pat replied:

“You are correct – we release the map database each quarter to our industry customers (and our own consumer division) and they, in turn, build their products based on that content according to their release schedule.”

“So the commercial map release you’re looking was almost certainly built with our June MultiNet product.”

It is hard to imagine that a company would invest in technology as potentially useful as MapShare and then not have the time or money to analyze the data in order to produce a comprehensively updated navigable map database.

TomTom’s market cap is now approximately €1 billion. You do remember that they agreed to buy Tele Atlas in 2007 for approximately €2.9 billion? So today you could buy both companies for approximately one-third of what TomTom planned to pay for TA in 2007.

I presume that TomTom’s “decline in fortunes” has negatively impacted Tele Atlas and its ability to finance a comprehensive update of its database. Will someone just buy Tele Atlas and fix it? Please? It has lots of promise, but not if TomTom’s management is unwilling or, perhaps, unable to finance the needed changes.

Now for a final thought. Since there is a significant lag between the content you find at online map sites and the content that is actually in the providers’ current versions of these databases, who wants to win the online derby by selling transactions from their up-to-date database and avoid the whole problem with these compilation cycles? I had thought that the answer was NAVTEQ, but maybe not.

Well, off to San Francisco in the morning (Monday) for the GPS Wireless 2010 meeting on Tuesday, followed by CTIA and the Locations and Beyond Summit on Wednesday. If I find out anything interesting, I’ll write. Sorry if any typos popped up in today’s blog, but it is now 1 AM and I need to get some sleep before my flight.

Posted in Authority and mapping, CloudMade, crowdsourced map data, Data Sources, map compilation, Mapping, MapQuest, Mike Dobson, Navteq, Nokia Ovi Maps, openstreetmap, OSM, routing and navigation, Tele Atlas, TomTom

5 Responses

  1. Randy Meech

    Patch updates every minute; MapQuest every fifteen minutes.

    The AOL investment in OSM is for initiatives that improve data quality in the US. We’re hiring people to focus on this, and you can contact the team at open@mapquest.com.

    Hi Randy:

    Thanks for the comment, I appreciate your taking the time to respond. How about a little more detail (if you can do that without revealing any confidential information about your company)?

    Exactly what map information does Patch update every minute? Ditto for MapQuest every fifteen minutes. Do you know how long it took MapQuest to fully update Providence? Next, if the MapQuest update information is based on User Generated Content, will you be sharing that with OSM so that it migrates to the MapQuest Open sites, or will your users have to contribute the change information directly to OSM?

    Finally, I did receive an email this morning from Ant at MQ regarding the AOL investment. I was told that “Our goal with the fund is simple and twofold:

    – to increase the size, vibrancy, and self-sufficiency of the US OSM community

    – To improve the quality and coverage of the US OSM data through the growth and health of the US community”

    Thanks again,


  2. Antony Pegg

    Hello Mike,

    At the time of writing (10/4/2010), our OSM database does minutely updates from the OSM master. The “time to production” stats are (roughly) from the time you edit in OSM:
    – 10-15 minutes for the map tiles to update
    – 2-24 hours for the Search results to update
    – Next-day for changes to be reflected in Directions (i.e: Daily updates)
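To make Ant’s numbers concrete: OSM publishes edits as “osmChange” diffs, whose create, modify and delete blocks a consumer replays against its local copy every minute. The toy sketch below replays one such diff against an in-memory node store; the diff format is real, but the store is just a stand-in for MapQuest’s actual database.

```python
# Toy replay of an osmChange (minutely diff) against an in-memory
# node store. The XML format is real; the store is a stand-in.
import xml.etree.ElementTree as ET

OSC = """<osmChange version="0.6">
  <create><node id="3" lat="40.1069" lon="-76.3354"/></create>
  <modify><node id="1" lat="41.8245" lon="-71.4130"/></modify>
  <delete><node id="2"/></delete>
</osmChange>"""

def apply_change(store, osc_text):
    root = ET.fromstring(osc_text)
    for block in root:  # <create>, <modify> or <delete>
        for node in block.iter("node"):
            nid = int(node.get("id"))
            if block.tag == "delete":
                store.pop(nid, None)
            else:  # create and modify both upsert the element
                store[nid] = (float(node.get("lat")), float(node.get("lon")))
    return store

store = {1: (41.8240, -71.4128), 2: (41.8251, -71.4100)}
apply_change(store, OSC)
print(sorted(store))  # [1, 3]
```

The tile, search and directions latencies Ant quotes then reflect how often each downstream product re-reads this continuously updated store, not how often the diffs arrive.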

    This was our #1 priority on getting back from SOTM 2010.

    Hope that helps,

    Hi, Ant:

    First, please accept my apologies. Your comment, which was delivered two weeks ago, for some reason did not show up in my email and I did not realize it was here until I found it this morning to answer another comment.

    Thanks for the input, your comment is quite interesting.

    Since you are actually doing this updating, you have more insights than I on the process.

    What is the rationale for updating every minute (other than because you can)? What benefits accrue to this time slice rather than longer intervals? For example, are you using rapid updating as a method of attempting to discover bogus updates? The reason I ask is that an area being actively edited at OSM might change every minute, and the result would be that your tiles would update with each change and be constantly changing for the user. What happens when the changes are erroneous and not corrected on the OSM end of the process? Do you catch some of the more egregious errors in the updates to your routing engine?

    If you can speak about this without revealing any trade secrets or causing yourself any trouble with your company, please do so.

    Thanks again for your comment and my apologies for the delay.


  3. timbuktoo

    Someone already bought Tele Atlas: TomTom.
    The reason that MapQuest is using OSM data is market leverage: as the quality improves, it might need far fewer commercial licences from suppliers and therefore see a greater return on investment.

    Hi, Timbuktoo:

    Thanks for your comment.

    See our Blog in 2007 and 2008 for several columns on TomTom purchasing Tele Atlas. Just in case I was not clear enough – other companies are now considering whether they would be advantaged by purchasing TA from TomTom.

    I doubt that the cost of licensing map databases is the issue prompting MapQuest to experiment with OSM data. If you take a look at the financial evidence, the online companies that provide map and navigation data under license from NAVTEQ or Tele Atlas pay a fair fee for these data, but not an exorbitant one. I believe that the cost of licensing these data is one of their lesser expenses.

    MapQuest is looking for a differentiator, and it may be that they feel that over time OSM data will prove to be superior in accuracy to that provided by the commercial houses. Another possibility, one that I have mentioned before, is that they may be trying to find out just what level of data accuracy is “good enough” for online use. For most of these companies, their licenses to you, the end user, say the data are not fit for any use. If the public is willing to accept that canard, then what level of map data accuracy is “just good enough” for these navigation companies?

    Thanks again for your comment.


  4. Mike Dobson

    Ant got back to me with answers to the questions I had asked in his recent comment on the blog about MapQuest on Botox. His answers are very interesting and I think you will find them of great value.

    Ant – thanks again.


    What is the rationale for updating every minute (other than because you can)?

    (every 10 minutes, not per minute) Actually, half of it really is “because we can” – testing out the infrastructure, working through technical issues, seeing what the effects are of doing that, etc. I’ve always thought one of the biggest issues is: how recent is it? If I find something wrong with a map, and I tell the map owner – why do I have to wait for months before the change is reflected? This, to me, is one of the most awesome things about OSM – I can just go fix it! And then 10 minutes later…I can SEE it! There’s a new development down the road from my house. I added it to OSM in May. Last time I checked, in September, none of the commercial mapping portals (including my own) had as much as OSM did – and one of them was still a corn field.

    http://www.openstreetmap.org/?lat=40.10694&lon=-76.33543&zoom=17&layers=M or http://open.mapquest.co.uk/?le=t&hk=4-zlqvtSk1&vs=h

    feel free to check it out compared to other sites.

    Or to put it another way – I have a beautiful book of maps of the ancient world (because I am a serious Roman History buff) – it has an incredibly accurate map of the entire Via Appia, as it existed 2,000 years ago. It’s incredibly detailed, picking out the sites of many ancient landmarks, many of which do not exist today. I’m not going to use it to find my hotel if I go there on holiday in 2010. I’d much rather have an approximate location of the hotel (say, within a block) from a local.

    That was probably way more philosophical an answer than was required!

    What benefits accrue to this time slice rather than longer intervals?

    three benefits (in my mind):
    a) Near-Instant gratification on the part of the editor

    b) Less load & faster processing of smaller amounts of information

    c) Depending on how long those “longer intervals” are – more accurate maps for the next person who comes along – If you’re trying to send out invites for your wedding, it doesn’t help to wait weeks for an update to the location especially if you can fix it, and then send it out.

    For example, are you using rapid updating as a method of attempting to discover bogus updates?

    Nope – interesting concept tho – will have to think about how that would / could work

    The reason I ask is that an area being actively edited at OSM might change every minute and the result would be that your tile would update each change and be constantly changing to the user. What happens when the changes are erroneous and not corrected on the OSM end of the process?

    Think of it like Wikipedia
    a) Assume good faith

    b) Vandalism can happen

    c) The community will correct and remove the vandal from the community

    Do you catch some of the more egregious errors in the updates to your routing engine?

    Yes. There’s stuff we filter out because otherwise everything would blow up, and there’s a pre-processor that we are basically expanding over time, to catch things
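Ant doesn’t say what his pre-processor actually checks, but the idea of filtering out contributions that would otherwise make a routing engine “blow up” can be sketched. The rules below are hypothetical examples of mine, not MapQuest’s: drop ways that aren’t tagged as roads, have degenerate geometry, or carry an unparseable maxspeed.

```python
# Hypothetical sanity filter applied before feeding ways to a routing
# engine. The specific rules are illustrative, not MapQuest's.
def is_routable(way):
    tags = way.get("tags", {})
    if "highway" not in tags:
        return False                      # not a road at all
    if len(way.get("nodes", [])) < 2:
        return False                      # degenerate geometry
    maxspeed = tags.get("maxspeed")
    if maxspeed is not None:
        try:
            float(maxspeed.split()[0])    # accepts "30" or "30 mph"
        except (ValueError, IndexError):
            return False                  # bogus speed tag
    return True

ways = [
    {"id": 1, "nodes": [1, 2], "tags": {"highway": "residential"}},
    {"id": 2, "nodes": [3],    "tags": {"highway": "primary"}},
    {"id": 3, "nodes": [4, 5], "tags": {"highway": "primary", "maxspeed": "fast"}},
    {"id": 4, "nodes": [6, 7], "tags": {"building": "yes"}},
]
accepted = [w["id"] for w in ways if is_routable(w)]
print(accepted)  # [1]
```

A filter like this rejects data silently rather than fixing it, which is consistent with Ant’s point that the community, not the pipeline, is expected to correct vandalism at the source.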

  5. Brian May

    Mike – I just discovered your blog – extremely interesting reading. You are covering lots of ground and providing deep insight into basemap and navigation mapping.

    As for MapQuest, I find it very refreshing that they are opening up to OSM, dedicating resources and moving quickly in embracing it. I just reintroduced myself to OSM and now I “get it”. I believe it will be a dominant player in the future of mapping. For example, check out Orlando, Florida to see what some busy OSMers can accomplish.

    As Ant says, the minutely updates help hook people, because you can see your edits incorporated into the main map in near-realtime.

    And the thing I wanted to point out is, if you go to MapQuest’s UK OpenStreetMap site and search for a US address, it will find it and show OSM data. And the recent edits I made to OSM are showing up. Awesome! Keep up the good work, MapQuest. Also, I really dig the super-high-res LIDAR-based DEM hillshading they have in Florida (but it’s not in the OSM tiles yet).


    Thanks for your comment. A high level of enthusiasm is something we seldom see in the mapping world, keep it up.