Exploring Local
Mike Dobson of TeleMapics on Local Search and All Things Geospatial

Rumors Run Rampant – MapQuest on the Outside – another map engine on the inside?

September 10th, 2014 by admin

I heard what I will call a “rumor” this morning, but suspect that it was a statement of fact. If I am wrong, I apologize in advance. As you know, there are shades of rumors, so I will add some color where I have additional information.

It appears that AOL has quietly begun the shutdown of parts of the MapQuest operation. A few weeks ago the announcement was made at MapQuest’s back-end operations hub in Lancaster, PA. Some members of the engineering staff were let go then, more will be released in November, and the Lancaster operation is scheduled to close in March of 2015. Perhaps most important here is the fact that Lancaster is the back-end mapping operation for MapQuest. One would think that if AOL intended to keep the mapping engine in-house, it would have moved the engineering team to Denver (where the rest of the MapQuest group operates). Instead, it is my opinion that AOL intends to contract with another service to provide the mapping engine for MapQuest. Well, whatever the case, this is a rumor, but, I think, specific timing notwithstanding, this strategy has been in the oven at AOL for quite some time.

As some of you may know, MapQuest was once the leading provider of online maps and routes. Its historical trail involved a number of companies headed by Barry Glick and culminated in the acquisition of the property that eventually became MapQuest by Donnelley Cartographic Services, an organization that made maps for print publishers but was wise enough to see the future of online mapping. Though the future was unclear, Barry and his successors navigated the road ahead and took MapQuest to a successful IPO, followed by the acquisition of the company by AOL.

MapQuest was the King of the Road in online mapping until it began to encounter a headwind from Google Maps. Of course, there were other, earlier competitors than Google, but one by one these pretenders became irrelevant, fell into decline and ceased operations. The few that survived continue in business, but remain minor footnotes in the market.

It is somewhat interesting to note that the demographic attracted to MapQuest on the Internet was an older-than-average, mature audience. It is thought by many that the original audience for MapQuest continues with the service even today, with some of those loyal customers still printing out routing instructions rather than using route guidance on smartphones or other personal navigation devices.

The problem that nagged MapQuest’s planned IPO was a lack of revenue. Suffering from a real-world case of the “Innovator’s Dilemma,” MapQuest was a product that no one had requested. When launched it was a “give-away,” a status it could not escape once the genie was out of the bottle. Indeed, the numerous map-making companies littering the roadside today are a result of “free” Internet maps. While MapQuest was able to overcome the perception of the “operating at a loss” problem by up-selling its popularity when the stock market for Internet properties was “incandescent,” the problem did not disappear. AOL knowingly acquired the lack-of-revenue issue, and its executives were sure they could monetize the product line. Unfortunately, the strategies they implemented to cure the revenue problem failed and, when altered, failed again.

Those of you who read this blog may have noted that on several occasions I have indicated that Google is in the advertising business and that mapping, a sideline, was integrated into its strategy as a method of selling more advertising, especially location-based advertising. While MapQuest tried the advertising gambit, even Google advertising, it could not generate enough revenue to cover operating expenses. It is likely that even Google has made its investment in mapping with little hope of recovering its map compilation and serving expenses. However, its map base provides advantages in advertising and beyond that prompt the company to continue its massive investment, at least for a while.

The important note here is that online maps have produced a state of disequilibrium in their own market – one in which revenue results not from the sale or use of maps, but from the advantages that maps can bring to other product lines over long periods of time. I think you all know the budget battles that must ensue at Google about who is paying for what and why this new mapping initiative deserves to be funded. If Google has not had these arguments yet, I guarantee you that it will in the future.

What we are left with is an unbalanced market where Google, HERE (Nokia) and Apple will remain the major players. I suspect that AOL will replace the MapQuest engine with either HERE maps or Google Maps, but in any event MapQuest, an American original, will soon be no more than a shell of its past glories.

My hat is off to the stalwarts who created, popularized and polished MapQuest. You did your profession and your company a great service. AOL? Well, they never seemed to understand mapping or the use of spatial data. Perhaps more importantly, it appears that the executives did not understand how to manage MapQuest to success. I understand that both MapQuest and AOL Search report to the AOL Chief Analytics Officer. I am sure that monetizing spatial data is not one of his competencies. After all, map data and map engines are generic – at least to those who know nothing about either!

It is likely that switching the MapQuest engine is “merely” a matter of expense for AOL. Too bad they did not see the promise of MapQuest, but “buyer’s remorse” is a terrible thing and usually leads, as it did in this case, to limited investment in new product development. Speaking of “buyer’s remorse,” it is an issue that may be endemic in the mapping industry, as HERE continues to muddle along trying to make a success of the former Navteq, and TomTom is rumored to be in a dither about mapping expenses from the former Tele Atlas.

One more thing – there are some highly talented software engineers from MapQuest now available. I found a few of them on LinkedIn – take a look if interested.

I hope your Labor Day Holiday was relaxing and rewarding,

Dr. Mike


Posted in Apple, Geospatial, Google, Google maps, HERE Maps, MapQuest, Mapping, Mike Dobson, Navteq, Nokia, Personal Navigation, Tele Atlas, TomTom, local search advertising, map compilation | 2 Comments »

Hear HERE – “Halbherr is Not Here Anymore.”

August 21st, 2014 by admin

You know, sometimes you just get a feeling about things and I have that feeling about HERE, so I wrote this blog. I do not have powerful data to support my opinions, but my time in mapping tells me that my feelings on the issues are right, or as close to right as one can be without having inside information – which I do not have. This is not a “How to win friends and attract clients” topic, but let’s have a go anyway.

Yesterday I read an article detailing the departure of Michael Halbherr, who was the head of HERE, Nokia’s mapping division.

I chortled to myself when the reporter of the article noted that HERE, according to nameless analysts, “…could fetch around $6 billion if Nokia decided to sell the unit.” What the reporter did not mention is that Nokia paid $8.1 billion for NAVTEQ (now HERE) in 2008. Presuming a conservative, cumulative rate of inflation between then and now, the purchase price in 2014 dollars would represent approximately $9 billion. Through superior management, groundbreaking strategic planning, and the reduction of budgets for compiling map databases, Nokia has managed to lop off $3 billion in value in six years. In addition, some rights related to the use of HERE’s map data were provided to Microsoft as part of the interestingly engineered purchase of Nokia’s handset division. Whether a potential acquirer of HERE would pay $6 billion might depend on confidential aspects of the grant of rights that Microsoft negotiated with Nokia regarding its future use of HERE data. Guess this means that the loss might be even greater than $3 billion after the bids fly.
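A back-of-the-envelope version of that arithmetic, assuming roughly 10% cumulative U.S. inflation between 2008 and 2014 (my figure; pick your own index), runs:

\[
\$8.1\text{B} \times 1.10 \approx \$8.9\text{B}, \qquad \$8.9\text{B} - \$6\text{B} \approx \$3\text{B of value gone}.
\]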

According to the Times’ article, Halbherr, who reportedly had clashed with Nokia’s new Chief Executive Rajeev Suri, was leaving the company “…to pursue his own ‘entrepreneurial interests outside the company’.” The departure should generate a doozy of a non-disclosure, non-compete agreement. Imagine that, leaving a unit that could be the target of an acquisition to pursue your own entrepreneurial interests. I think I would have waited to see what happened – unless, of course, nothing was going to happen. On the other hand, I might leave if I were interested in joining a group that wanted to buy HERE – especially if the potential acquiring entity was a financial buyer (more later).

I am sure that Halbherr was frustrated with Nokia’s management. I am equally certain that Nokia was frustrated with Halbherr’s lack of strategic focus.

Let’s look at the situation from a few perspectives.

2007 – 2012

While thinking about Halbherr’s departure I noted that I had written some sage words about the deal at the time that it happened. In a 2007 report I opined that the acquisition would create new competitors in the mapping industry. The reasons for my conclusion were:

    1. The deal (along with the acquisition of Tele Atlas by TomTom) would create uncertainty surrounding the supply of data in various industry segments (Automotive, Mobile and Online).
    2. PNDs (personal navigation devices such as Garmin or TomTom units) exploded into the market in 2005-2007, but it was likely that the volume sold would rapidly decline in the face of the migration of navigation and mapping to the smartphone.
    3. The deal would generate pricing concerns in the industry due to the consolidation of map database suppliers.
    4. Other strategic considerations surrounding the acquisition might reduce NAVTEQ’s effectiveness in the marketplace (competition, brand management, integration with Nokia, etc.).
    5. A belief that new technologies and new approaches to map compilation might lower the cost of map data collection.

I note, in retrospect, that all of these events transpired and none of the outcomes were beneficial to Nokia.

My understanding is that life in NAVTEQ-land came to a standstill after the acquisition. Nokia botched the integration and then fired many of Navteq’s managers who understood both mapping and the target industries that could have been further exploited by Nokia. Examples of marketing errors abound, but let’s not forget one of my personal favorites – OVI Maps.

Several companies integrated new technologies to reduce the cost of data collection. Google developed its mobile location pod that generated Street View and numerous other pieces of data that could be used to reduce the cost of map compilation, while expanding the currentness and breadth of the data that was needed to create a modern mapping system. Crowdsourcing changed the face of the industry and its economics, but Navteq was slow to embrace it, and even slower to find a way to adequately compete with Google’s fleet of low-cost map data collection vehicles.

PNDs took a calamitous nose dive as expected, but Nokia was not quick enough in integrating mapping applications into its phone-based ecosystem.

2012 – 2014

Given the declining performance of HERE in the online market and its seeming inability to keep competitors out of its niche in the dashboards and data buses of motor vehicles, it is my opinion that HERE’s main weakness is (and has for some time been) underfunding the expansion and continued development of its map/navigation database.

My belief is that HERE has grown into a marketing driven data company that wears blinders. The industry reports are that HERE does not listen to its customers. Furthermore, the company has been slow to produce the data and innovative services that its customers need.

Lastly, HERE has been seen by Nokia’s management as an embarrassment to, as well as a leaderless entity within, the Nokia family of companies.

Future Issues

I doubt that Nokia can afford to compete in the mapping wars with Google and Apple.

The last potentially meaningful data we have on Navteq’s database development and delivery was that it spent $273 million in the first nine months of 2007 in the pursuit of a better database. At that time the amount that NAVTEQ spent during a partial year far outstripped the combined expenses of all of its competitors for several years. I doubt that HERE is now spending at a comparable rate to maintain and expand its database.

While the amounts that Google is spending to expand its mapping/navigation/location databases may not be sustainable in the long run, the large investments in mapping that they continue to be willing to make have put them far ahead of anyone in the mapping arms race. Perhaps the more important issue is that few potential acquirers will understand that the amount of money HERE is spending on database development is grossly inadequate to grow the business in the mapping/navigation/location marketplace or to compete with Google or Apple in the future.

Buyer, buyer, who has the buyer?

First, I am not sure that there are many strategic buyers who would be interested in HERE.

I have heard from my contacts that Samsung was once interested, but its recent financial difficulties seem to preclude taking such a step. Microsoft could be a potential buyer, but seems to be looking inward and may have already made a bad investment in Nokia that could sour future interest in another Nokia-related acquisition. Certainly it is possible that some strategic investor may try to low-ball a bid for an entity that Nokia no longer wants, but I am not sure any company that might make such a bid will have either the wherewithal to manage the company to success or the bankroll to make it competitive.

Intel tried to get into mapping/location with its acquisition of Telmap, but has already closed that company and taken another approach to the location market. Other potential buyers may be out there, perhaps in the form of companies in the mobile phone ecosystem.

The alternative scenario is that a financial buyer will see HERE as an opportunity to make beaucoup bucks (esoteric financial term). What a horrible mistake that would be for them and HERE!

Strategic buyers often find it necessary to reduce the expenses of an acquired company to make it look like the acquisition is working to generate new profits, but in the case of the underfunded HERE, this could be a disaster. Financial buyers usually take the same approach, but often complicate the situation by adding a layer of management representing the values of the financial investors. Often, the new CEO will be a person who met a principal of the new owners while playing golf. The financial manager will note that this potential CEO once used a folded map and says that they really like Google Maps and Street View. He or she will have been the head of marketing for a consumer products company.

Oh, my head already hurts for HERE. The common approach of financial buyers is that they will flip rather than fix companies that underperform. It appears that turning a company around degrades the time value of money too much. My guess is that it would take at least three years to transform HERE into the formidable machine that was once NAVTEQ.

The sales pitch during the acquisition will be, “We don’t know anything about your business (true), and we will not try to manage it for you (false).” They may not know about HERE’s business, but their bid will be based on how well they think the HERE business could/should perform and how far they think they can beat the fat out of its management based on expense considerations, not strategic considerations. And when the business does not perform well after these reductions, they will cut the budget again – and, at that point, it will become abundantly clear that the acquirer knew little about the business of HERE.

Of course, in this case such a lack of experience is to be expected. Who does know much about running a map and navigation business nowadays? Neither Google nor Apple is an independent mapping company. They are companies that forward-integrated into mapping to expand their core businesses. Perhaps this is the road that needs to be taken with HERE, although this is the failed road that Nokia took in 2007.

Conclusion

After I finished reading the Halbherr resignation article I laughed once again, but this time at the irony of the whole thing. After all, the logical endpoint of Nokia crashing HERE is a mapping duopoly of Google and Apple. Imagine that – in 2007 the duopoly fears focused on NAVTEQ and Tele Atlas going to Nokia and TomTom, companies that would have an insurmountable lead in the world of mapping. So much for insurmountable leads. TomTom…Tele Atlas…. Don’t even ask!

By the way, I have been working on a piece about symbolizing maps for those lacking a background in the language of maps. Every time I pick it up I realize that it would take a book to do the topic justice, but I do not want to write a book. So I start rewriting. Someday I hope to finish the article, but it’s a pain trying to find the right approach and tell the story concisely. Sorry for the delay.

One more thing – it’s late (about 2:15 AM PDT) and I apologize for any typos in this blog. My eyes are just too tired to “see” them.

Until next time.

Dr. Mike


Posted in Apple, Garmin, Google, HERE Maps, Mapping, Microsoft, Mike Dobson, Navteq, Nokia, crowdsourced map data, map compilation | Comments Off

eParking – Some Things You Need to Know

May 21st, 2014 by admin

Slightly over a year ago I had two brief, unplanned, interesting exposures to the world of eParking services. First, a former colleague at go2 Systems asked me to meet with the CEO and co-founder of ParkMe. As is usual for these types of invitations, I spent serious research time on the company, its personnel and its business model. Subsequently I met with members of the management team of ParkMe to explore opportunities, but did no work for them. Several months later I was asked to conduct a brief study of another eParking business in which a former colleague from Rand McNally had invested. The target company of that examination was ParkWhiz.

To be honest, before I visited these companies I was not particularly interested in the “parking” space. Due to these two chance encounters I came to realize that parking services represent an interesting market whose optimization may be a key to reducing traffic congestion, while providing a much-needed improvement in customer satisfaction in directed, automobile-based travel situations.

It is my opinion that the unexpected encounters with ParkMe and ParkWhiz exposed me to two of the leading competitors in this market, whose strategies, though significantly different, typify the main division in the eParking market. Both companies approach eParking as a national market, although in ParkMe’s case, they collect international data (and claim to include Antarctica). Strong competitors in the eParking space include SpotHero, Parking Panda, Gotta Park, ParkHub, Click and Park, as well as numerous others. My interest here, however, is not to discuss the merits of any specific company in this space, but merely to recommend that you spend some time looking at this market, as I think that parking services will in the near future become a must-have feature for companies providing navigation and routing services (e.g. Google, Apple, Nokia, TomTom).

Some Details on eParking

1. The size of the parking industry has been estimated at $30 billion. As might be expected, there is an autocorrelation with population. According to research by the National Parking Association,* five states (California, New York, Texas, Florida and Illinois) generate slightly more than half of the total parking revenues from lots/garages. In addition, those states include at least half of the parking facilities in the nation. The leading segments providing parking are commercial/owner-operator facilities, colleges and universities, hospitals, municipalities and retail/shopping centers. For practical purposes the market can be further broken down into on-street and off-street (lots/structures) parking. The market for off-street parking is approximately twice as large as the on-street market.

2. Parking has remained a local product and many owner-operators do not advertise their service. Instead, they rely on local knowledge of their business to attract steady customers (weekly, monthly) and on location and access to nearby facilities to “capitalize” on transient customers (hourly), such as those generated by service- or shopping-oriented businesses.

3. The eParking world appears divided in terms of its approach to monetizing the world of parking. Some companies are attempting to build large inventories of data on parking facilities (even at the street-space level), including attributes such as location, hours of operation, spaces available, cost, etc. (for a flavor of such a record, see the sketch following these notes). These companies appear to be focused on becoming the “premier” supplier of parking data to the navigation industry, believing that the addition of data on parking is a natural extension of navigation and routing systems. (Note that several of the “data” companies have felt the heat from the next category of providers I describe and have affiliated with some of these companies to provide services that expand beyond data.)

Other companies, while amassing large, detailed databases of parking data, view that information as a component of a service business primarily designed to allow users to book (reserve) parking ahead of time, for example while on the road as part of a journey. Companies using this strategy see eParking as both a service business and a data licensing enterprise. Drivers can use this type of service to save time and money and to gain peace of mind when they need to find a parking space near their destination. In addition, the participating parking owner/operators may benefit from this association through improved inventory management and branding.

It is my opinion that this latter class of competitor wants to influence the distribution of parking information on a just-in-time basis. This segment of the eParking space hopes to serve the parking lot owner/operator by managing their parking spaces in a manner that reflects demand propagated by exposure to a broader audience of potential customers than the parking enterprise could generate acting alone. In addition, those playing this intermediary role could provide valuable services to the lot owner beyond the obvious advantages in yield management for selling inventory. In essence, the players in this transactional segment of the eParking market want to become integrators providing value-added services that make customers of drivers needing parking services, owner/operators of parking facilities needing to fill parking spaces, and navigation services providers looking to build offerings integrating new, spatially targeted advertising opportunities.

4. Parking services, while considered a national market, operate mainly as a local business and strong local competitors exist. The same divisions are true in eParking. However, the end-game for most participants in eParking is acquisition by a company that could benefit from owning a parking data provider or parking services provider. It is in this sense that national data and distribution may better position players in the eParking segment for the ultimate end-game.

5. Compiling data on parking spaces, lots, garages and other facilities is a difficult task. All companies in eParking use specific forms of data compilation and many use hybrid methods that combine aspects of the hands-on and hands-off approaches. Which techniques are used usually depends on the nuances of specific markets.

Hands-off techniques view the compilation task solely as a data-gathering operation. Usually field teams (often stringers) canvass an area, gather visible information from inspection (address and other contact information, signage, costs, capacity, etc.) and take photos of the location for later data mining.

The hands-on approach often encompasses the above actions, as well as site visits to determine attributes not immediately visible from the street. In addition, these visits usually include a dialog with the owner operator about the specifics of their facility, as well as discussion concerning the integration methods for representing the parking inventory in an online reservation system.

6. Some players in the eParking market actively partner with automated, real-time municipal parking reporting systems, since doing so allows the end-user of the service to determine how many parking spaces are available at a facility in near-real-time. Coupling this knowledge with pricing makes an effective tool for consumers looking for economical parking availability throughout the day, as parking spaces are often day-parted, increasing in cost at the most congested hours (again, see the sketch below).

7. In general, companies in the eParking space do not provide their own mapping service. Google appears to be the preferred provider (and, perhaps, the preferred acquirer) of many, but not all, of the companies mentioned in this article.
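To make points 3 and 6 concrete, here is a minimal sketch of the kind of facility record a data-inventory player might compile, together with a toy selector that couples day-parted rates to a real-time availability feed. Every name and number below is my invention for illustration; no company’s actual schema is being described.

```python
from dataclasses import dataclass, field

@dataclass
class ParkingFacility:
    """One off-street facility in a hypothetical eParking inventory."""
    facility_id: str
    name: str
    lat: float
    lon: float
    capacity: int
    reservable: bool = False
    rates: dict = field(default_factory=dict)  # day-parted pricing: hour of day -> $/hour

def cheapest_open_facility(facilities, open_spaces, hour):
    """Pick the lowest-rate facility that the real-time feed says has space right now."""
    candidates = [f for f in facilities if open_spaces.get(f.facility_id, 0) > 0]
    return min(candidates, key=lambda f: f.rates.get(hour, float("inf")), default=None)

garage = ParkingFacility("LOT-0042", "Main St. Garage", 41.8781, -87.6298,
                         capacity=350, reservable=True,
                         rates={8: 4.00, 12: 6.50, 17: 9.00})
deck = ParkingFacility("LOT-0099", "5th Ave Deck", 41.8790, -87.6310,
                       capacity=120, rates={8: 3.50, 12: 5.00, 17: 7.50})

# Availability as reported by a (hypothetical) municipal real-time feed.
open_spaces = {"LOT-0042": 12, "LOT-0099": 45}
print(cheapest_open_facility([garage, deck], open_spaces, hour=17).name)  # 5th Ave Deck
```

A real service would layer reservations, payments and on-street spaces on top of this, but the shape of the problem is about that simple.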

Navigation – Why Parking Information is an Outstanding Need

1. Donald Shoup, a noted researcher in the field of traffic studies, indicated that, on the average, people cruising for a parking space account for a thirty-percent share of local traffic.** If Shoup’s finding is true, the amounts of wasted time, gasoline, pollution and frustration are reasons enough to want to solve the “parking” problem. Of course, dire factoids rarely convince anyone of anything, so let’s think some more about the eParking opportunity.

2. If you were to ask my opinion on an “ideal” navigation system, it would be one that solicited my intended destination, and, then, suggested nearby parking for my consideration, before routing me to my parking preference closest to my intended destination. Yeah, that’s right – how many places do you navigate to and not park?

I usually arrive at a destination and then spend too much time figuring out where I can leave my car without getting a ticket – or trying to create a path back to the address (location) of my destination from where I parked my car.
Like everybody else, I use routing to find locations I have not visited before. If I am traveling in a suburban area and do not know where the address is located, it’s almost certain that I do not know where nearby parking is located, but I always assume that street parking should be available. Usually I arrive, drive around for a while trying to find a parking place, and relent and choose a parking lot or garage when the no-cost option fails. For locations in cities (and when on business) I don’t even consider street parking; I visually hunt for the parking garage closest to my destination and take a space, if the cost is not stratospheric. If it is, and if I have time, I may continue my hunt for lower-priced parking.

I do not own a PND or know of a routing application that offers me truly integrated parking services – that is, the option to navigate to the closest parking to the address I have entered as a destination. I am not saying that systems cannot route me to parking lots whose address information I could find by searching the map near a destination, or by searching for a parking lot near a destination using one of the companies mentioned above. However, I want an eParking reservation service that provides information on the parking available (rates, etc.) near my destination, integrated into a navigation engine, so that I can enter my destination, choose my parking option (reserve/pay), and be routed to the parking facility (the flow is sketched in code at the end of this list). I, also, want a walking map from the parking garage to my destination, and I want that route to describe restaurants, points of interest and other information (even ads) that might be of use to me as I pass by on my way to my end-destination.

I realize that I could go to one of the parking services mentioned here, enter an address and get a map showing the parking details, but I want a route to the parking location, integrated with traffic and other query capabilities. If Google, Apple, or HERE is planning to provide this type of service, then it is likely that they will acquire one or more of the companies mentioned above to help them with this challenge.

3. I presume that I am not the only poor, lost soul looking for local parking and hoping that some major mapping/navigation/routing player integrates eParking services into their offerings.

4. Based on my personal experience Evanston, Illinois takes the cake when it comes to “Cruising for Parking”. Hmm – sounds like the title of a zippy new reality series. You read it here first!
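And, as promised in point 2, here is the integrated flow reduced to code. Every function below is an invented stub standing in for a real service (an eParking inventory, a reservation/payment API, a routing engine with traffic and POI data); the point is the shape of the pipeline, not any actual implementation:

```python
# All functions below are invented stand-ins for services that would need to
# exist behind a real product: an eParking inventory, a reservation/payment
# API, and a routing engine with traffic and POI data.

def find_parking_near(destination, max_walk_m):
    """Stub: query an eParking inventory around the destination."""
    return [{"name": "Elm Garage", "rate": 7.50, "open": True, "walk_m": 250},
            {"name": "Oak Lot", "rate": 5.00, "open": False, "walk_m": 180}]

def reserve(facility):
    """Stub: book and pay for a space ahead of arrival."""
    print(f"Reserved a space at {facility['name']}")

def route(mode, origin, destination):
    """Stub: driving or walking directions (with traffic, POIs, even ads)."""
    return f"{mode} route: {origin} -> {destination}"

def navigate_with_parking(origin, destination):
    """Destination in; parking chosen, reserved, and a two-leg route out."""
    options = [f for f in find_parking_near(destination, max_walk_m=400) if f["open"]]
    choice = min(options, key=lambda f: f["rate"])  # or rank by walking distance
    reserve(choice)
    drive = route("driving", origin, choice["name"])
    walk = route("walking", choice["name"], destination)
    return drive, walk

print(navigate_with_parking("current position", "123 Main St"))
```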

If you have time off this Memorial Day weekend, I hope you enjoy it.

Mike

*Search for “Parking in Perspective: The Size and Scope of Parking in America”
**Shoup, Donald C., 2006, “Cruising for Parking,” Elsevier, Transport Policy 13 (2006), 479-486.


Posted in Data Sources, Geospatial, Geotargeting, Google maps, Local Search, Mike Dobson, Navteq, Nokia, TomTom, eParking, landmarks and navigation, local search advertising, map compilation | 3 Comments »

Google Maps and Search – Just what is that red line showing?

April 24th, 2014 by admin

In an earlier blog in this series I contemplated a future sea change in online mapping that would develop as an adjunct to the popular mapping systems provided by Google, HERE, Bing, Apple and others. These database mapping systems are currently oriented mainly towards providing detailed street-level coverage, since this information meets the fundamental needs of users for geo-search and navigation.

Most mapping products are designed to meet the needs of map providers for generating income. For instance, HERE generates income by providing mapping databases and software that cater to the in-car navigation markets, as well as to ADAS (advanced driver assistance systems) and other systems designed to make car travel safer and more efficient. Google, on the other hand, generates significant income by integrating its mapping activities into various aspects of its complex system of advertising. In addition, Google is obviously interested in other markets for spatial data, such as those focused on GIS and intelligent/autonomous cars.

All of the companies mentioned above, also, have users/customers interested in viewing maps that tell stories by showing aspects of geography or geographical aspects of a company’s services. For example, the American Airlines map shown in the original article in this series was an attempt by the airline to show its global reach using online maps as the storytelling device.

Recently, the online mapping providers mentioned above have begun attempting to increase the functionality of their mapping systems by providing data that allows the generation of “quasi-reference maps” whose objectives appear similar in approach to those of the formerly popular printed world atlas products. It is my opinion that attempts to create a dual- or multipurpose production mapping system provisioned with the capability to publish both detailed street maps and world reference maps have been less than impressive. In part, this is due to these providers’ lack of familiarity with the intricacies of supporting the objectives, methods and presentational formats required to publish a wide range of mapping products from an integrated spatial database (e.g. street, reference and thematic maps).

In my opinion Google has made the most progress on dual-use maps and shows a lot of promise for continued innovation. Even so, what they have created for us often does not make sense. Let’s look at one simple, fundamental object – geopolitical borders.

See the figures below for common examples of the complexity of harmonizing and generalizing multipurpose, multiple-source databases. In these examples we can see that there are multiple representations of a feature in a database that has been designed to allow a variety of zoom levels. Each of the images was generated by typing a state or country name into the search bar that is part of the Google Maps interface.

The targets of the searches are returned as a red outline, and this appears to be true whether you are searching for countries, states, counties, cities or other categories of political or, where available, postal geography. These representations of boundaries symbolized in red first appear in a small-scale representation encompassing the geography of the unit that was searched. Initially it appeared to me that each of these border representations disappeared at a preset zoom level that showed more map detail than the initial view, but the levels at which the boundaries symbolized in red disappeared (or at least segments of these borders disappeared) seemed to change both within and between classes of boundaries (e.g., international, state, county, city, etc.). Next, the levels at which the red lines on Google Maps disappear may be influenced by factors such as browser type and screen resolution, although I did not experiment specifically with these elements.
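If I had to guess at the underlying logic, it would be a scale-dependent style rule along the lines of the sketch below. To be clear, this is my speculative reconstruction of the observed behavior: the thresholds and names are invented, and Google publishes nothing of the sort.

```python
# Invented zoom thresholds per boundary class; the observed behavior suggests
# the real values vary within classes and with what else competes for pixels.
MAX_ZOOM_FOR_RED = {"country": 8, "state": 10, "county": 12, "city": 14}

def searched_border_symbol(boundary_class, zoom):
    """How the searched entity's border appears to be symbolized at a zoom level."""
    if zoom <= MAX_ZOOM_FOR_RED.get(boundary_class, 0):
        return "red outline"
    return "default cartography (gray dash, or nothing where a road overprints it)"

print(searched_border_symbol("state", 7))   # red outline
print(searched_border_symbol("state", 12))  # default cartography (...)
```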

Let’s search for “Canada” as an example of using the place name search functionality in Google Maps.

Result of Canada Search on Google Maps

For a larger version of this image, click here

Yes, we are presented with a representation of Canada on which a red line demarcates a boundary.

Let’s search for the United States.

Results for a search for the US when using Google Maps

Hmm, no red border. Maybe this omission is a geolocation feature? If so, Google should note that a recent piece of research suggested that some Americans living in the US thought that Ukraine was located within the US border. Perhaps showing the red border when someone searches for the U.S., similar to what is shown for other countries, might be useful. On the other hand, as we shall see, there is some question as to the nature of the “geography” that is actually being represented by the red outlines returned by Google and symbolized on their maps.

Look at the screenshots below and evaluate if any of these examples would inform someone who knew little geography and wanted to use Google maps as a reference source to help them understand the location of geographical borders.

Let’s replicate someone searching for the entity “California.”

California Search Result

To see a wider view, click here

What’s that dent in the northern border of California at Goose Lake? I didn’t know that Goose Lake was not part of California.

What specific border quality is that red line showing? If you look closely at Goose Lake you can see a gray dash symbolized across the lake that seems to follow the border between the two states as represented on official highway maps.

Let’s zoom in.

search

Hmm, guess I zoomed a little too much since the red line disappeared. However, what is shown appears to be one representation of the CA border that is similar to the one shown on the official state highway map. So what was that red line in the previous image?

Let’s zoom back out to bring back the red line.

Search and zoom out

It would appear that the red line, in this case, may demarcate the “land” boundary of California.

Let’s search for “Oregon” for support.

Oregon Search

Here Goose Lake is shown as excluded from Oregon by the red line.

Okay, so it looks like the red line is not a standard political border, but the land-water boundary that follows the land side of the border of the entity named. Yep, I searched for “California” again and Lake Tahoe was excluded by the red border. When I searched for “Nevada,” Lake Tahoe was not shown as part of NV.

Gee, that’s great, but where does Google tell the casual user what quality the red line represents? On a page I found about Google Maps legends, the only reference to “red boundaries” was a note that “disputed” international borders are shown in red. However, I am not sure that the page I examined was authoritative, although it appeared on one of Google’s URLs. I doubt that the casual user would have any idea where to find this sort of information. Indeed, I suspect that the casual user may, for example, search for California and conclude that Goose Lake, Lake Tahoe and various other water bodies are simply not part of California.

Oh, one other thing. It appears that the red boundaries are shown only until you zoom down to the level at which another feature takes precedence and replaces the red line. In essence, the occurrence of the red line is variable and tied to the local geography represented in the view port. For example, at the 20-mile zoom level the Oregon boundary is still red.

Oregon boundary zoom

For a larger version, click here

But at the 10-mile zoom level the red is replaced by a gray dash that disappears when a more dominant feature in Google’s display hierarchy, such as a state or county road, is coincident with the border.

10 mile

For a larger version of this image, click here

This would seem to indicate that Google has implemented scale-variable presentation, a neat trick, but one that may make it difficult for the user searching for an entity and examining it at variable scales. However, when I searched for the borders of a few more states, I ran into additional situations that further clouded the identity of Google’s red line.

Here is a good example. Let’s search for “Maryland.”

Maryland

For a larger version, click here

Oh, my – look at those straight segments of red line in Chesapeake Bay. I am not sure what representational logic applies here, since the red line no longer appears to be demarcating the same “landed-ness” quality that it appeared to demarcate in the maps of California, Oregon or Nevada.

Let’s take a closer look.

Zoom

Yowee! How does this red line relate to the others we have viewed?

If I look at Google Earth’s coverage of Maryland, at a similar scale, I see the same border in white. Hmmm, this is getting more confusing.

Maryland in Google Earth

Well, at this juncture I was thoroughly confused about the “identity” of Google’s red line. I decided that I needed more data and returned to the start – Canada, one of my favorite places. So, I searched Google Maps for Canada, zoomed in, panned around, and found more interesting, but not reference-quality, data. Take a look at this –


For a larger version, click here

When I zoomed further, most of the red border disappeared, but not all of it – a new red-line behavior that I had not seen in our previous examples.

Zoom

For a larger image, click here

How about that! What condition is the red line now representing? Hang on, it’s about to get stranger.

Even Stranger

To see a broader coverage extent, click here

I am not sure I understand anything about the red line now. In addition, I note that it does not appear to be coincident with some of the islands and other coastlines that it is supposed to follow. Who is editing this crap?

My head hurts

For a wider extent, click here

Look, more mismatches. Why, the red line looks like it might be…oh, DCW (Digital Chart of the World) or something similar. Anyway, maybe it’s just old, generalized data created from a spatial database designed for other circumstances. I guess this means that Google’s multiple representations of the same feature set have not been harmonized.

For a larger version, click here

Maybe the data on which the red line is based is so old that climate change has altered sea level or the extent of the icepack?
See this image

For a wider extent, click here

Well, regardless of the physical reality, it’s pretty sloppy representational work for an aspiring reference publisher.

I am sure that there is some reasonable explanation of the rationale for showing whatever it is that Google is showing with the red line. What the examples shown here point out is that it is difficult to compile a cartographic database and provision it with the diverse types of content needed to provide the range of data types and data elements required for presentation in a system that maps both detailed street data and more generalized regional or world reference information. It is, perhaps, even more difficult to harmonize (in terms of currentness and theme) and generalize these unique types of content when they are to be used in systems that provide output for numerous and wide-ranging map scales.

It is likely that all companies attempting this evolution from street map to reference map will run into numerous and substantive data quality problems. Two critical data quality issues that are clearly a problem for Google, as evidenced by the above images, are logical consistency (issues related to the correctness of the characteristics of a data set) and thematic accuracy (particularly in respect to non-quantitative attribute correctness and classification correctness). Unfortunately, this is just the tip of the data quality iceberg that Google and others are facing. It is the user, and geographic literacy, that suffer from such attempts at experimentation.
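Detecting this sort of disagreement is not mysterious, by the way. A toy consistency check between two representations of the same border might look like the following (the coordinates, tolerance and naive vertex-to-vertex matching are all my invented simplifications; real conflation tooling is far more sophisticated):

```python
# Two representations of the same stretch of border, as (lat, lon) vertex lists.
# The "red line" version dips where the detailed version follows the shoreline.
detailed = [(41.99, -120.00), (41.99, -120.10), (41.99, -120.20)]
red_line = [(41.99, -120.00), (41.95, -120.10), (41.99, -120.20)]

def max_drift(rep_a, rep_b):
    """Crude harmonization check: worst gap between corresponding vertices (degrees)."""
    return max(((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
               for (ax, ay), (bx, by) in zip(rep_a, rep_b))

TOLERANCE_DEG = 0.01  # roughly 1 km of latitude; an invented editorial threshold
print(max_drift(detailed, red_line) > TOLERANCE_DEG)  # True -> flag for an editor
```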

From the perspective of the user, accepting the messages from these maps will depend on whether or not the spatial data is authoritative, coherently presented and understandable. We asked a simple and basic geography question and Google failed. For a company that has as a goal creating a perfect map of the world – well, they seem to have a very long road ahead.

As a final note, the question of revenue generation is always an issue with the production of maps. The red borders that Google shows may be as confusing as they appear to be because searches for the borders of countries and states may not serve the company’s financial interests in geolocation and navigation, which bring in lots of advertising revenue. Local borders are clearly more important. Why, just look at this coherent border for Charleston, South Carolina.

Wow

For a larger version, click here

You may have noted that on some of the illustrations linked to above (the “larger” illustrations hidden unless you click the link) the search tab often contains a sub-tab labelled “terrain.” If you click the “terrain” tab in the live version of a Google Maps search, the system will show you a version of the map with terrain shading and the correct geopolitical border for the entity searched. If only the red lines that Google presents when you search for a geographic entity showed the same borders.

Well, it seems that Google needs help. Send your border info to “wherezit@.” It doesn’t matter where you send your data because Google or the NSA will be able to find it. Of course, this brings us back to authoritative data, trusted data and the whole conundrum we discussed years ago. It makes me feel good knowing that my blogs are “timeless”. Hah!

Duane Marble would prefer my blogs to be “Typo-less”, but I would miss his caustic notes containing edits, so no go.

Until next time,

Dr. Mike


Posted in Apple, Authority and mapping, Bing maps, Categorization, Geospatial, Google, Google maps, HERE Maps, MapQuest, Mapping, Microsoft, Mike Dobson, Navteq, Nokia, Technology, map compilation, map updating | Comments Off

Does Anyone Need to Know Anything About Maps Anymore (2)?

February 20th, 2014 by admin

(This is NOT the blog I had planned next for the series, but it is one that may help clarify why this topic is of such significance. If you were not wild about the last blog, you might skip this one.)

In a comment on my last blog regarding cartographic knowledge, Pat McDevitt, VP of Engineering at AOL, formerly with MapQuest, TomTom and Tele Atlas, mentioned his interest in “map-like-graphics”, such as subway maps (see my response to his comment for more detail). In the 1980s, Barbara Bartz Petchenik coined a term for such displays by naming them “map-like-objects”, or MLOs. MLOs sacrifice some aspect of cartographic accuracy to promote easier understanding and use by a selected population. Let’s explore this concept a bit, as a discussion may help to further illustrate the points I was making in my last blog.

The class of MLOs that represents subway maps includes purpose-built graphics designed to help riders of these transportation systems understand how rail lines connect stations in a manner that can be used to plan journeys. Since the rider can only access and exit the trains at specific stops, the actual geometry of the network (in terms of distance and direction) is less important than creating a display that is readable, interpretable and actionable in a manner that allows the user to ride between an origin and an intended destination. The argument here is that while MLOs may sacrifice cartographic accuracy, they are tools that can be more effective than an accurate, detailed map of the same spatial objects. If only the use-case were so simple! Let’s explore by personal example.

I have visited London at least 20 times during the course of my adult life. I usually explore the city riding the London Underground to travel to a location near my planned destination. I admit, with some shame, that of all the urban geographies I have explored I know London’s geography the least well. I find this curious since this location is one of my favorite travel destinations. It is, also, a destination I have visited more frequently than other urban areas that I seem to be able to navigate with little problem.

During my visits to London I was bothered that the objective reality I gained while walking its streets seemed to conflict with where I expected the city’s spatial features to be located. While I was certain that some time/space perturbation was afoot, I was not sure if popping out of the Underground’s “wormholes/tube stations” so distorted my mental map of London that it could not be remediated.

More recently I started exploring the notion that my ill-conceived geography of London actually was a result of traveling using the Underground. I realized, after some consideration of the issue, that my “relative mental spatial reference” for the locations of features of interest in London was likely based on where the nearest tube station was positioned. What is problematic here is that my sense of the geography of the tube stations was informed by the Tube map. Was it really possible that I had used my knowledge of where stations were shown on the ubiquitous Tube map to inform my sense, during my above-ground wanderings, of my probable location? Sounds like science fiction, but could it be true?

To that point, my irrational view of London’s geography might be because the Tube map includes a variety of planned misrepresentations, which you can read about in the article What does London’s Tube Map Really Look Like? Of additional relevance is a study from 2011 by Zhan Guo called Mind the Map (a parody of Mind the Gap – signage familiar to all who have ridden the Tube). Guo concluded that 30 percent of passengers take longer routes because the Tube map misrepresents distances between stations. (You can read a concise version of Guo’s report in the Daily Mail.)

Based on this brief diversion we might conclude that while MLOs can be useful, they may be extremely misleading. Many would say that the problems generated by MLOs result from the users of these maps employing them for purposes for which they were not intended. If that is so, maybe these map-like-objects should come with a use warning, like those on the mirrors of some American cars – perhaps something like:

This map probably represents a spatial area that is considerably larger than this paper/display screen. True distances, directions and spatial context are not represented correctly or reliably. Reliance on this map for any use, even riding the Tube, is not recommended and may result in serious injury, lost time, exposure to buskers, or other inconveniences. The publisher and producer of this map and related licensees are not responsible for errors of omission, commission, or other misrepresentations resulting from lack of cartographic knowledge, incompetency, lack of moral fortitude regarding international border disputes, editorial policies, advertorial policies or, more commonly, frequent cost avoidance cutbacks in map compilation efforts.

While such a warning might sound humorous (hopefully), the multiple-use issue is of considerable concern. While those who create MLOs may realize the shortcomings of this type of spatial display, I am not sure that users of these maps share that knowledge. It is likely that a large proportion of the population that uses MLOs will be unaware of the limitations that complicate extending the use environment that the original MLO was designed to allow. In some ways the problem is similar to that experienced by the twenty-six percent of U.S. citizens who, having observed the sky (or not), concluded that the sun revolves around the earth!

The problem of representing spatial reality in maps is extremely difficult. People who use maps do so in one of several manners, but all of these uses involve, to some extent, answering the question “where?” In many cases map use is data driven, prompting people to browse the map in a manner that helps them organize it into familiar/understandable patterns.

To illustrate this case, imagine that you are viewing an election map displaying states that voted Republican (colored red) or Democrat (colored blue). Most people would explore this display by examining their home state, comparing other nearby states and then looking for the states that voted their preference, followed by those that supported the opposite side. The recollection that most people would have of this map is the patterns made by red and blue states and their spatial clustering across the extent of the map. Even the most cursory inspection of a map usually results in the acquisition of a pattern that is matched with other map patterns that users have acquired. The unfortunate complication here is that users do not know when they are observing an MLO that works well only for a selected purpose, or when they are observing a cartographic display that has been tightly controlled to produce a spatially accurate representation of the variable being mapped.

Perhaps more pernicious is the hybrid MLO. The American Airlines map that I showed last time was designed to function as an MLO, but was based on a highly accurate cartographic display. In addition, the map was created by a production system that was designed to produce both reference and detailed street maps, but apparently not to produce advertisements or MLOs. Imagine teasing the cartographic reality out of that map. Someone who had not seen a world map before might assume that the globe really does look like what was shown in that display. Well, so what?

I recently read an interesting article by Henry Petroski titled “Impossible Points, Erroneous Walks” (American Scientist, March-April 2014, Volume 102, Number 2, available only by subscription) that was brought to my attention by Dr. Duane Marble shortly after I published my last blog. Petroski, a noted author (he is a Professor of both Civil Engineering and History at Duke University), was railing about an illustration in the New York Times that incorrectly represented the scallops on a sharpened pencil. His thoughts on the seriousness of this seemingly modest error are equally true of MLOs. He wrote:

Books, newspapers, and magazines are also teachers, as are television and radio and the web, as well as the inescapable advertisements. Whether or not we are consciously aware of it, the whole of our everyday experience is an ongoing teaching event.

This is why it is important that what we and our children are exposed to in the broader culture be presented and represented accurately. The words and images that we encounter in common spaces can be no less influential in shaping our perception of the world than what we learn in a formal classroom setting. If we find ourselves surrounded by incorrect depictions of objects, our sense of how things look and work can become so skewed that we lose some of our sense of reality.

Petroski continues:

This is not to say that there is no room for imagination and creativity in both engineering and artistic work. But even the most abstract of ideas and renderings of them should follow rules of geometry, grammar and aesthetics that make them meaningful to venture capitalists and museum-goers alike. (Petroski, 2014, pp. 1-2, “Impossible Points, Erroneous Walks”)

There we have it. In the context of maps, we might substitute “But even the most abstract of spatial ideas and renderings of them should follow the rules of cartography, map grammar and the design of displays representing spatial distributions…” That, of course, would return us to the title of my last blog, which was “Does Anyone Need to Know Anything About Maps Anymore?” Of course they should! Next time, let’s return to why this lack of cartographic insight will become a greater problem in the future of online mapping.

Thanks for visiting,

Dr. Mike


Posted in Authority and mapping, Geospatial, Mapping, Mike Dobson, map compilation | 1 Comment »

Does Anyone Need to Know Anything About Maps Anymore?

February 10th, 2014 by admin

As many of you may have noticed, Exploring Local has not been updated recently. For the last year I have been engaged as an expert witness in a matter involving the type of subject matter on which I usually comment in Exploring Local. Due to the sensitivity of the proceedings, I decided not to write any new blogs while I was engaged in the proceeding. Recently the matter concluded, and I intend to focus some of my time on issues related to mapping and location-based services that are of interest to me and that I would like to share with you. So, here we go –

A few months ago I saw a blurb on my LinkedIn page about a debate that was going on regarding maps in a forum titled “GIS and Technological Innovation.” You can find the article and some of the comments here, in case you do not belong to LinkedIn.

I cringed at the pejorative title of the argument, which was “Do Programmers Really Make the Best Cartographers?” While this is not quite as ill-phrased as “Do Code Monkeys Really Make Better Maps than Wonky Map Makers?”, somehow the original title did not quite set the right tone. The most problematic issue with the original question, at least for me, was the lack of context. For example, my interest in the comparison was, “When doing what?” In essence, was the original question designed to explore 1) who writes the best code for cartographic applications, or 2) who makes the best maps using available applications? In my opinion, both questions are non-productive.

Let’s substitute these questions instead. First, “Does anyone know how to ‘make’ maps (or mapping software) that effectively communicate the spatial information they were designed to convey?” If someone does know how to do this, the question of interest then becomes, “Do mapping systems permit their users to exercise these capabilities?” A third important question is, “Does anyone compile the spatial data that fuel mapping systems in a manner that accurately reports these data as they exist in the real world?”

Now, for purposes of continuing this argument, let’s make an assumption, though clearly not true, that all spatial databases are of equivalent quality. If we accept this position for purposes of exposition, then the next meaningful issue is, “Does the mapping system function to inform the reader of the spatial information it is designed to map in a manner that retains the fidelity of spatial relationships as they occur in the real world?” This leads us conceptually to a two-sided map-making platform; on one side we have the mapping functionality and on the other we have the actor who uses the functionality to prepare maps.

Analyzing the capabilities provided by software-based mapping programs will lead us to conclude that some level of cartographic practice has been embedded in all software systems designed to produce maps. I think we can agree that the software mapping tools convey someone’s (or some development team’s) understanding, hopefully informed by cartographic knowledge, of the functional requirements of a mapping system. These requirements, for example, might include consideration of the goals that use of the mapping tools should accomplish, how the tools should operate, how the desired capabilities of the tools might be formalized as functional software, and whether or not user input is allowed to modify the functionality in any meaningful way.

We should, also, acknowledge that some of the end-users of these systems may have knowledge of the cartographic process and seek to use these systems to create a map that melds the capabilities of the available software functionality modified by their personal experience with rendering spatial data. In practice, the use-situation is often constrained because many mapping applications, for example Bing Maps, Apple Maps, and Google Maps, are structured to meet a specific publishing goal that influences how the available software interacts with spatial data. While this potential limitation may influence how a person uses an online system to create maps other than those normally provided by the system, it does not teach away from the general tenet that knowledge of cartographic theory and practice should underlie how well maps function in communicating spatial information, regardless of who makes them or who creates the software functionality.

If software developers and modern cartographers have some degree of cartographic knowledge, where do they get it? Although there is a small (and declining) worldwide cadre of academic cartographers who continue to research improvements in the communication of spatial data using maps, there are just not that many people who benefit from or are even aware of these efforts. Conversely, even if the developer of an online mapping system has discovered accepted cartographic theory and practice and used it to shape the functionality of their software, the real test is whether or not its functionality can be harnessed to present data advantageously, that is, in a manner that accurately represents the spatial data. I think that this is the critical question that pervades all modern map use. Restated, we might ask, “Are the capabilities that mapping systems offer us today based on mapping engines whose developers and users (map makers) have been adequately informed on cartographic theory and practice?”

My response to this question is mixed. For example, most online mapping systems appear to have been developed by people who understand the mathematics of map projections, although they appear not to appreciate the use-limitations of projection types. Conversely, most online systems seem to have been developed without a clear understanding of the complexities of data categorization, classification and symbolization.
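To make the projection point concrete: virtually all of these online systems use Web Mercator, which inflates ground lengths by roughly 1/cos(latitude) relative to the equator. Here is a minimal sketch of that arithmetic (mine, not any vendor’s), illustrating the use-limitation the developers seem to ignore:

```python
import math

def mercator_scale(lat_deg: float) -> float:
    """Point scale factor of a Mercator-type projection at a latitude.

    Lengths are exaggerated by sec(latitude) relative to the equator;
    areas by the square of that factor.
    """
    return 1.0 / math.cos(math.radians(lat_deg))

for lat in (0, 30, 45, 60, 70):
    k = mercator_scale(lat)
    print(f"lat {lat:2d} deg: lengths x{k:.2f}, areas x{k * k:.2f}")
# At 60 degrees the projection stretches lengths 2x and areas 4x -- one
# reason a single scale bar on a world map in Web Mercator is nonsense.
```

At 60 degrees of latitude every length on the map is doubled and every area quadrupled, which matters rather a lot if your map invites readers to compare places across latitudes.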

If I could get the online mappers to listen to me I would plead for them to include the famous “That’s Stupid” functionality, which automatically erases your map when you have created an illogical presentation or one that is misleading due to errors in representation, symbolization, generalization, classification, technique, etc. Of course, if such functionality were ever implemented, there might be no online mapping whatsoever.

Laugh if you will, but take a look at this fine example of modern online mapping brought to us by American Airlines as part of a recent promotion urging people to travel on a worldwide basis. The map appears to have been created by Microsoft and it is copyrighted by both Nokia (HERE) and Microsoft (Bing).

American Airlines, Microsoft and Nokia give you the world and more.

Click here for a larger version of this map.

You may have noticed that you have a choice of visiting any of the approximately twenty-seven, apparently non-unique, continents (one representation of Europe seems to have mysteriously disappeared into the seam at the right edge of the map and does not show up on the left continuation). The map is exquisitely crafted using shaded relief, although I suppose this could be a representation of the earth during a previous ice age since there are no countries shown, nor airports with which to fly American Airlines.

I am not certain of the distances involved on the map, as there is no scale. We do know that the equatorial circumference of the earth is, oh – a) 24,901 miles (Google), b) 24,859.82 miles (Geography-About.com), c) 25,000 miles (AstroAnswers), d) 24,902 miles (Lyberty.com), or e) 24,900 miles (dummies.com). Don’t even ask about the polar circumference! Well, some measurement must be appropriate, but which one applies to the map in question? Further, where does it apply and how does it change over space?

Perhaps my interest in scale has been rendered a historical artifact, replaced by the ubiquitous use of “Zoom Level”? I presume you have heard modern “zoom level” conversations, as in, “These two towns are about an inch apart on my screen at zoom level 17. How far apart are they at zoom level 12? I don’t know, I don’t use Bing, I use Apple Maps and my screen has more pixels per inch than yours. Is that important?”
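For what it is worth, the imagined conversation has an answer, because the standard tile pyramid fixes ground resolution as a function of latitude and zoom level. A sketch of the conversion, assuming the usual 256-pixel Web Mercator tiles and a hypothetical 96 dpi screen:

```python
import math

EQUATOR_M = 40_075_016.686  # WGS84 equatorial circumference, in meters

def meters_per_pixel(lat_deg: float, zoom: int, tile_px: int = 256) -> float:
    """Ground resolution of the standard Web Mercator tile pyramid."""
    return EQUATOR_M * math.cos(math.radians(lat_deg)) / (tile_px * 2 ** zoom)

lat, dpi = 40.0, 96  # a mid-latitude town on a hypothetical 96 dpi screen
print(f"one screen inch at zoom 17 covers {meters_per_pixel(lat, 17) * dpi:.0f} m")
print(f"one screen inch at zoom 12 covers {meters_per_pixel(lat, 12) * dpi:.0f} m")
# Each zoom step halves the resolution, so zoom 12 covers 2**5 = 32
# times the ground distance of zoom 17 on the same screen.
```

None of which, of course, appears anywhere on the map itself – which was rather my point.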

Why does this matter?

Without further belaboring the numerous problems with today’s most common mapping systems, it is important to note that online mapping is about to take a significant turn from street maps and simple navigation towards GIS and what might be called spatial inquiry systems. Users will benefit from a move beyond street maps to geographical inference engines that can answer user questions in a highly targeted spatial manner. However, much of the promise of these types of systems is based on understanding spatial data and methods used to represent it. In the next few blogs I will discuss where I think this evolution will take us in the online world of mapping and how we might get there by solving some interesting problems. However, I will likely mix in a few product reviews along the way, as there are a number of companies claiming some remarkable, but unlikely, potentials.

Until next time –

Best,

Dr. Mike


Posted in Apple, Bing maps, Geospatial, Mapping, Microsoft, Nokia, map compilation | 2 Comments »

Waze-Crazy – Would Facebook Drop a Billion on Waze?

May 9th, 2013 by admin

As often happens lately, I had no intention of writing about anything in the news due to lack of interest. However, Marc Prioleau wrote an article this morning on the rumor that Facebook would pay one billion dollars for Waze, and then wrote me to ask my thoughts. I, then, saw an article berating Google for not being the company that was buying Waze. It was at that point that I began thinking that I must have missed some major development at Waze. In turn, this idea prompted me to do some research and write a commentary on the potential acquisition.

In the spirit of openness, I was contacted by someone representing themselves as Facebook’s “location” guy shortly after my blog about the problems associated with the release of Apple Maps in 2012. We never connected. So, I do not have any contacts at Facebook, nor do I have any contacts at Waze with whom I am in communication. Also, in the spirit of openness, I thought about titling this essay “Startup Crowd-Sources Corporate Value.” So, let’s get going.

Waze describes itself as follows:

“After typing in their destination address, users just drive with the app open on their phone to passively contribute traffic and other road data, but they can also take a more active role by sharing road reports on accidents, police traps, or any other hazards along the way, helping to give other users in the area a ‘heads-up’ about what’s to come.”

“In addition to the local communities of drivers using the app, Waze is also home to an active community of online map editors who ensure that the data in their areas is as up-to-date as possible.”

At the end of the above-referenced About Us page on the Waze website is a video. The video ends with a note to this effect – “Keep in mind that Waze is a driver-generated service. Just as with other user-generated projects, it depends on user participation. Your patience and participation are essential.”

I don’t know about you, but if Waze is going to pick up a billion bucks based on my labor, I’d want more than a note indicating that my participation and patience were essential to their success. However, the more interesting question is whether or not Waze is worth $1,000,000,000.00.

To get my arms around valuing Waze, I decided to go through a brief acquisition checklist:

What is it that is worth a billion dollars at Waze?

Brand? No.

Waze is a minor brand that remains generally unknown around the world. I think it might be difficult to put a high valuation on a company whose product is crowd-sourced and whose brand represents the industrious endeavors and lacks of its audience. Note that use of “lacks” here does not indicate that these people are dolts, rather that the user profile is likely not uniform (standardized) or distributed uniformly across geographical space. In turn, this suggests that the product is not uniformly accurate across that same space. As a consequence, the brand’s value may exhibit significant spatial and temporal variation.

Distribution/Users? No.

Wikipedia claims that Waze was downloaded 12 million times worldwide by January 2012 and 20 million times by July 2012. By the end of 2012, according to Waze, that number had increased to 36 million drivers. Today, there are apparently 44 million users. To be honest, I am not sure how to parse the information on downloads. Downloads do not indicate active users. The notion of downloads, also, does not indicate geographic coverage or the ability to harvest GPS traces or active crowdsourced updates in specific geographies.

Next, I am not sure how Waze measures users, nor was I able to find any definitive source for this information. I doubt that it has 44 million active users. An article in the Huffington Post indicates that Berg Insight, a Swedish market research firm, says Waze has from 12 million to 13 million monthly active users. If Berg Insight is correct, then the Waze contributors are likely spread thin on a worldwide basis and likely concentrated in a modest number of large urban areas. In addition, given the reported user numbers and the rapid growth of the company, the period over which its active users have been contributing GPS traces or active updates must be relatively short.

So distribution remains unknown, except, perhaps, to Waze. However, even if they could validate the number of reliable active users, it remains unclear how those users are distributed across the geographical space of Waze’s target markets.

Finally, another problem is the type of driving usually performed by Waze users. Are the majority of the miles traced those showing a repetitive journey to work five days a week? I suspect this is a large portion of the tracks they receive. If this is true, then their distribution is likely quite limited in terms of the uniformity and the extent of geographic coverage.
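One crude way to see the problem is to bin GPS fixes into coarse grid cells and ask how much new territory each fix actually covers. The sketch below is my own toy illustration with hypothetical data, not anything Waze publishes:

```python
import math

def fresh_cells_per_fix(points, cell_deg=0.01):
    """Distinct ~1 km grid cells visited, per GPS fix recorded.

    A commute-dominated trace set revisits the same few cells, so the
    ratio collapses; survey-style roaming keeps it near 1.0.
    """
    cells = {(math.floor(lat / cell_deg), math.floor(lon / cell_deg))
             for lat, lon in points}
    return len(cells) / len(points)

# Hypothetical data: one corridor driven five times vs. one roaming trip.
commute = [(40.0, -75.0 + i * 0.001) for i in range(50)] * 5
roam = [(40.0 + i * 0.010, -75.0 + i * 0.013) for i in range(250)]
print(f"commuter: {fresh_cells_per_fix(commute):.2f} new cells per fix")
print(f"roamer:   {fresh_cells_per_fix(roam):.2f} new cells per fix")
```

Both trace sets produce identical download and “user” counts; only one of them surveys new territory.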

Intellectual Property, Trade Secrets, Know-how? No.

Waze has 100 employees. I am sure that they are bright, energetic and extremely capable. I doubt that what they may know, have codified, filed as patent applications or hold as trade secrets is worth anything near a billion dollars. After all, it is not as though other people are ignorant about how to build a crowdsourced mapping system.

Map Database? No.

Waze claims that in 2012 its users made 500 million map edits and upgraded the map to reflect 1.7 million changes on the ground that took place in the 110 countries with community-edited maps. OK, just what does this stuff really mean?

Updates may merely reflect the poor quality of a map base or even the lack of a map base available to Waze for its customers’ use. The number of countries involved does not necessarily indicate that the company has complete, up-to-date coverage in any of these countries. More problematically, I suspect that Waze has no objective method of assessing the accuracy of its maps compared to other sources. For those of you who need a short primer on Spatial Data Quality, see my blog on the Apple Maps fiasco, as this is the reason they got a failing grade on their product rollout.

Again, the issue here is how many users have been contributing GPS traces and active edits and over what period of time. It appears to me that the time horizon of Waze is too short to have created a map database of considerable value.

Other Assets (intangibles)? No.

Waze has some uniquely capable people and assets, but, for me, they do not tip the scales at a billion dollars.

Is the whole worth more than the sum of the parts? No.

I just can’t get to the billion dollar number no matter how I combine the basic facts. I have read the articles indicating that Facebook needs its own map base so it can customize it for mobile advertising, or that it needs its own map database in order to compete in the mobile location market. I suppose a company can convince itself of anything and Facebook may have crossed the chasm based on these types of assumptions. If so, I think they are wandering in a labyrinth of strategic blunders.

Yes, they could wind up with their own map database, but I suspect that this purchase will, from day one, be a headache in terms of spatial data quality. Facebook will spend more money fixing and tuning the Waze database than if they had licensed a database from Nokia or TomTom or from a collection of companies, as has Apple. In turn, the adoption of their “mapping product” by the market might be significantly delayed.

The more serious issue is that dealing with the quality of the Waze database and integrating the database with other Facebook applications will subtract cycles from their efforts in areas that are core to building a successful Facebook mobile business. In the end, Facebook will come down with a serious case of buyer’s remorse, as they will eventually ask the question, “Why wasn’t anyone else willing to pay a billion dollars for Waze?”

In a final check of the Waze site tonight I noticed that the Waze homepage (http://www.waze.com/) redirects to http://world.waze.com/?redirect=1 , which is a complete and absolute blank. Perhaps the deal is done. Or, it might simply be a map tribute to Lewis Carroll.

Best,

Mike


Posted in Apple, Facebook, Google maps, Local Search, Mapping, Nokia, TomTom, User Generated Content, Volunteered Geographic Information, Waze, crowdsourced map data, map compilation, map updating | 5 Comments »

Unintended Consequences – The Roles of Google and Crowdsourcing in GIS

March 18th, 2013 by admin

The following blog is a concise, non-illustrated version of a keynote address I gave at the North Carolina GIS conference last month in Raleigh, NC.

There is little doubt that Google has created an incredibly successful mapping product, but it is at this point that the law of unintended consequences may take hold, diminishing not only the success of Google Maps but also hindering mapping and GIS in the wider world.

Let’s start by looking at what I mean by “unintended consequences.” In simple terms, an unintended consequence is not a purposive action; it is an outcome. Outcomes can be positive, such as a windfall. Outcomes can be negative, such as a detriment. Or, results can be perverse, in which case the outcome is contrary to what was expected. My focus in this blog is on the negative outcomes, although some may typify them as a case of the glass half-full.

The romantic notion that cartographers wandered the world with charts and map tables so they could compile map data as they explored is the stuff of history. For countless decades map publishers have created map manuscripts by compiling data collected from sources that were considered authoritative, and it is this model that Google has adopted. From a practical perspective, it is impossible for any single company to map the entire world at a street and road level without the help of contributors from the public sector, private sector and ordinary citizen scientists interested in maps, geography and transportation.

It is my belief that Google, due to the success of its search engine and the pursuit of its corporate mission “…to organize the world’s information and make it universally accessible and useful”, has been unusually successful in convincing authoritative sources around the world to allow Google to use their data in its mapping products. In some cases this has involved licensing or use agreements, and Google has advantaged itself by integrating data from sources that it considers the “best of breed” to enhance its products.

Most of these “trusted” sources are “official sources”, such as the USGS, the Bureau of the Census and other governmental agencies at all levels of administration from around the world. In areas where Google has been unable to reach agreement to use specific data, or in those locations where “trusted” data does not exist, it has relied on its own industrious endeavors to compile these data, although it has been helped tremendously by crowdsourcing.

It is clear to me that Google turned to licensing and crowdsourcing to remedy the unpalatable variations in the levels of spatial data quality in the map data that were supplied to it in the years when Google Maps was primarily based on data licensed from Tele Atlas (now TomTom) and Navteq (now Nokia). Google’s transition into an able compiler of navigation-quality map databases appears to have been quite successful. However, I wonder if this success is not unlike the magnificent willow tree with a tremendous girth and abundant leaves on massive flowing branches, but slowly dying of decay from the inside.

Google’s move into GIS by providing the power of the GoogleBase as a GIS engine is an attractive notion to many organizations and for good reason. However, people who are responsible for funding budgets in these organizations (such as legislators) are beginning to ask these overly simplified questions: “Why are we paying people to do this mapping stuff when Google is giving it away for free?” “Can’t we just use their data?” I am sure you are all thinking, “Nobody could be that shortsighted.” I guess you have not spent much time with politicians.

Recent events have led me to conclude that Google has now realized this very flaw in its approach to mapping. Did any of you think it was unusual that Google recently released two different strategic studies showing the economic benefits of geospatial data (see 1 at the end of this blog)? You know, Google is always releasing its strategic studies. Why, the last one I can remember was in… hmmm?

In a study commissioned by Google and carried out by the Boston Consulting Group, it was indicated that the U.S. Geospatial industry generated $73 billion in revenues in 2011, comprised 500,000 jobs and throughout the greater U.S. economy helped drive an additional $1.6 trillion in revenue and $1.4 trillion in cost savings during the same period.

A second study by Oxera was equally interesting and focused on the direct effects, positive externalities, consumer effects and wider economic effects, including the gross value added effect of “Geo services.” One section of this report that caught my eye was a discussion (page 15) of Geo services as an intermediate good – one that is not normally valuable in itself, “…but help[s] consumers engage in other activities.” When discussing the “economic characteristics of Geo” the Oxera report indicates (page 5) that, “This question is relevant because it has implications for the rationale for public funding of certain parts of the value chain and for the market structure of other parts.”

Neither of the released reports (at least in the form they were published) mentions Google, its mapping business or how these studies should be viewed by the Google-ettes.

While Google may have had many reasons for funding these two reports, I think that the “law of unintended consequences” is rearing its head in Google land. If the public/governmental sources that provide data to Google through license can no longer afford to produce the data because their funding sources think that collecting and presenting map data is something that can be handled better in the private sector (such as Google is doing), the data underpinnings of the geospatial world will start to collapse. Yes, I know that Google does not do what its licensors do with spatial data, but have you seen the decision tree of a politician who really understands the complexities of GIS and mapping, why they cost so much, take so long and can’t be shared through the enterprise?

OK – let’s turn to crowdsourcing. While Google did not invent crowdsourcing, it certainly knows how to use it to its advantage. Now that its users are willing to compile and correct the Google Mapbase for free, how will anyone else in the business make money compiling data using a professional team of map data compilers? The economics weigh against it and it may be a practice whose time has come and gone. The reasons for this are, in their entirety, more complex than I have described. However, without developing the argument more in this essay, I will simply skip ahead to my conclusion, which is that professional map database compilers are an endangered species. It is likely that their “retirement” will not be noticed – at least not until crowdsourcing falls out of vogue, as it will, when people begin wondering why Google cannot afford to keep its own damn maps up-to-date.

As all of you know, maps are near and dear to my heart. The problem of unintended consequences that Google and crowdsourcing pose for GIS and mapping is nearly as worrisome to me as the planet-wide loss of electricity. I’m going to squirrel away a cache of paper maps, just in case. Laugh if you want, but when you need to buy one from me you will begin to understand the meaning of monopoly, as well as to really appreciate the concept of unintended consequences.

1. Links to both studies can be found in this article at Directions Magazine

http://apb.directionsmag.com/entry/google-shares-oxeras-report-on-impact-of-geospatial-services-on-the-wo/306916


Posted in Authority and mapping, Data Sources, Geospatial, Google, Google Map Maker, Google maps, Mapping, Mike Dobson, Navteq, Nokia, Tele Atlas, TeleAtlas, TomTom, crowdsourced map data, map compilation, map updating | 2 Comments »

Google Maps announces a 400 year advantage over Apple Maps

September 20th, 2012 by admin

UPDATE September 24, 2012: The Comment Cycle for the following entry is now closed. Thanks to everyone who has contributed.

I had a call from Marc Prioleau of Prioleau Advisors this morning and speaking with him prompted me to look into the uproar over Apple’s problems with its new mapping application. So, this column is Marc’s fault. Send any criticisms to him (just kidding). While you are at it, blame Duane Marble who sent me several articles on Apple’s mapping problems from sources around the world.

In my June blog on Apple and Mapping, I postulated that the company would find building a high quality mapping application very difficult to accomplish. Among the points I made were these:

• However, it is not (mapping) San Francisco that will give Apple heartburn. Providing quality map coverage over the rest of the world is another matter completely.

• Currently Apple lacks the resources to provide the majority of geospatial and POI data required for its application.

• My overall view of the companies that it (Apple) has assembled to create its application is that they are, as a whole, rated “C-grade” suppliers.

• Apple seems to plan on using business listing data from Acxiom and Localeze (a division of Neustar), supplemented by reviews from Yelp. I suspect that Apple does not yet understand what a headache it will be to integrate the information from these three disparate sources.

• While Apple is not generating any new problems by trying to fuse business listings data, they have stumbled into a problem that suffers from different approaches to localization, lack of postal address standards, lack of location address standards and general incompetence in rationalizing data sources.

• Apple lacks the ability to mine vast amounts of local search data, as Google was able to do when it started its mapping project.

Unfortunately for Apple, all of these cautions appear to have come true. So much for the past.

In this blog, after setting the scene, I will suggest what Apple needs to do to remedy the problems of their mapping service.

Given the rage being shown by iOS 6 users, Apple failed to clear the bar that was in front of them. I have spent several hours poring over the news for examples of the types of failures and find nothing unexpected in the results. Apple does not have a core competency in mapping and has not yet assembled the sizable, capable team that they will eventually need if they are determined to produce their own mapping/navigation/local search application.

Perhaps the most egregious error is that Apple’s team relied on quality control by algorithm and not a process partially vetted by informed human analysis. You cannot read about the errors in Apple Maps without realizing that these maps were being visually examined and used for the first time by Apple’s customers and not by Apple’s QC teams. If Apple thought that the results were going to be any different than they are, I would be surprised. Of course, hubris is a powerful emotion.

If you go back over this blog and follow my recounting of the history of Google’s attempts at developing a quality mapping service, you will notice that they initially tried to automate the entire process and failed miserably, as has Apple. Google learned that you cannot take the human out of the equation. While the mathematics of mapping appear relatively straightforward, I can assure you that if you take the informed human observer who possesses local and cartographic knowledge out of the equation, you will produce exactly what Apple has produced – a failed system.

The issue plaguing Apple Maps is not mathematics or algorithms; it is data quality, and there can be little doubt about the types of errors involved. What is happening to Apple is that their users are measuring data quality. Users look for familiar places they know on maps and use these as methods of orienting themselves, as well as for testing the goodness of maps. They compare maps with reality to determine their location. They query local businesses to provide local services. When these actions fail, the map has failed, and this is the source of Apple’s most significant problems. Apple’s maps are incomplete, illogical, positionally erroneous, out of date, and suffer from thematic inaccuracies.

Perhaps Apple is not aware that data quality is a topic that professional map makers and GIS professionals know a lot about. In more formal terms, the problems that Apple is facing are these:

Completeness – Features are absent and some features that are included seem to have erroneous attributes and relationships. I suspect that as the reporting goes on, we will find that Apple has not only omissions in its data, but also errors of commission, where the same feature is represented more than once (usually due to duplication by multiple data providers).

Logical Consistency – the degree of adherence to logical rules of data structure, attribution and relationships. There are a number of sins included here, but the ones that appear to be most vexing to Apple are compliance with the rules of the conceptual schema and the correctness of the topological characteristics of a data set. An example of this could be having a store’s name, street number and street name correct, but mapping it in the wrong place (town).

Positional Accuracy – the closeness of a coordinate value to values accepted as being true (see the sketch after this list).

Temporal Accuracy – particularly in respect to temporal validity – are the features that they map still in existence today?

Thematic Accuracy – particularly in respect to non-quantitative attribute correctness and classification correctness.
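Positional accuracy, at least, is straightforward to quantify once you have independently surveyed check points: compute the horizontal error at each point and summarize it as a root-mean-square error, the conventional statistic for the job. A minimal sketch, using hypothetical coordinates:

```python
import math

def haversine_m(p, q):
    """Great-circle distance in meters between two (lat, lon) points."""
    (lat1, lon1), (lat2, lon2) = p, q
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi, dlam = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * 6_371_000.0 * math.asin(math.sqrt(a))  # mean Earth radius

def horizontal_rmse_m(mapped, surveyed):
    """Root-mean-square horizontal error of mapped vs. surveyed positions."""
    errors = [haversine_m(m, s) for m, s in zip(mapped, surveyed)]
    return math.sqrt(sum(e * e for e in errors) / len(errors))

# Hypothetical check points: where the map puts them vs. ground truth.
mapped = [(40.7128, -74.0060), (40.7306, -73.9866), (40.7580, -73.9855)]
surveyed = [(40.7131, -74.0064), (40.7302, -73.9872), (40.7580, -73.9861)]
print(f"horizontal RMSE = {horizontal_rmse_m(mapped, surveyed):.0f} m")
```

The other dimensions are harder precisely because they resist this kind of simple arithmetic, which is one reason quality control by algorithm alone fails.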

When you build your own mapping and POI databases from the ground up (so to speak), you attempt to set rules for your data structure that enforce the elements of data quality described above. When you assemble a mapping and POI database from suppliers who operate with markedly different data models, it is unwise to assume that simple measures of homogenization will remedy the problems with disparate data. Apple’s data team seems to have munged together data from a large set of sources and assumed that somehow they would magically “fit”. Sorry, but that often does not happen in the world of cartography. Poor Apple has no one to blame but themselves.
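For a sense of why data from disparate suppliers does not magically “fit,” consider the simplest possible conflation rule: declare two supplier records the same place when their names are similar and their coordinates are close. The toy sketch below is my illustration, not Apple’s pipeline; it handles the clean cases and fails silently whenever suppliers disagree on naming, addressing or geocoding:

```python
import math
from difflib import SequenceMatcher

def same_poi(a, b, max_m=75.0, min_sim=0.80):
    """Naive conflation: same place if names are similar and points close."""
    sim = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    dy = (a["lat"] - b["lat"]) * 111_320.0  # meters per degree of latitude
    dx = (a["lon"] - b["lon"]) * 111_320.0 * math.cos(math.radians(a["lat"]))
    return sim >= min_sim and math.hypot(dx, dy) <= max_m

# Hypothetical records for one cafe, as two suppliers might deliver it:
r1 = {"name": "Joe's Coffee", "lat": 40.71280, "lon": -74.00600}
r2 = {"name": "Joes Coffee", "lat": 40.71305, "lon": -74.00588}
print(same_poi(r1, r2))  # True -- but a different "Joe's Cafe" 80 m away
                         # demands informed human review, not a threshold.
```

Multiply the ambiguous cases by hundreds of millions of records from a dozen suppliers with different data models, and you have Apple’s September.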

Recommendations

1. Unfortunately for Apple, they need to take a step back and re-engineer their approach to data fusion and mapping in general.

2. I suspect that the data and routing functionality that they have from TomTom, while not the best, is simply not the source of their problems. Their problem is that they thought they did not have a problem. From my perspective, this is the mark of an organization that does not have the experience or know-how to manage a large-scale mapping project. Apple needs to hire experts in mapping – people who are experienced in mapping and understand the problems that can and do occur when compiling complex spatial databases designed for mapping, navigation and local search.

3. Apple does not have enough qualified people to fix this problem and needs to hire a considerable number of talented people who have the right credentials. They, also, need to develop a QA/QC team experienced in spatial data. They could establish a team in Bangalore and steal workers from Google, but if they want to win, they need to take a different approach, because this is where Google can be beaten.

4. Apple appears not to have the experience in management to control the outcome of their development efforts. They need to hire someone who knows mapping, management and how to build winning teams.

5. Apple needs to get active in crowdsourcing. They must find a way to harness local knowledge and invite their users to supply local information, or at least lead them to the local knowledge that is relevant. This could be accomplished by setting up a service similar to Google Map Maker. However, it could also be accomplished by buying TomTom, and using its MapShare service as part of the mapping application to improve the quality of data. I think something like MapShare would appeal to the Apple user community.

6. Speaking of acquisitions, Apple could buy one of a number of small companies that integrate mapping and search services into applications for use by telephone carriers. The best of these, Telmap, was snapped up by Intel earlier this year, but other companies might be able to do the job. Perhaps Telenav? Hey, here is an interesting idea – how about ALK, now being run by Barry Glick who founded MapQuest?

7. I suppose Apple will want to hire Bain or some other high power consulting group to solve this problem. That would be the biggest mistake they have made yet, but it is one that big business seems to make over and over. As an alternative, I suggest that Apple look to people who actually know something about these applications.

Conclusions

There is no really quick fix for Apple’s problems in this area, but this should not be news to anyone who is familiar with mapping and the large scale integration of data that has a spatial component.

Of course, there appears to be nowhere to go but up for Apple in mapping. I wish them the greatest of success and suggest that they review this blog for numerous topics that will be of assistance to them.

If you want to know more about map data quality see ISO (International Organization for Standardization), Technical Committee 211. 2002. ISO 19113, Geographic Information – Quality principles. Geneva, Switzerland: ISO. Available online from http://www.isotc211.org/

And, I urge Apple to keep a sense of humor about these problems, as have some of its users. I had a great laugh at a comment about Apple’s mistaking a farm in Ireland for an airport. The comment was, “Not only did #Apple give us #iOS6… They also gave us a new airport off the Upper Kilmacud Road! Yay!”

Until next time.

UPDATE on September 24, 2012. I have closed the comments period for the Apple Maps Blog. Thanks to all who have contributed.

Mike


Posted in Apple, Authority and mapping, Data Sources, Geotargeting, Google Map Maker, Google maps, MapQuest, Mapping, Personal Navigation, TomTom, crowdsourced map data, map compilation, map updating | 106 Comments »

The Atlantic Magazine Reveals How Google Builds its Maps – Not.

September 19th, 2012 by admin

At last! We are close to delivering our final report to the US Census Bureau on their Master Address File and I now have time to devote to one of my favorite pastimes – writing blogs for Exploring Local. Hooray. What this means in consultant speak is that I am “on the beach” or between assignments, although truth be told, I am not looking very hard for anything to do for a while. I have my personal projects all laid out.

Over the last few weeks I have read a number of articles about maps and mapping that have renewed my interest in what’s going on in the profession. I guess that is something of a misstatement, since there are no longer enough people who actually are cartographers to make a profession, at least one that has any hopes of future growth. However, not to worry. Alexis C. Madrigal, a noted madrigal singer, oops, I meant the senior editor at The Atlantic, where he oversees the Technology Channel, recently wrote a marketing piece, oops, I meant an “even-handed” review of the encyclopedia of all cartographic wisdom – Google Maps. His article “How Google Builds Its Maps – and What It Means for the Future of Everything” is a monumental tribute to revisionist history.

I suspect that The Atlantic magazine will soon be renamed “The Atlantic & Pacific”, since Googleland appears to be the heart of the cartographic universe. While reading the article I thought “Wow, he gets paid for writing this crap” and as I read further, I began to wonder “But who pays him?” The entire read resembled a poorly thought out advertorial.

I guess Apple’s entry into the mapping arms race has the big “Goog” upset and they decided to get ahead of the curve by bringing in a cub reporter, who knew little about mapping, to whom they could show why Apple doesn’t have a chance. Why Googs did not make these stunning revelations to a writer from a real tech magazine is an interesting question, but one to which we all know the answer. Madrigal’s article was enough to make me want to change my opinion of the problems that Apple must overcome to be a player in this venue. Well, let’s start down that road by focusing on the startling new truths that Google revealed to Mr. Madrigal about the world of mapping.

I know that it may be hard for some of you to realize that mapping was not discovered by Google. Last February I was examining a floor-mosaic map in Madaba, Jordan, dating from the 6th century AD, that was designed to show pilgrims the layout of the Holy Lands. I can assure you that it did not have Google’s copyright notice anywhere on it and I can, also, assure you that it was not particularly old as maps go.

Dating from the 6th century AD, this mosaic map was designed to provide an overview of the Holy Lands to those on pilgrimages.

It may come as a further shock to you that Google did not invent map projections, including the one they use, nor did they invent map tiling, symbolization, generalization, decluttering, zooming, panning, layers, levels, satellite imagery, crowd-sourcing (both active and passive), their uniquely named “Atlas tool”, or most anything else associated with Google Maps. Even Google’s Street View had its origins in robotic sensing systems developed to enhance computer vision research, although Google, with the help of Sebastian Thrun, was smart enough to figure out how it could give them a competitive advantage in compiling street maps.

Where to start on the Madrigal article? How about the byline that reads “An exclusive look inside Ground Truth (no, Google did not invent this either), the secretive program to build the world’s best accurate map”? Phew, I was glad to learn that Google was not attempting to build the world’s worst accurate map. I had hoped that the Googie was going to attempt to build the world’s most accurate map, but I guess that they just wanted the world to know that what they were building would be “bester” than whatever Apple could supply.

Did you notice the secret information the former NASA rocket scientist who was mentioned in the article told Madrigal? It was a howler; as a matter of fact, it sounded like something I wrote in a blog when I was being snarky about Google. Anyway, here is the exclusive/secret info from the horse’s mouth that was revealed exclusively to Madrigal of The Atlantic:

“So you want to make a map,” Weiss-Malik tells me as we sit down in front of a massive monitor. “There are a couple of steps. You acquire data through partners. You do a bunch of engineering on that data to get it into the right format and conflate it with other sources of data, and then you do a bunch of operations, which is what this tool is about, to hand massage the data. And out the other end pops something that is higher quality than the sum of its parts.”

Wow, was that informative. Before we go any further, I would like to note that Mr. Madrigal might have received a better education on Google Maps by reading this blog than by visiting Google, but then that would be shameless self-promotion. So, will you tell him instead?

For some reason the opening figure in my copy of the article is a photo of two people playing ping pong. That had me stumped for a while, as I could not figure out what it had to do with project “Ground Truth”. Well, it still has me stumped, but I am working on it. I thought we might get a photo of a person at a workstation with a map on a monitor and details of the environment at the Google Maps office in Mountain View, CA. Apparently ping-pong is more interesting and newsworthy and the great Googs was not about to reveal any detailed information about how they compile maps to Mr. Madrigal.

I was more than mildly surprised that Madrigal seemed not to understand that there was more to Google Maps, or any map for that matter, than meets the eye. How did he think that routing occurs? Did he really believe the “Men in Black” idea that there were tiny aliens inside Google servers that supplied routes on demand as they were requested? Does he know that computerized routing has been around on a commercial basis since the early 1970s? Did he ever hear of Barry Glick, the founder of MapQuest, hawking online routing capabilities before Google was founded? Does he have any idea what NAVTEQ does with its highly instrumented vans and imaging systems? Has he ever looked at Bing Maps, or the hundreds of other services out there that provide competition to Google in the mapping sector? Put more simply, did the author of this article have the least little bit of inquisitiveness about what Google was telling him? My conclusion is a big, “Nope.”

I was, also, stunned to read Manic (okay, that’s supposed to be Manik) Gupta’s comment that the offline real world in which we live is not entirely online. When did this happen? Why is it allowed? Wow, this is beginning to sound like a science fiction thriller where there is no distinction between offline and online. Maybe Google really is an earth-changing company, in more ways than we realize. Hopefully Tom Cruise will play the part of Gupta in the movie version of this thriller.

Gupta’s follow-up quote was even better – “Increasingly as we go about our lives, we are trying to bridge that gap between what we see in the real world and [the online world] and Maps really plays that part.” Hmmm. I had always thought that maps were a representation of the real world, and not the original thing. Based on the Madrigal article, it appears that he thinks that maps can and should serve as the real world. I guess Mr. Madrigal may not understand the real nature of project “Ground Truth”, or the use-warnings Google puts on their map and routes. I don’t know about you, but I have heard that trusting ground truth is usually a better strategy than ignoring it and trusting its representation, whether that representation is online, printed, in a box on your dash, in your phone, hiding in the ether while encoded in a radio wave, or packed as a new innovation labeled “Ground Truth” created by Google (and thousands of people before them).

I smiled when I read that Madrigal was stunned to learn that humans were involved in Google’s map making process. Yes. Humans are apparently needed to remedy problems that software cannot seem to solve. Imagine, data compiled from various sources that does not quite fit. Is that possible? Hmmm. Did Google invent that too? And is using crowd-sourcing to mine knowledge another Google innovation? No, I don’t think so. Is there no end to Madrigal’s naiveté? Well, the answer to that also appears to be “no.”

I hope you noticed his comment that, “This is an operation that promotes perfectionism.” I, also, liked this one: “The sheer amount of human effort that goes into Google Maps is just mind boggling.” Followed by this: “I came away convinced that the Geographic Data that Google has assembled is not likely to be matched by any other company.” Well, guys, apparently it’s time to give up on mapping. Google, according to Madrigal, appears to have thought every thought about mapping that could ever be thought. Well, maybe not.

I hope you noticed the section of the article with this comment, “The most telling moment for me came when we looked at couple of the several thousand user reports of problems with Google Maps that come in every day.” A couple of thousand error reports every day? Is that like saying Google Maps only has 347 million known errors left to remedy? It seems that, just like the rest of us, Google will not be the first to achieve perfection in mapping. If you read my series about Google’s map correction program, you know more about this than Mr. Madrigal, so you should consider applying for his position at The Atlantic.

I wonder why there appeared to be only one workstation for Mr. Madrigal to observe in Mountain View. According to Madrigal’s article, hundreds of operators are required to map a country and many of them are in Google’s Bangalore office. Hmmm. So much for local knowledge. In part, this remoteness of the operators from the geography they are editing is why there are so many errors on Google Maps. In addition, maybe all those Google-driven innovations in mapping don’t quite help when the sources that Google harvests contain incorrect information to begin with. Adding erroneous information to the edit queue of an operator who must use other surrogate information to validate it can be a recipe for disaster, as Google has proven time and time again.

I do applaud Mr. Madrigal for realizing that Street View is an important distinction in the marketplace for mapping services. Whether Google is actually using Street View for all of the processes mentioned in the article is unclear to me. Didn’t it sound like something a VP would say when Brian McClendon indicated that, “…We’re able to identify and make a semantic understanding of all the pixels we’ve acquired”? Wow, that’s great, but do you think they really do this? Someone should send McClendon some articles to read on image processing, as well as some older texts on information theory – it seems as if they are doing a lot of work they do not need to do. And how about the number of businesses, addresses and logos they have collected with Street View? If only they could create a comprehensive address database with this stuff, but they can’t because of footprint limitations related to the deployment of their Street View assets. However, whether Street View provides a sustainable competitive advantage is something that Apple, Microsoft, NAVTEQ and TomTom will have to decide. It may be a competitive advantage today, but I can assure you that whether or not it is sustainable will not depend on Google’s wants.

So to Apple and all the apparently second-level mapping companies – don’t give up the map race quite yet. The fact that Google thinks you can’t catch them may be the best news you have had this year.

Finally, shame on Google for participating in a public relations effort masquerading as a report on technological innovation. While I have great respect for what Google has achieved with Google Maps, the interview behind the Madrigal article was not designed to reveal any details on Google’s technological innovations in mapping. Instead, it was an interview strategically planned to denigrate Apple’s mapping capabilities by implying that it could not compete with the great Googie. Revealing old news to someone who did not have a background in mapping, GIS or navigation is pandering and something I had not expected from Google. Just what is it about Apple’s mapping program that has them so scared? Hmmm. Something to think about.


Posted in Apple, Authority and mapping, Data Sources, Google, Google Map Maker, Google maps, Navteq, TomTom, User Generated Content, routing and navigation | 4 Comments »
