Google Grabs More Geo-Data

By letting developers store data on its servers, Google hopes to make the geo-Web more searchable.

May 22, 2009

On Wednesday, Google announced that it would open its servers to geographic data belonging to anyone. This means that developers will be able to quickly build a location-based Web service without also having to manage their own data servers. The announcement might be good for many developers, but it’s also good for Google itself: the location data will be integrated into Google’s search index, making it searchable and, ultimately, capable of generating advertising revenue.

At the Where 2.0 conference in San Jose, CA, Google senior product manager Lior Ron introduced an application programming interface (API) for Google Maps Data. Ron explained that the product allows developers to “store custom geographic information on Google’s infrastructure, relieving developers of the duty of maintaining data on an infrastructure.”

In 2005, Google released an API for its Google Maps service, providing a simple way to create a custom map for a website or to build “mashups” using other data sources, such as Craigslist rental listings. Now the Google Maps Data API lets developers dive deeper into the map-making process, using Google’s servers to store and manage the geographic data an application or service might need.
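The Maps Data API of this era spoke Google’s Atom-based GData protocol: a developer added a feature to a map by posting an Atom entry that wrapped a KML placemark. As an illustrative sketch only (the endpoint URL and authentication are omitted, and the element layout follows the public KML 2.2 schema rather than any code in this article), such a payload could be assembled like this:

```python
import xml.etree.ElementTree as ET

ATOM_NS = "http://www.w3.org/2005/Atom"
KML_NS = "http://www.opengis.net/kml/2.2"

def placemark_entry(name, lat, lng, description=""):
    """Build an Atom entry wrapping a KML Placemark -- the general
    shape of a feature payload for a GData-style maps feed."""
    ET.register_namespace("", ATOM_NS)
    ET.register_namespace("kml", KML_NS)
    entry = ET.Element(f"{{{ATOM_NS}}}entry")
    ET.SubElement(entry, f"{{{ATOM_NS}}}title").text = name
    placemark = ET.SubElement(entry, f"{{{KML_NS}}}Placemark")
    ET.SubElement(placemark, f"{{{KML_NS}}}name").text = name
    ET.SubElement(placemark, f"{{{KML_NS}}}description").text = description
    point = ET.SubElement(placemark, f"{{{KML_NS}}}Point")
    # KML orders coordinates as longitude,latitude
    ET.SubElement(point, f"{{{KML_NS}}}coordinates").text = f"{lng},{lat}"
    return ET.tostring(entry, encoding="unicode")

xml = placemark_entry("Powell's Books", 45.5231, -122.6814, "Portland landmark")
```

The resulting XML string would then be POSTed to the map’s features feed; storing it becomes Google’s problem rather than the developer’s.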

One company that’s already using the Google Maps Data API is Platial, an online atlas that lets users tag and share places of interest. Currently, Platial holds all of this user-generated geo-data on its own servers. With the new API, the data will be integrated into Google’s index, says Jake Olsen, Platial’s chief technology officer, providing “immediate discovery via Google Search in Maps.”

Another Maps Data project, called My Tracks, was created by Google’s own engineers. My Tracks runs on Android mobile phones and records a trail of GPS “breadcrumbs” as users walk, jog, or bike outdoors. The data is also stored on Google’s servers, and users can edit and share the information via Google Maps.
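The most basic statistic a tracking service derives from such breadcrumbs is the distance covered. A minimal sketch, using the standard haversine formula (this is a generic illustration, not My Tracks’s actual code):

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lng1, lat2, lng2):
    """Great-circle distance between two GPS fixes, in kilometers."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lng2 - lng1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def track_length_km(breadcrumbs):
    """Sum the leg distances along an ordered list of (lat, lng) fixes."""
    return sum(haversine_km(*a, *b) for a, b in zip(breadcrumbs, breadcrumbs[1:]))
```

Summing consecutive legs slightly underestimates a winding route, which is why trackers sample breadcrumbs frequently.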

As real-time geographical data enters Google’s database, it could provide the search engine with a hook into real-time search, says Olsen. Currently, the company indexes the Web on a rolling basis throughout the day and updates a few services, like Google News, more frequently. A real-time stream of geographical data could help Google index certain types of events more quickly. The search engine could, for instance, collect users’ pictures geo-tagged at the scene of a crime or accident and include them in a news search.
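A query like “pictures geo-tagged at the scene” reduces, at minimum, to a radius filter over point records. A naive sketch (the photo records and the flat-earth approximation are hypothetical simplifications; a real index would use spatial data structures):

```python
from math import cos, radians

def within_radius(photos, center_lat, center_lng, radius_km):
    """Naive radius filter over geo-tagged records using an
    equirectangular approximation (adequate at city scale)."""
    deg_lat_km = 111.32  # kilometers per degree of latitude
    hits = []
    for photo in photos:
        dlat_km = (photo["lat"] - center_lat) * deg_lat_km
        # shrink longitude degrees by the cosine of the latitude
        dlng_km = (photo["lng"] - center_lng) * deg_lat_km * cos(radians(center_lat))
        if (dlat_km ** 2 + dlng_km ** 2) ** 0.5 <= radius_km:
            hits.append(photo)
    return hits

photos = [
    {"id": 1, "lat": 40.7128, "lng": -74.0060},  # lower Manhattan
    {"id": 2, "lat": 40.7306, "lng": -73.9352},  # a few km away
    {"id": 3, "lat": 34.0522, "lng": -118.2437}, # Los Angeles
]
nearby = within_radius(photos, 40.7128, -74.0060, 10.0)
```

Scanning every record is obviously too slow at Web scale; the point of ingesting the data into a search index is to make queries like this fast.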

A growing concern, however, is that Google might gain access to far more geographic data than competing services. This could lock out smaller companies, and some developers may be wary of entrusting Google with all of their data.

Another drawback for developers is that the API currently imposes service constraints. “If you run a popular site, you will hit limits,” says Olsen, “whether it be rate limits or content limits.” He suspects that Google will come up with a commercial licensing scheme and a service-level agreement that makes the API’s capabilities clearer.
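A client that hits such rate limits typically backs off and retries rather than failing outright. A generic sketch (the `RateLimitError` class, retry counts, and delays are illustrative assumptions, not Google’s documented behavior):

```python
import time

class RateLimitError(Exception):
    """Stand-in for a service 'too many requests' response."""

def call_with_backoff(request, max_retries=4, base_delay=1.0):
    """Retry a rate-limited call with exponential backoff.

    `request` is any zero-argument callable that raises RateLimitError
    when the service rejects the call for exceeding its quota."""
    for attempt in range(max_retries):
        try:
            return request()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise  # out of retries; surface the error to the caller
            time.sleep(base_delay * 2 ** attempt)
```

Exponential backoff keeps a popular site from hammering the service once it crosses a quota, while still recovering automatically when the limit window resets.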

In the meantime, Platial isn’t giving up its databases completely. “As with any cloud-based service,” says Olsen, “if that service goes down, so do you.”