
Google Earth Enterprise Bites the Dust

In 2005, Google shook up the GIS world when it introduced Google Earth. Its snappy, easy-to-use interface and global data coverage were very attractive, and people all over the world wanted a private Google Earth globe to exploit their own data. Google responded with Google Earth Enterprise (GEE), and companies like Galdos became GEE partners, able to install and populate private globes and build GEE-based applications. Thousands of such private globes were installed around the world. Now Google has decided to wind down GEE. This does not mean, however, that Google is giving up its commitment to its flagship products, or that Google Earth and Google Maps (and all Google's data) are going away. Geospatial data is key to too many of Google's activities and customers for that to happen. Nonetheless, this is a major change in the geospatial market and, as always, such a change is a time to broadly rethink geospatial data management.

Some have argued that the demise of Google Earth Enterprise offers Google a path to strengthening its data supply chain for Google Maps and Google Earth. Many Google Earth Enterprise customers will likely migrate to ESRI ArcGIS Server and, since ESRI has control of the majority of municipal databases around the world, this should provide Google with access to more accurate and timely data. There are a few things wrong with this argument, however. In the first place, there are many more existing ArcGIS Servers than GEE licenses, and although these could be feeding Google Earth and Google Maps now, they are not — at least not in terms of automatically updating the information as things change at servers in the wild. Adding a few more ArcGIS Servers converted from GEE will not change this. In the second place, even when these were GEE installations, there was no automatic pumping of data from GEE to Google Earth. It is indeed time for Google to consider its data supply chain infrastructure, but this is the case with or without GEE.

Google has as its mission to "organize the world's information," and it is interesting in this context to compare how the handling of geospatial information differs from that of textual information. When you search using Google, the result page provides a textual summary and a link to the server that it summarizes. The actual data resides at the server, not at Google. This is, of course, NOT how Google Earth or Google Maps work. In a rethink of the global supply chain at Google, maybe this should be changed.

Let us assume that there are a large number of spatial data servers, each storing geographic content for one or more parts of the Earth. These servers can be owned by many different organizations and have nothing to do with one another. They can be existing servers, like ArcGIS Server or GeoServer, or ones converted from old GEE customers. Let us further assume that these servers support some sort of standard request protocol, such as that of the Web Feature Service (WFS) from the Open Geospatial Consortium (OGC), which is currently being extended with a RESTful encoding. WFS is supported by ESRI ArcGIS Server, GeoServer, and many others, and enables the requested data to be returned in a standard encoding such as OGC GML or KML. This is analogous to the role of HTTP servers and HTML on the textual web.
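To make the request protocol concrete, here is a minimal sketch of how a client might construct a standard WFS 2.0 GetFeature request for features within a bounding box. The endpoint URL and feature type name are placeholders, not references to any actual server:

```python
from urllib.parse import urlencode

def wfs_getfeature_url(base_url, type_name, bbox):
    """Build a WFS 2.0 GetFeature request URL for features in a bounding box.

    bbox is (minx, miny, maxx, maxy) in the server's default CRS.
    """
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": type_name,
        "bbox": ",".join(str(c) for c in bbox),
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and type name, for illustration only
url = wfs_getfeature_url(
    "https://example.org/geoserver/wfs",
    "topp:states",
    (-125.0, 24.0, -66.0, 50.0),
)
```

Any OGC-compliant server — ArcGIS Server, GeoServer, or others — would answer such a request with GML (or another negotiated encoding), which is what makes a heterogeneous community of servers interoperable.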

Then we can add a modern Registry Service, such as Galdos INdicio, that acquires the spatial summary information (above) and builds a spatial index of the content. Users can then make requests in the same format as a conventional Google search, combining free text, taxonomic terms, and object properties and relationships. The requests would be directed to the Registry (the search engine), which would then return the associated spatial (and non-spatial) content.
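The registry idea can be sketched as a toy data structure: servers register their spatial extent and some keywords, and a query combines a bounding box with free text. This is not INdicio's actual API — it is a minimal illustration, with made-up server URLs, of how a registry could dispatch a combined spatial/textual query:

```python
from dataclasses import dataclass, field

@dataclass
class ServerRecord:
    url: str
    bbox: tuple              # (minx, miny, maxx, maxy) extent of the server's holdings
    keywords: set = field(default_factory=set)

class SpatialRegistry:
    """Toy registry: indexes servers by extent and keywords."""

    def __init__(self):
        self.records = []

    def register(self, record):
        self.records.append(record)

    def search(self, bbox, text=None):
        """Return URLs of servers whose extent intersects bbox and match the text."""
        minx, miny, maxx, maxy = bbox
        hits = []
        for r in self.records:
            rminx, rminy, rmaxx, rmaxy = r.bbox
            intersects = (rminx <= maxx and rmaxx >= minx and
                          rminy <= maxy and rmaxy >= miny)
            if intersects and (text is None or
                               text.lower() in {k.lower() for k in r.keywords}):
                hits.append(r.url)
        return hits

# Two hypothetical municipal servers: one covering Vancouver, one covering Berlin
reg = SpatialRegistry()
reg.register(ServerRecord("https://city-a.example/wfs",
                          (-123.3, 49.1, -122.9, 49.4), {"roads", "parcels"}))
reg.register(ServerRecord("https://city-b.example/wfs",
                          (13.1, 52.3, 13.8, 52.7), {"roads"}))

# A query for road data over Vancouver should match only the first server
results = reg.search((-123.2, 49.2, -123.0, 49.3), text="roads")
```

A production registry would of course use a real spatial index (an R-tree, for instance) and full-text search rather than a linear scan, but the dispatch pattern — registry answers "who has relevant data," the servers answer "here is the data" — is the point.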

An infrastructure such as this could be employed today with existing OGC-compliant WFS servers, whether for a specific project or for the entire world. And it would be highly scalable, since it uses a possibly global community of servers to provide the geographic data. Use of XML namespaces ensures that features can be globally identified and disambiguated.
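The namespace point deserves a small illustration. In GML, each feature carries a gml:id that is unique within its document, and the application schema's XML namespace scopes it globally; combining the two yields an identifier no other organization's server can collide with. The "roads" namespace and feature below are invented for the example:

```python
import xml.etree.ElementTree as ET

GML = "http://www.opengis.net/gml/3.2"
ROADS = "http://cityofexample.org/roads"   # hypothetical application namespace

doc = f"""<roads:Road xmlns:roads="{ROADS}" xmlns:gml="{GML}" gml:id="Road.42">
  <roads:name>Main Street</roads:name>
</roads:Road>"""

root = ET.fromstring(doc)
feature_id = root.get(f"{{{GML}}}id")        # namespace-qualified attribute lookup
name = root.find(f"{{{ROADS}}}name").text    # element lookup in the roads namespace

# Namespace + local id gives a globally unique handle for the feature
global_id = f"{ROADS}#{feature_id}"
```

Two municipalities can both publish a feature with gml:id "Road.42", yet the combination with their respective namespaces keeps the identifiers distinct across the whole federation of servers.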

The above approach puts more emphasis on the servers out in the wild; hence, like the rest of the web, performance will vary with the sites being contacted. A high-performance background map/image server such as Google Maps and Google Earth (or Cesium, an open source alternative) would continue to be part of the equation. This approach would also provide global management of spatial information that is more heterogeneous than is the case with Google Earth or Google Maps today, but also one that can provide richer content and is more tightly integrated with the greater web.

Of course, the provision of the background maps and images will still demand extensive holdings of data at Google, and a supply chain to support it. I will explore the pub-sub infrastructure necessary to maintain this global background in a following article.

See the related post in News — INdicio as a Google Earth Enterprise Replacement