The rapid growth in data from geospatial sources means new interoperability standards will be needed to analyse it. The OGC wants your input on what those standards should be
Help us deal with Big Earth data
According to some estimates, approximately 80% of an organisation’s data has a location component. The volume of new location-enabled data is increasing, with data feeds from sensors and GPS in mobile devices, Internet-connected sensors in smart buildings, and video surveillance. In almost all cases, these are observations from sensors. For example, the always-on Internet of Things (IoT) includes the seven billion mobile devices with embedded position determination (GPS chips) and indoor location technologies such as beacons, Wi-Fi location determination, and hybrid indoor-positioning methods. In addition, a growing number of airborne and satellite-borne imaging sensors are producing remotely sensed data at ever-increasing resolution in more spectral bands. The growth of sensor applications, combined with mobile phone usage, means there is, and will continue to be, data almost everywhere about almost everything. Almost all of this data is geospatial.
For data to be of value, there must be means of discovery, assessment, aggregation and access, as well as proper storage, processing and analysis. Modern cutting-edge uses of data involve analysis via advanced statistical modelling and predictive analytics. As available data and computational power increase, so do the potential precision and value of the analytics. Data scientists must understand the proper use and modelling of geospatial data to ensure sound analytic outcomes.
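One common example of what "proper modelling of geospatial data" means in practice: latitude/longitude pairs are not planar coordinates, so applying ordinary Euclidean distance to them gives misleading results. A minimal sketch of the great-circle (haversine) distance calculation that geospatial analysis uses instead (the city coordinates below are illustrative and not from the article):

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0  # mean Earth radius

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two points in decimal degrees."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    # Haversine formula: accounts for the Earth's (near-)spherical shape
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

# Paris to London: roughly 343 km along the great circle
distance = haversine_km(48.8566, 2.3522, 51.5074, -0.1278)
```

A spherical model is itself an approximation; production systems typically use an ellipsoidal datum such as WGS 84, which is exactly the kind of modelling detail standards make explicit.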
The importance of standards
Standards enable sharing and reuse of data. Sharing and reuse reduce the cost of developing and maintaining both data and software. In general, data’s value increases when it is more widely used. The IoT is expanding in these three categories of use:
- Machine-to-machine (M2M) communication.
- People-to-people sharing via social networks.
- People-to-machine connections.
To cope with the deluge of Big Data arising from these communications, the industry is building an API economy to simplify interoperability. However, without open standards, proprietary and often closed APIs limit interoperability, and thus limit market size and app value.
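The interoperability benefit of an open standard API is that one client works against any conforming server. A minimal sketch using the OGC Web Map Service (WMS) standard: the same GetCapabilities request can be sent to any vendor's WMS endpoint (the server URL below is a placeholder, not a real service):

```python
from urllib.parse import urlencode

def wms_get_capabilities_url(base_url: str) -> str:
    """Build an OGC WMS 1.3.0 GetCapabilities request URL.

    Because WMS is an open standard, this identical request is valid
    against any conforming server, regardless of who built it.
    """
    params = {
        "service": "WMS",
        "request": "GetCapabilities",
        "version": "1.3.0",
    }
    return f"{base_url}?{urlencode(params)}"

# Placeholder endpoint; substitute any real WMS server
url = wms_get_capabilities_url("https://example.org/wms")
```

With a closed, proprietary API, this request-building logic would have to be rewritten for each vendor, which is precisely the interoperability cost the paragraph describes.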
The OGC recently approved formation of a Big Data Domain Working Group (DWG) to provide an open forum for discussion of topics involving geospatial interoperability, access, and analytics in Big Data applications. The group will foster collaborative development among participants representing many organisations and communities, and will ensure appropriate liaisons to other relevant working groups inside and outside the OGC. The DWG will provide a forum to clarify some foundational terminologies in the context of data analytics, elucidating the differences and overlaps between terms, while introducing the term “Big Earth Data” to describe data with geospatial references.
The Big Data DWG seeks to answer these questions:
- What does Big Earth Data mean in an OGC context?
- What are the challenges, if any, of Big Earth Data for the OGC’s data and service interface specifications?
- What is the market value of Big Earth Data and how can the OGC support leveraging it?
After public and OGC member feedback, a report will be submitted to the OGC membership for publication as an OGC best practice paper.
Peter Baumann is professor of computer science at Jacobs University Bremen (www.jacobs-university.de). John Herring is a systems and software architect at Oracle (www.oracle.com). Charles Heazel is a systems engineer at WiSC Enterprises (www.wiscenterprises.com). Ron Exler is a senior consultant and writer with the OGC (www.opengeospatial.org).