- April 14, 2009
- Posted by: EARSC
- Category: EARSC News
Written by Jeff Thurston, Friday, 10 April 2009.
Ten years ago real-time geodata was more or less a dream. Five years ago most people began to see more real-time data and were talking about it, but it was still something for the adventurous with deep pockets who could afford to acquire it, process it and distribute it in a useful manner. This is changing as digital sensors, hardware and software become more tightly integrated and more widely available.
While analog measurements have long been possible, integrating spatial information in digital form into many applications often required retrofits, adapted technology, time-consuming conversion or outright replacement as maintenance costs outpaced operational benefits.
While older applications often made slower transitions, newer applications could begin with a digital start, embedding high technology into operations that deliver continuous data streams, often with higher accuracy and lower cost.
The shift to new technologies can be slow in some operations. The high cost of replacement, the need to train staff throughout an organisation and the need to ensure continuous operations are all realities facing those companies seeking to develop new approaches with updated technologies.
Ask anyone involved in the utility space and they will tell you that change is slow for those reasons, and that this is why some companies are locked in and resistant to change – even in the face of obvious benefits.
When we look at many geospatial applications today, we see that they are often described in terms of workflows – the movement of data across organisational processes and tasks toward solutions. We speak about individual measurements in terms of systematic work processes that combine, integrate, process and output results.
An example is image capture, processing of the imagery and its distribution. While this sounds simple, in practice these steps may involve tasks carried out in different places around the world by numerous people, with the internet as the integrating factor. As a result, by the nature of their workflows, geospatial applications are dynamic, often automated and spatial in operation.
We often consider the value of real-time geodata solely on the basis of up-to-date information. The benefit in this case lies in the ability to make decisions, based on the changing data, more quickly and accurately. That is well understood and highly beneficial – critical in some cases. But let’s return to workflows and their nature.
Let’s say you have invested time and resources in developing a systematic workflow-processing environment capable of handling spatial data operations. While you can deliver, for example, up-to-date decision-making information on river flows to your customers, the resources that enable this are also useful for processing other people’s information. Think of your investment and capabilities like an airliner. Most airlines don’t let their aircraft idle on the ground; they keep them in the air – even when the passengers aren’t their own customers.
My point is that real-time data is more than a solution. It is a ‘service’ and needs to be considered as an airplane constantly in the air, moving people (value and tasks) to final destinations.
If you are setting up a web service for handling geospatial data, then you are setting up tools with a multi-functional capability for different purposes. You could be handling river flow data in real-time just as well as rainfall run-off, wind speed or energy analysis.
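The multi-functional point above can be sketched in a few lines. The sketch below is hypothetical (the feed names and the `ingest` function are illustrative, not from any specific product): the same feed-agnostic pipeline summarises river flow readings just as well as wind speed readings.

```python
from statistics import mean

def ingest(feed_name, readings):
    """Validate and summarise one batch of (timestamp, value) readings.

    The pipeline is deliberately feed-agnostic: nothing here depends on
    whether the variable is river flow, rainfall run-off or wind speed,
    so one service can handle many kinds of real-time geodata.
    """
    values = [v for _, v in readings if v is not None]  # drop missing readings
    return {
        "feed": feed_name,
        "count": len(values),
        "latest": readings[-1][0] if readings else None,
        "mean": mean(values) if values else None,
    }

# The same function serves two very different feeds:
river = ingest("river_flow_m3s", [(1, 120.0), (2, 125.5), (3, None), (4, 130.2)])
wind = ingest("wind_speed_ms", [(1, 4.2), (2, 5.1)])
```

The design choice mirrors the article’s argument: the investment is in the generic pipeline, and each new data type is only a new feed name.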
In real time, many geodata flows may not require much geo-processing functionality or energy. Perhaps when it is raining the data needs more processing in real time, or when energy demand is at peak loads it needs greater balancing across a network. Understanding the nature of the real-time data is important, and goes back to the point of investing or upgrading – and the benefits of making the change.
Once the investment in updated technology has been made, doors open to new possibilities for handling geodata internally and externally. One of the greatest values lies in human capital: with appropriately trained staff operating web services, a wider variety of services and options can be developed and offered, thereby capitalising on the investment.
Real-time information is about scale
As the amount of real-time geodata entering a workflow increases, more time must be spent balancing that load to ensure the overall system handles the information efficiently and effectively.
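One common way to express this balancing concern in code is backpressure. The minimal sketch below, using only the Python standard library, is an assumption about implementation, not the author’s method: a bounded queue caps how many readings the ingest stage will hold, so a burst of sensor data is rejected rather than overwhelming the processing stage.

```python
from queue import Queue, Full

def receive(buffer, reading):
    """Accept a reading if capacity allows; signal overload otherwise."""
    try:
        buffer.put_nowait(reading)
        return True
    except Full:
        # The caller can throttle the sensor, batch, or shed load here.
        return False

buffer = Queue(maxsize=3)            # deliberately small capacity
accepted = [receive(buffer, r) for r in range(5)]  # burst of 5 readings
# the first 3 are accepted; the last 2 wait until a consumer drains the queue
```

The rejected readings make the load visible, which is exactly the information an operator needs when deciding whether to invest in more processing capacity.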
Is real-time geodata important? Yes – because the value of that information not only enables more effective decision making, but because it has the potential to expand opportunities as systems are upgraded and put into operation.
There is no shortage of data, and many people install sensor technologies and set them to output huge quantities of sometimes useless information. With careful planning, a new system can be installed that handles not only internal real-time needs, but also those of users outside the organisation.
I suppose what I am suggesting is that the nature of a business will change as new technology is integrated. Why not start your own ‘emerging technology web service’ group? Why not offer services that benefit from your human capital knowledge and newly acquired and upgraded technologies? Where are the emerging synergies?
Real-time geodata is important because it often results in the development of a systematic series of workflows and technologies capable of serving a multitude of applications. These applications may exist within an organisation or external to it – but they are opportunities.
Note: This column alternates weekly between Vector1 Media editors. Jeff Thurston is editor of V1 Magazine and V1 Energy for Europe, Middle East, Africa and Russia.