Written by Jeff Thurston.
I live in the centre of Berlin, Germany and routinely interact with many people from all over the world who use maps in different ways. It can be a remarkable experience to watch and listen to different interpretations of the same map from different people.
Some people read the words, others look at and use the symbology, and yet others will trace pathways with their fingers. For some reason, people often do not trust their own experience when reading a map and ask others, as if to verify their interpretations.
These differences support the idea that a map is closely linked to analysis, in this case to human relationships with the landscape. What people see and interpret depends on several factors: their experience, knowledge, interpretation and even expectations.
It would be interesting to give a tour group a map of a place, then, after a few days, hand them blank pieces of paper and ask them to draw a map of what they think the area they have been in looks like.
I’ve little doubt that we would get 30 or 40 pieces of paper back that look different. Roads would be in different places, cultural highlights would be added, and the most memorable spots and good food locations, for example, would be identified.
This contrasts with consumer-grade GPS tours and web maps, which show the same map to everyone. And crowdsourcing is more or less about ‘marking up’ someone else’s map, by writing on it or, more usually, by applying a dot and identifying it with text.
This is where modern technologies and traditional paper diverge widely. While web mapping implies use, just as a paper map does, providing blank paper requires creativity and interpretation and is more aligned with the design experience (at least until we get a really good digital drawing-to-map program).
The individual pieces of paper (interpretations) could form the basis for individual layers. In a GIS application these could be analysed, then understood and communicated.
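A minimal sketch of that idea, assuming each participant’s hand-drawn map has been reduced to a set of named landmarks (the participant names and landmarks are invented for illustration). Overlaying the ‘layers’ separates the shared mental map from the individual observations:

```python
from collections import Counter

# Hypothetical layers: one set of named landmarks per participant's sketch.
layers = {
    "participant_1": {"Brandenburg Gate", "Museum Island", "Cafe A"},
    "participant_2": {"Brandenburg Gate", "TV Tower", "Cafe A"},
    "participant_3": {"Brandenburg Gate", "TV Tower", "Flea Market"},
}

# Overlay the layers: count how many sketches mention each feature.
counts = Counter(f for layer in layers.values() for f in layer)

# Features everyone agrees on, versus features unique to one person.
consensus = {f for f, n in counts.items() if n == len(layers)}
unique = {f for f, n in counts.items() if n == 1}

print(consensus)  # the shared mental map
print(unique)     # individual, possibly valuable, observations
```

A real GIS would of course work with geometry rather than names, but the principle is the same: the agreement between layers is one result, and the outliers are another.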
The process of working with a GIS will necessarily involve the creation of many maps. The operator will inevitably sift and sort through the many graphics presented before finally arriving at solutions. We sometimes identify this process as the solution ‘workflow’ and it may vary by individual. Conversely, more standardised workflows would take several layers of spatial information and data from external sources and process them in standard ways to derive solutions. We often refer to these workflows as ‘Best Practices’ – they yield optimal results time and time again based on proven analytic methods.
There are benefits to standardised approaches. These include speed, better-aligned expectations, reduced costs and processing that usually considers the integral factors.
But like the individual maps people hold in their hands in the centre of Berlin, something unique (and perhaps valuable) can lie outside the Best Practice.
I’m not by any means advocating that Best Practices should be avoided. But what I am suggesting is this.
If you have high-quality data, a unique data model and excellent software optimised to perform in specialised circumstances (hydrological models, energy modelling, building design and so on), then it does not logically follow that we should melt all this performance into standardised ‘observations’, which then form ‘only practices’.
In this respect individual pieces of hardware and software can be likened to people interpreting maps and cartographic products – each performing optimally but deriving slightly different results.
It should be evident that being able to transform data from one experience to another (or even to integrate them) is a highly useful and valuable function. Just as two people’s ideas about a map are better than one, the integration of CAD with GIS data from Product A and Product B also has great merit.
Cartographic output is not solely about representation and then consumption before reaching an endpoint. Instead, it is about following that consumption with further integration, analysis and interpretation.
Have you ever thought that your data or cartography is not as useful or valuable until someone else’s data (in another format or otherwise) has been integrated with your own for further synthesis?
This is a good reason why products that enable this to happen, such as ETL tools and compression software, are much more valuable than we sometimes realise.
They help to link cartography to spatial analysis.
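The extract-transform-load idea can be sketched in a few lines. Here is a hypothetical example: point records exported as CSV from one product are reshaped into GeoJSON features that another product can consume (the field names and data are assumptions for illustration, not any vendor’s actual format):

```python
import csv
import io
import json

# Hypothetical CSV export from "Product A".
raw = """name,lon,lat
Cafe A,13.3889,52.5170
TV Tower,13.4094,52.5208
"""

# Extract: read the rows.
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: reshape each row into a GeoJSON feature for "Product B".
features = [
    {
        "type": "Feature",
        "geometry": {
            "type": "Point",
            "coordinates": [float(r["lon"]), float(r["lat"])],
        },
        "properties": {"name": r["name"]},
    }
    for r in rows
]

# Load: serialise the result as a FeatureCollection.
collection = {"type": "FeatureCollection", "features": features}
print(json.dumps(collection, indent=2))
```

Trivial as it looks, this reshaping step is exactly what lets one person’s cartography enter someone else’s analysis.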