Contributed by 1Spatial
14/04/2021 - 1Spatial
There are two main elements to consider in determining the utility value of data. The first is the direct impact when data quality falls below expectation, and the second is the lost opportunity cost, which, although harder to measure, can often be the more significant.
We can recognise data quality impact under a number of headings:
To avoid these impacts, it is necessary to put in place appropriate validation processes that allow us to assess, measure and control the quality of the data. Ideally, the results should also be checked by an external data audit to verify them and establish stakeholder confidence.
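The assess-and-measure step above can be sketched as a small rule-based check. This is a minimal illustration, not 1Spatial's product: the record fields (`geometry`, `postcode`) and the rules themselves are hypothetical examples of the kind of checks a validation process might apply.

```python
from typing import Callable, Dict, List

# A record is a simple dict of attribute values (hypothetical schema).
Record = Dict[str, object]
Rule = Callable[[Record], bool]

# Hypothetical quality rules: each maps a record to pass/fail.
rules: Dict[str, Rule] = {
    "has_geometry": lambda r: r.get("geometry") is not None,
    "has_postcode": lambda r: isinstance(r.get("postcode"), str)
                              and len(r["postcode"]) > 0,
}

def assess(records: List[Record]) -> Dict[str, float]:
    """Return the pass rate per rule: a measurable, repeatable quality score."""
    totals = {name: 0 for name in rules}
    for rec in records:
        for name, rule in rules.items():
            if rule(rec):
                totals[name] += 1
    n = max(len(records), 1)  # avoid division by zero on an empty dataset
    return {name: totals[name] / n for name in totals}
```

Scoring each rule separately, rather than a single overall pass/fail, is what makes quality something that can be tracked and improved over time rather than merely feared.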
The primary objective must be to maximise the recognised utility value of the information, based on reducing the negative impacts that are consequences of not delivering “fit-for-purpose” data.
Although it may be difficult to quantify the monetary value of spatial data quality, we can readily see the impact it has in our daily lives, for example:
In extreme situations, such as when emergency services are sent to the wrong location, or cannot reach the correct location because of incorrectly recorded infrastructure restrictions, accurate spatial data can be a matter of life or death.
The type of data that most people are used to dealing with is typically a record of a simple fact that is either right or wrong, such as a date of birth or a bank account balance. However, when we are working with spatial data representing real-world features, we are usually dealing with a simplified model that is intended to be used for particular activities. It may be fit for that specific purpose but totally inappropriate for another. For example, a building that is represented as a point location may be perfectly adequate for identifying a delivery point for a courier service, but it is completely unsuited to an analysis of the percentage of land area that has been built on in a town. Likewise, a parcel of land that is captured from low-resolution aerial photography and represented as a simplified polygon may be perfectly adequate for an analysis of land use in an environmental study, but it will be totally inadequate if it is to be used in the transaction for the sale or transfer of part of the land.
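The building example can be made concrete with a few lines of code. In this sketch (the coordinates and the rectangular footprints are invented for illustration), a polygon representation supports a built-area calculation via the standard shoelace formula, while a point representation carries no area at all, so the same analysis is simply impossible with it.

```python
from typing import List, Tuple

def polygon_area(vertices: List[Tuple[float, float]]) -> float:
    """Planar area of a simple (non-self-intersecting) polygon,
    computed with the shoelace formula."""
    area = 0.0
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]  # wrap around to close the ring
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

# Hypothetical data: a 20 x 10 building footprint inside a 100 x 50 town parcel.
footprint = [(0.0, 0.0), (20.0, 0.0), (20.0, 10.0), (0.0, 10.0)]
parcel = [(0.0, 0.0), (100.0, 0.0), (100.0, 50.0), (0.0, 50.0)]

built_fraction = polygon_area(footprint) / polygon_area(parcel)
# With a point representation of the building there is no geometry to measure,
# so no equivalent of built_fraction can be computed.
```

The same dataset is thus fit for one purpose (routing a courier to a point) and unfit for another (land-cover analysis), which is exactly the fitness-for-purpose distinction described above.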
Spatial data is particularly sensitive to two specific quality measures:
The richness and complexity of spatial data models representing real-world features demands that the quality measures ensuring fitness-for-purpose are comprehensively determined and that robust, trusted processes to verify compliance are put in place. This allows a transition from fearing that the data is not of high enough quality – and suffering the effects – to knowing the exact level of quality and being able to measure and improve it.
1Spatial’s core business is in making geospatially referenced data current, accessible, easily shared and trusted. We have over 40 years of experience as a global expert, uniquely focused on the modelling, processing, transformation, management, interoperability and maintenance of spatial data, all with an emphasis on data integrity, accuracy and ongoing quality assurance. We have provided spatial data management and production solutions to a wide range of international mapping and cadastral agencies, government, utilities and defence organisations across the world. This gives us unique experience in working with a broad variety of data (features, formats, structure, complexity, lifecycle, etc.) within an extensive range of enterprise-level system architectures.
Find out more by downloading our ‘Little Book of Spatial Data Quality’.