"From simple visualizations to sophisticated interactive tools, there is a growing reliance on data. Location information, or spatial data, is often a common thread running through such data; describing how things are positioned relative to the Earth in terms of coordinates and/or topology." – Spatial Data on the Web (WWW2017)
Spatial Data on the Web Best Practices
The Spatial Data on the Web Best Practices document asks:
- What problem is this working group trying to solve?
- What is the quality of available spatial data?
Sample search: a Google search for "beaches in the United Kingdom". Does this search return the relevant spatial data?
The INSPIRE Geoportal has better information on spatial data for the UK coastline.
INSPIRE and Google are two different protocols and resources, so the results differ.
The geospatial industry has developed its own web services to publish location information.
Data is implicit and unstructured. Is data really "on the web" if you can't find it via a search engine?
Most web content about places and location is unstructured. Harvesting it requires sophisticated Natural Language Processing and inference; it does not scale.
Linked Open Data
The Open Geospatial Consortium (OGC) and the W3C joined forces to develop Linked Geospatial Data. Open questions:
- how should we encode geometry?
- how and where should we implement topological functions?
- geometries expressed as WKT literals are large objects
- 2D or 3D
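To make the WKT-literal concern concrete, here is a minimal sketch of how a geometry is typically wrapped as a GeoSPARQL `wktLiteral` (CRS URI first, then the WKT text). The polygon coordinates are an illustrative assumption; real coastline geometries run to megabytes, which is why such literals become unwieldy.

```python
# Sketch: serializing a geometry as a GeoSPARQL wktLiteral string.
# The polygon below is a made-up example, not real data.

GEOSPARQL_WKT = "http://www.opengis.net/ont/geosparql#wktLiteral"
CRS84 = "http://www.opengis.net/def/crs/OGC/1.3/CRS84"

def wkt_literal(wkt: str, crs: str = CRS84) -> str:
    """Wrap a WKT geometry as a GeoSPARQL wktLiteral (CRS URI prefix)."""
    return f'"<{crs}> {wkt}"^^<{GEOSPARQL_WKT}>'

# A tiny 2D polygon; a real coastline would have millions of vertices.
polygon = "POLYGON((0 0, 4 0, 4 3, 0 3, 0 0))"
print(wkt_literal(polygon))
```

Because the whole geometry is a single literal, a triple store must ship and parse the entire string even when a query only needs a bounding box, which is the "large objects" problem noted above.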
Spatial Data on the Web Working Group
The mission of the Spatial Data on the Web Working Group is to clarify and formalize the relevant standards landscape. In particular:
- to determine how spatial information can best be integrated with other data on the Web;
- to determine how machines and people can discover that different facts in different datasets relate to the same place, especially when ‘place’ is expressed in different ways and at different levels of granularity;
- to identify and assess existing methods and tools and then create a set of best practices for their use; where desirable, to complete the standardization of informal technologies already in widespread use.
Best Practices Summary
- Best Practice 1: Provide metadata
- Best Practice 2: Provide descriptive metadata
- Best Practice 3: Provide structural metadata
- Best Practice 4: Provide data license information
- Best Practice 5: Provide data provenance information
- Best Practice 6: Provide data quality information
- Best Practice 7: Provide a version indicator
- Best Practice 8: Provide version history
- Best Practice 9: Use persistent URIs as identifiers of datasets
- Best Practice 10: Use persistent URIs as identifiers within datasets
- Best Practice 11: Assign URIs to dataset versions and series
- Best Practice 12: Use machine-readable standardized data formats
- Best Practice 13: Use locale-neutral data representations
- Best Practice 14: Provide data in multiple formats
- Best Practice 15: Reuse vocabularies, preferably standardized ones
- Best Practice 16: Choose the right formalization level
- Best Practice 17: Provide bulk download
- Best Practice 18: Provide Subsets for Large Datasets
- Best Practice 19: Use content negotiation for serving data available in multiple formats
- Best Practice 20: Provide real-time access
- Best Practice 21: Provide data up to date
- Best Practice 22: Provide an explanation for data that is not available
- Best Practice 23: Make data available through an API
- Best Practice 24: Use Web Standards as the foundation of APIs
- Best Practice 25: Provide complete documentation for your API
- Best Practice 26: Avoid Breaking Changes to Your API
- Best Practice 27: Preserve identifiers
- Best Practice 28: Assess dataset coverage
- Best Practice 29: Gather feedback from data consumers
- Best Practice 30: Make feedback available
- Best Practice 31: Enrich data by generating new data
- Best Practice 32: Provide Complementary Presentations
- Best Practice 33: Provide Feedback to the Original Publisher
- Best Practice 34: Follow Licensing Terms
- Best Practice 35: Cite the Original Publication
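Best Practice 19 (content negotiation) is easy to sketch: the server inspects the client's `Accept` header and picks the best representation it can offer. The simplified q-value parsing and the media-type list below are my own assumptions, not part of the best-practices document.

```python
# Sketch of server-side content negotiation (cf. Best Practice 19).
# Parsing here is deliberately simplified: no wildcards like text/*,
# and ties are broken by header order.

def negotiate(accept_header: str, available: list[str]) -> str:
    """Return the first available media type acceptable to the client."""
    offered = []
    for part in accept_header.split(","):
        media = part.split(";")[0].strip()
        q = 1.0
        for param in part.split(";")[1:]:
            key, _, val = param.strip().partition("=")
            if key == "q":
                try:
                    q = float(val)
                except ValueError:
                    pass
        offered.append((media, q))
    offered.sort(key=lambda m: -m[1])  # highest q-value first (stable sort)
    for media, q in offered:
        if q <= 0:
            continue
        if media in available:
            return media
        if media == "*/*":
            return available[0]
    return available[0]  # fall back to a default representation

# Client prefers GeoJSON (implicit q=1.0) over HTML (q=0.8):
print(negotiate("text/html;q=0.8, application/geo+json",
                ["text/html", "application/geo+json"]))
```

The same dataset URI can thus serve HTML to a browser and GeoJSON or Turtle to a machine client, without minting separate URIs per format.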
Best practices for Open Spatial Data (some have been omitted here)
- Best Practice 1: Include spatial metadata in dataset metadata
- Best Practice 3: Choose the coordinate reference system to suit your user’s applications
- Best Practice 4: Make your spatial data indexable by search engines
- Best Practice 5: Describe the positional accuracy of spatial data
- Best Practice 6: Describe properties that change over time
- Best Practice 7: Use globally unique persistent HTTP URIs for spatial things
- Best Practice 8: Provide geometries on the Web in a usable way
- Best Practice 9: Describe relative positioning
- Best Practice 10: Encoding spatial data
- Best Practice 11: Expose spatial data through ‘convenience APIs’
- Best Practice 14: Publish links between spatial things and related resources
- Best Practice 17: State how coordinate values are encoded
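Best Practices 8 and 17 above fit together: publish geometries in a widely understood format and be explicit about how coordinate values are encoded. GeoJSON (RFC 7946) is a common choice because it fixes the CRS to WGS 84 with [longitude, latitude] axis order, so naming the format already states the encoding. The feature below (a point near Wellington, NZ) is an illustrative assumption.

```python
import json

# Sketch: a GeoJSON Feature whose coordinate encoding is unambiguous
# because RFC 7946 mandates WGS 84 and [lon, lat] axis order.
# The point and its properties are made up for illustration.

feature = {
    "type": "Feature",
    "geometry": {
        "type": "Point",
        "coordinates": [174.7762, -41.2865],  # [lon, lat], per RFC 7946
    },
    "properties": {"name": "Wellington"},
}

print(json.dumps(feature, sort_keys=True))
```

Embedding such a structure in a web page (e.g. as schema.org/JSON-LD markup) is one route to Best Practice 4, making spatial data indexable by search engines.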
Sensors, Satellites and Linking the Earth
Kerry Taylor (Australian National University, University of Surrey)
Semantic Sensor Networks (SSN)
SSN Ontology: this ontology describes sensors and observations, and related concepts. It does not describe domain concepts, time, locations, etc.; these are intended to be included from other ontologies via OWL imports. The ontology was developed by the W3C Semantic Sensor Networks Incubator Group (SSN-XG).
Semantic sensor networks cover: the capabilities of sensors, measuring capability, sensors in systems, the act and method of sensing, the results of sensing, and observations.
SOSA a.k.a. SSN Core
- Actuator and actuation
- Observation and its result
- Samples and sampling
- A little about sensors
- No inference, just assertions
The full SSN ontology adds:
- sensing and actuating capabilities
- properties of measurements
- systems of sensors and actuators
- operating conditions, survival conditions
- sensor behavior
- typing constraints and class relations
- supports inference to fill gaps
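A minimal SOSA observation, hand-serialized as Turtle, shows how little the core vocabulary demands: a sensor, an observed property, a result, and a timestamp. The IRIs below are hypothetical; the `sosa:` terms (Observation, madeBySensor, observedProperty, hasSimpleResult, resultTime) are from the W3C SOSA vocabulary.

```python
# Sketch: building a Turtle snippet for a SOSA observation by string
# assembly (a real system would use an RDF library). All example.org
# IRIs are hypothetical placeholders.

def sosa_observation(obs_iri: str, sensor_iri: str, prop_iri: str,
                     result: str, when: str) -> str:
    """Return a Turtle description of one sosa:Observation."""
    return "\n".join([
        "@prefix sosa: <http://www.w3.org/ns/sosa/> .",
        "@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .",
        "",
        f"<{obs_iri}> a sosa:Observation ;",
        f"    sosa:madeBySensor <{sensor_iri}> ;",
        f"    sosa:observedProperty <{prop_iri}> ;",
        f'    sosa:hasSimpleResult "{result}"^^xsd:decimal ;',
        f'    sosa:resultTime "{when}"^^xsd:dateTime .',
    ])

print(sosa_observation(
    "http://example.org/obs/1",
    "http://example.org/sensor/therm-7",
    "http://example.org/prop/airTemperature",
    "21.4",
    "2017-04-03T09:00:00Z",
))
```

Note that nothing here requires inference: the assertions stand on their own, which is exactly the "SOSA a.k.a. SSN Core" design point above.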
The giant size of satellite data poses problems:
- granularity of tiles, not pixels
- smallish metadata served through an ordinary triple store (linked data)
- huge raster data
- specialized query processing
- simple optimisations applied
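The "tiles, not pixels" point can be made concrete with tiling math: metadata in the triple store describes whole tiles, and a query first maps a location to a tile index. The slippy-map scheme below (Web Mercator, as popularized by OpenStreetMap) is a stand-in assumption for whatever tiling a given satellite archive actually uses.

```python
import math

# Sketch: mapping a WGS 84 lon/lat to slippy-map (x, y) tile indices.
# This illustrates tile granularity; the real archive's tiling scheme
# may differ (this one is the standard OSM/Web Mercator formula).

def lonlat_to_tile(lon: float, lat: float, zoom: int) -> tuple[int, int]:
    """Return (x, y) tile indices for a lon/lat at a given zoom level."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

# At zoom 10 one tile spans roughly 40 km of ground, so the triple
# store only needs a record per tile, not per pixel.
print(lonlat_to_tile(149.13, -35.28, 10))  # a point near Canberra
```

Serving "smallish metadata" per tile through an ordinary triple store, while the raster payload stays in specialized storage, is what keeps the linked-data layer tractable.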
Australia’s distributed national spatial dataset production systems and community
by Nicholas Car (Geoscience Australia)
ANZLIC: the Spatial Information Council is the peak intergovernmental organisation providing leadership in the collection, management and use of spatial information in Australia and New Zealand.
The FSDF (Foundation Spatial Data Framework) is a federal and state initiative to streamline the production of national spatial data products.
- for the Dept. Prime Minister and Cabinet
- coordinated by Geoscience Australia
FSDF is not:
- an awesome tool. It is not "smart", "machine learning", etc.
- a fast, non-permanent "initiative". It is building a foundation
- run by one agency, nor does it achieve success via one channel
Introducing NZ to a new paradigm for spatial data
Byron Cochrane (Department of Internal Affairs, New Zealand Government)
What is special about New Zealand
- New Zealand is rather isolated and tends to build its own solutions.
- It is geologically very unstable, with frequent earthquakes,
- so the alignment of datasets is continually changing.
What is special about spatial data
- the better you know it, the better you can use it
- it’s popular
Old paradigm of spatial data
- kicked off in the ’90s
- centrally managed and tightly controlled
- geospatial data engineers wanted a lot of access and data storage.
- geo data was walled off by IT specialists.
- Communication became easier with other geospatial data developers than with the IT systems that had isolated them.
- spatial data was put into relational databases, but these were not very effective
- a custom front end was created for the spatial data, but this was fragile and exclusive.
- IT systems began connecting to the web, but geospatial data had to take its own path to exposing data
- Spatial data was exposed via XML and SOAP
- Move towards the Linked Data space.
- To do this, it was necessary to reach out to the global community.
- Use modern standards
- integrate linked data best practices