Semantic Sensor Web

From Wikipedia, the free encyclopedia

The Semantic Sensor Web (SSW) is a marriage of sensor and Semantic Web technologies. The encoding of sensor descriptions and sensor observation data with Semantic Web languages enables more expressive representation, advanced access, and formal analysis of sensor resources. The SSW annotates sensor data with spatial, temporal, and thematic semantic metadata. This technique builds on current standardization efforts within the Open Geospatial Consortium's Sensor Web Enablement (SWE)[1][2] and extends them with Semantic Web technologies to provide enhanced descriptions and access to sensor data.[3]

Semantic modeling and annotation of sensor data

Ontologies and other semantic technologies can be key enabling technologies for sensor networks because they improve semantic interoperability and integration, and facilitate reasoning, classification, and other types of assurance and automation not included in the Open Geospatial Consortium (OGC) standards. A semantic sensor network allows the network, its sensors, and the resulting data to be organised, installed, managed, queried, understood, and controlled through high-level specifications. Sensor ontologies provide a framework for describing sensors: they support classification and reasoning over the capabilities and measurements of sensors and the provenance of measurements, and may allow reasoning about individual sensors as well as about a number of connected sensors acting together as a macroinstrument. The sensor ontologies, to some degree, reflect the OGC standards, so understanding how to map between the ontologies and the OGC models is an important consideration. Semantic annotation of sensor descriptions, and of services that support sensor data exchange and sensor network management, serves a purpose similar to that of semantic annotation of Web services. This research is conducted through the W3C Semantic Sensor Network Incubator Group (SSN-XG) activity.
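As a rough sketch of what such annotation looks like (the IRIs and predicate names below are illustrative placeholders, not the official SSN or OGC vocabulary terms), a single sensor reading can be enriched with spatial, temporal, thematic, and provenance metadata expressed as subject–predicate–object triples:

```python
from datetime import datetime, timezone

# One sensor observation annotated with semantic metadata, modeled as a
# set of subject-predicate-object triples. All identifiers are invented
# for illustration; a real deployment would use the SSN/OGC vocabularies.
OBS = "http://example.org/obs/42"

triples = {
    (OBS, "rdf:type", "ssn:Observation"),                # thematic: kind of resource
    (OBS, "ssn:observedProperty", "ex:AirTemperature"),  # thematic: measured phenomenon
    (OBS, "ssn:observedBy", "ex:thermometer-7"),         # provenance: producing sensor
    (OBS, "ex:hasValue", "21.5 Cel"),                    # the raw reading plus unit
    (OBS, "ex:hasLocation", "geo:39.78,-84.06"),         # spatial annotation
    (OBS, "ex:hasTime",                                  # temporal annotation
     datetime(2008, 6, 1, 12, 0, tzinfo=timezone.utc).isoformat()),
}

def describe(subject, triples):
    """Return every predicate/object pair asserted about a subject."""
    return {(p, o) for s, p, o in triples if s == subject}

# A query engine or reasoner can answer questions such as "which sensor
# produced this observation?" from the metadata alone.
producer = {o for p, o in describe(OBS, triples) if p == "ssn:observedBy"}
print(producer)
```

Because the metadata is explicit and machine-readable, the same triples can feed classification, provenance tracking, or reasoning without any knowledge of the sensor's native data format.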

W3C Semantic Sensor Networks

The World Wide Web Consortium (W3C) has initiated the Semantic Sensor Networks Incubator Group (SSN-XG) to develop the Semantic Sensor Network (SSN) ontology which can model sensor devices, systems, processes, and observations. The Incubator Group has now transitioned into the Semantic Sensor Networks Community Group.

The Semantic Sensor Network (SSN) ontology enables expressive representation of sensors, sensor observations, and knowledge of the environment. The SSN ontology is encoded in the Web Ontology Language (OWL) and has begun to achieve broad adoption within the sensors community. It is currently used by various organizations in academia, government, and industry for improved management of sensor data on the Web, including annotation, integration, publishing, and search.


Sensors around the globe currently collect avalanches of data about the world. The rapid development and deployment of sensor technology is intensifying the existing problem of too much data and not enough knowledge [1]. With a view to alleviating this glut, sensor data can be annotated with semantic metadata to increase interoperability between heterogeneous sensor networks, as well as to provide contextual information essential for situational awareness. Semantic Web techniques can greatly help with the problems of data integration and discovery, as they map between different metadata schemas in a structured way.
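To sketch how a structured mapping between metadata schemas aids integration (the field names and feeds below are invented for illustration), two heterogeneous sensor feeds can be normalized into one shared vocabulary before being queried together:

```python
# Two heterogeneous sensor feeds describing the same kind of reading with
# different field names and units (all names here are hypothetical).
feed_a = [{"temp_c": 20.1, "lat": 39.78, "lon": -84.06, "ts": "2008-06-01T12:00Z"}]
feed_b = [{"temperatureF": 69.8, "position": (39.76, -84.19), "time": "2008-06-01T12:05Z"}]

def from_feed_a(rec):
    """Map feed A's native schema onto the shared vocabulary."""
    return {"property": "AirTemperature", "value_c": rec["temp_c"],
            "location": (rec["lat"], rec["lon"]), "time": rec["ts"]}

def from_feed_b(rec):
    """Map feed B's schema onto the shared vocabulary, converting units."""
    return {"property": "AirTemperature",
            "value_c": round((rec["temperatureF"] - 32) * 5 / 9, 2),
            "location": rec["position"], "time": rec["time"]}

# Once both feeds share one schema, a single query spans both networks.
unified = [from_feed_a(r) for r in feed_a] + [from_feed_b(r) for r in feed_b]
temps = [r["value_c"] for r in unified if r["property"] == "AirTemperature"]
print(temps)  # both readings, now comparable in one unit
```

In the Semantic Sensor Web these mappings are captured declaratively in ontologies rather than hard-coded per feed, so new sensor networks can be integrated by publishing their alignment to the shared vocabulary instead of writing custom adapters.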

Real-time extension, sensor wiki enablement via "Sensing Cloud"

Additionally, a real-time extension of the Semantic Sensor Web concept, called the sensor wiki, is being developed. The motivation behind this concept is to allow real-time browsing of the physical world, consistent with the STT situational awareness goal. Understanding the physical world via a myriad of sensors is now possible. Browsing current physical reality, with each sensor acting as a real-time Web page, can help ordinary users and their neighborhoods become more aware of their situations; users can probe further into objects and related analytics of interest by tapping into relevant information stored online, in effect enabling collective, self-aware "intelligent neighborhoods". This can become the basis for collaborative, actionable situational awareness.

As affordable, IP-enabled sensor devices of different types become available, they are being placed throughout the environment; collectively, these are referred to as a "Sensing Cloud". Integrating the diverse sensory streams into the Web can serve different user or machine queries via the sensor wiki concept. The goal is to encourage people to contribute real-time "sensory" information, subject to privacy and security constraints. Intelligent mobile devices can act as hubs, sources, and sinks of such real-time streams.

In a sensor wiki, one or more sensors contribute real-time information as wiki pages, with themes and formats suited to prospective sensor wiki users. Users can look up information about objects, events, or places of interest interactively; they can also add intelligent STT interpretations of what they observe, use sensor tasking to improve the accuracy of the content, or even develop the overall scene to offer proactive situation assessment. Others might record such sensor streams and related information as part of a larger objective, such as future planning, training, or simply historical record keeping, and make them available to a specific community or individual.

Sports events and related sports medicine would serve as good examples to demonstrate this concept: players, fans, and all the supporting communities could participate and benefit.[citation needed]

Further reading

  • Michael Compton, Payam Barnaghi, Luis Bermudez, Raul Garcia-Castro, Oscar Corcho, Simon Cox, John Graybeal, Manfred Hauswirth, Cory Henson, Arthur Herzog, Vincent Huang, Krzysztof Janowicz, W. David Kelsey, Danh Le Phuoc, Laurent Lefort, Myriam Leggieri, Holger Neuhaus, Andriy Nikolov, Kevin Page, Alexandre Passant, Amit Sheth, Kerry Taylor. 'The SSN Ontology of the W3C Semantic Sensor Network Incubator Group.' Journal of Web Semantics, 2012.[2]
  • Lefort, L., Henson, C., Taylor, K., Barnaghi, P., Compton, M., Corcho, O., Garcia-Castro, R., Graybeal, J., Herzog, A., Janowicz, K., Neuhaus, H., Nikolov, A., and Page, K.: Semantic Sensor Network XG Final Report, W3C Incubator Group Report (2011). [3]
  • Amit Sheth, Cory Henson, and Satya Sahoo, "Semantic Sensor Web," IEEE Internet Computing, July/August 2008, p. 78-83. [4]
  • Manfred Hauswirth and Stefan Decker, "Semantic Reality - Connecting the Real and the Virtual World," Microsoft SemGrail Workshop, Redmond, Washington, June 21–22, 2007. [5]
  • Cory Henson, Josh Pschorr, Amit Sheth, and Krishnaprasad Thirunarayan, “SemSOS: Semantic Sensor Observation Service,” International Symposium on Collaborative Technologies and Systems (CTS2009), Workshop on Sensor Web Enablement (SWE2009), Baltimore, Maryland, 2009. [6]