DBpedia
Revision as of 10:26, 6 May 2018
| | |
|---|---|
| Developer(s) | |
| Initial release | 10 January 2007 |
| Stable release | DBpedia 2016-10 / July 4, 2017 |
| Repository | |
| Written in | |
| Operating system | Virtuoso Universal Server |
| Type | |
| License | GNU General Public License |
| Website | dbpedia |
DBpedia (from "DB" for "database") is a project aiming to extract structured content from the information created in the Wikipedia project. This structured information is made available on the World Wide Web.[2] DBpedia allows users to semantically query relationships and properties of Wikipedia resources, including links to other related datasets.[3] Tim Berners-Lee described DBpedia as one of the most famous parts of the decentralized Linked Data effort.[4]
Background
The project was started by people at the Free University of Berlin and Leipzig University, in collaboration with OpenLink Software,[5] and the first publicly available dataset was published in 2007. It is made available under free licences (CC BY-SA), allowing others to reuse the dataset; it does not, however, use an open-data licence that would waive the sui generis database rights.
Wikipedia articles consist mostly of free text, but also include structured information embedded in the articles, such as "infobox" tables (the pull-out panels that appear in the top right of the default view of many Wikipedia articles, or at the start of the mobile versions), categorisation information, images, geo-coordinates and links to external Web pages. This structured information is extracted and put in a uniform dataset which can be queried.
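As a rough sketch of what this extraction produces (the class and property names below are simplified stand-ins; the exact URIs in the DBpedia ontology differ), the infobox and title of a city article might yield RDF triples such as:

```turtle
@prefix dbr:  <http://dbpedia.org/resource/> .
@prefix dbo:  <http://dbpedia.org/ontology/> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .

dbr:Berlin  a           dbo:City ;          # from the infobox template type
            rdfs:label  "Berlin"@en ;       # from the article title
            dbo:country dbr:Germany .       # from an infobox row
```

Each article becomes a resource URI, and each recognisable infobox field becomes a property of that resource, which is what makes the resulting dataset uniformly queryable.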
Dataset
The DBpedia data set describes[when?] 4.58 million entities, out of which 4.22 million are classified in a consistent ontology, including 1,445,000 persons, 735,000 places, 123,000 music albums, 87,000 films, 19,000 video games, 241,000 organizations, 251,000 species and 6,000 diseases.[6] The data set features labels and abstracts for these entities in up to 125 languages; 25.2 million links to images and 29.8 million links to external web pages. In addition, it contains around 50 million links to other RDF datasets, 80.9 million links to Wikipedia categories, and 41.2 million YAGO2 categories.[6] DBpedia uses the Resource Description Framework (RDF) to represent extracted information and consists of 3 billion RDF triples, of which 580 million were extracted from the English edition of Wikipedia and 2.46 billion from other language editions.[6]
From this data set, information spread across multiple pages can be combined; for example, book authorship can be assembled from the pages about the work and about the author.[further explanation needed]
One of the challenges in extracting information from Wikipedia is that the same concept can be expressed using different parameters in infoboxes and other templates, such as |birthplace= and |placeofbirth=. Because of this, queries about where people were born would have to search for both of these properties in order to get more complete results. The DBpedia Mapping Language has therefore been developed to help map these properties to an ontology while reducing the number of synonyms. Due to the large diversity of infoboxes and properties in use on Wikipedia, the process of developing and improving these mappings has been opened to public contributions.[7]
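The duplication described above can be made concrete with a sketch query (illustrative only, using the raw infobox-property namespace): without mappings, a query for birthplaces has to check both parameters explicitly.

```sparql
PREFIX dbprop: <http://dbpedia.org/property/>
SELECT ?person ?place WHERE {
  { ?person dbprop:birthplace   ?place }
  UNION
  { ?person dbprop:placeofbirth ?place }
}
```

With the mappings in place, both raw parameters resolve to a single ontology property, so a single triple pattern suffices.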
Version 2014 was released in September 2014.[8] The main change from previous versions was in how abstract texts were extracted: running a local mirror of Wikipedia and retrieving rendered abstracts from it made the extracted texts considerably cleaner. A new data set extracted from Wikimedia Commons was also introduced.
DBpedia is now one of the most prominent components of the Linked Open Data (LOD) cloud.[9]
Examples
DBpedia extracts factual information from Wikipedia pages, allowing users to find answers to questions where the information is spread across multiple Wikipedia articles. Data is accessed using an SQL-like query language for RDF called SPARQL. For example, imagine you were interested in the Japanese shōjo manga series Tokyo Mew Mew, and wanted to find the genres of other works written by its illustrator. DBpedia combines information from Wikipedia's entries on Tokyo Mew Mew, Mia Ikumi and on works such as Super Doll Licca-chan and Koi Cupid. Since DBpedia normalises information into a single database, the following query can be asked without needing to know exactly which entry carries each fragment of information, and will list related genres:
PREFIX dbprop: <http://dbpedia.org/property/>
PREFIX db: <http://dbpedia.org/resource/>
SELECT ?who ?WORK ?genre WHERE {
  db:Tokyo_Mew_Mew dbprop:author ?who .
  ?WORK dbprop:author ?who .
  OPTIONAL { ?WORK dbprop:genre ?genre } .
}
Use cases
DBpedia has a broad scope of entities covering different areas of human knowledge. This makes it a natural hub for connecting datasets, where external datasets could link to its concepts.[10] The DBpedia dataset is interlinked on the RDF level with various other Open Data datasets on the Web. This enables applications to enrich DBpedia data with data from these datasets. As of September 2013[update], there are more than 45 million interlinks between DBpedia and external datasets including: Freebase, OpenCyc, UMBEL, GeoNames, MusicBrainz, CIA World Fact Book, DBLP, Project Gutenberg, DBtune Jamendo, Eurostat, UniProt, Bio2RDF, and US Census data.[11][12] The Thomson Reuters initiative OpenCalais, the Linked Open Data project of the New York Times, the Zemanta API and DBpedia Spotlight also include links to DBpedia.[13][14][15] The BBC uses DBpedia to help organize its content.[16][17] Faviki uses DBpedia for semantic tagging.[18] Samsung also includes DBpedia in its "Knowledge Sharing Platform".
Such a rich source of structured cross-domain knowledge is fertile ground for artificial intelligence systems. DBpedia was used as one of the knowledge sources in IBM Watson's Jeopardy!-winning system.[19]
Amazon provides a DBpedia Public Data Set that can be integrated into Amazon Web Services applications.[20]
The semantic structure of DBpedia, together with quality metrics, can help in building methods for the automatic enrichment of less-developed language versions of Wikipedia.[21]
DBpedia Spotlight
DBpedia Spotlight is a tool for annotating mentions of DBpedia resources in text. This allows linking unstructured information sources to the Linked Open Data cloud through DBpedia. DBpedia Spotlight performs named entity extraction, including entity detection and name resolution (in other words, disambiguation). It can also be used for named entity recognition, and other information extraction tasks. DBpedia Spotlight aims to be customizable for many use cases. Instead of focusing on a few entity types, the project strives to support the annotation of all 3.5 million entities and concepts from more than 320 classes in DBpedia. The project started in June 2010 at the Web Based Systems Group at the Free University of Berlin.
DBpedia Spotlight is publicly available as a web service for testing and a Java/Scala API licensed via the Apache License. The DBpedia Spotlight distribution includes a jQuery plugin that allows developers to annotate pages anywhere on the Web by adding one line to their page.[22] Clients are also available in Java or PHP.[23] The tool handles various languages through its demo page[24] and web services. Internationalization is supported for any language that has a Wikipedia edition.[25]
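As an illustration of consuming Spotlight output, the sketch below parses a response in the JSON shape the annotation web service returns; the payload is a canned sample rather than a live call, and the field names follow Spotlight's @URI/@surfaceForm convention, so treat the exact structure as an assumption to check against the service you deploy.

```python
import json

# A canned response in the shape returned by the DBpedia Spotlight
# annotation endpoint (application/json). This is a local sample for
# illustration, not the result of a live web-service call.
sample = """{
  "@text": "Berlin is the capital of Germany.",
  "Resources": [
    {"@URI": "http://dbpedia.org/resource/Berlin",
     "@surfaceForm": "Berlin", "@similarityScore": "0.99"},
    {"@URI": "http://dbpedia.org/resource/Germany",
     "@surfaceForm": "Germany", "@similarityScore": "0.98"}
  ]
}"""

def extract_annotations(payload: str) -> dict:
    """Map each annotated surface form in the text to its DBpedia URI."""
    data = json.loads(payload)
    return {r["@surfaceForm"]: r["@URI"] for r in data.get("Resources", [])}

print(extract_annotations(sample))
```

The resulting mapping is what links the unstructured input text into the Linked Open Data cloud: each surface form is resolved to a DBpedia resource that other datasets also reference.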
See also
References
- ^ "Dbpedia.org on Alexa". Alexa Internet. Amazon.com. Retrieved 7 September 2016.
- ^ Bizer, Christian; Lehmann, Jens; Kobilarov, Georgi; Auer, Soren; Becker, Christian; Cyganiak, Richard; Hellmann, Sebastian (September 2009). "DBpedia - A crystallization point for the Web of Data" (PDF). Web Semantics: Science, Services and Agents on the World Wide Web. 7 (3): 154–165. doi:10.1016/j.websem.2009.07.002. ISSN 1570-8268.
- ^ "Komplett verlinkt — Linked Data" (in German). 3sat. 19 June 2009. Retrieved 10 November 2009.
- ^ "Sir Tim Berners-Lee Talks with Talis about the Semantic Web". Talis. 7 February 2008. Archived from the original on 10 May 2013.
- ^ "Credits". DBpedia. Archived from the original on 21 September 2014. Retrieved 9 September 2014.
- ^ a b c "DBpedia Version 2014 released". DBpedia. Retrieved 9 September 2014.
- ^ "DBpedia Mappings". mappings.dbpedia.org. Retrieved 3 April 2010.
- ^ "Changelog". DBpedia. September 2014. Retrieved 9 September 2014.
- ^ Lewoniewski, Włodzimierz (18 October 2017). "Enrichment of Information in Multilingual Wikipedia Based on Quality Analysis". Lecture Notes in Business Information Processing. 303: 216–227. doi:10.1007/978-3-319-69023-0_19. Retrieved 5 May 2018.
- ^ E. Curry, A. Freitas, and S. O’Riáin, "The Role of Community-Driven Data Curation for Enterprises," Archived 23 January 2012 at the Wayback Machine in Linking Enterprise Data, D. Wood, Ed. Boston, MA: Springer US, 2010, pp. 25-47.
- ^ "Statistics on links between Data sets", SWEO Community Project: Linking Open Data on the Semantic Web, W3C, retrieved 24 November 2009
- ^ "Statistics on Data sets", SWEO Community Project: Linking Open Data on the Semantic Web, W3C, retrieved 24 November 2009
- ^ Sandhaus, Evan; Larson, Rob (29 October 2009). "First 5,000 Tags Released to the Linked Data Cloud". NY Times Blogs. Retrieved 10 November 2009.
- ^ "Life in the Linked Data Cloud". www.opencalais.com. Archived from the original on 24 November 2009. Retrieved 10 November 2009. "Wikipedia has a Linked Data twin called DBpedia. DBpedia has the same structured information as Wikipedia – but translated into a machine-readable format."
- ^ "Zemanta talks Linked Data with SDK and commercial API". blogs.zdnet.com. Archived from the original on 28 February 2010. Retrieved 10 November 2009. "Zemanta fully supports the Linking Open Data initiative. It is the first API that returns disambiguated entities linked to dbPedia, Freebase, MusicBrainz, and Semantic Crunchbase."
- ^ "European Semantic Web Conference 2009 - Georgi Kobilarov, Tom Scott, Yves Raimond, Silver Oliver, Chris Sizemore, Michael Smethurst, Christian Bizer and Robert Lee. Media meets Semantic Web - How the BBC uses DBpedia and Linked Data to make Connections". www.eswc2009.org. Archived from the original on 8 June 2009. Retrieved 10 November 2009.
- ^ "BBC Learning - Open Lab - Reference". bbc.co.uk. Archived from the original on 25 August 2009. Retrieved 10 November 2009. "Dbpedia is a database version of Wikipedia. It is used in a lot of projects for a wide range of different reasons. At the BBC we are using it for tagging content."
- ^ "Semantic Tagging with Faviki". www.readwriteweb.com. Archived from the original on 29 January 2010.
- ^ David Ferrucci, Eric Brown, Jennifer Chu-Carroll, James Fan, David Gondek, Aditya A. Kalyanpur, Adam Lally, J. William Murdock, Eric Nyberg, John Prager, Nico Schlaefer, and Chris Welty. "Building Watson: An Overview of the DeepQA Project". AI Magazine, Fall 2010. Association for the Advancement of Artificial Intelligence (AAAI).
- ^ "Amazon Web Services Developer Community : DBpedia". developer.amazonwebservices.com. Retrieved 10 November 2009.
- ^ Lewoniewski, Włodzimierz; Węcel, Krzysztof; Abramowicz, Witold (8 December 2017). "Relative Quality and Popularity Evaluation of Multilingual Wikipedia Articles". Informatics. 4 (4). doi:10.3390/informatics4040043. Retrieved 5 May 2018.
- ^ Mendes, Pablo. "DBpedia Spotlight jQuery Plugin". jQuery Plugins. Retrieved 15 September 2011.
- ^ DiCiuccio, Rob. "PHP Client for DBpedia Spotlight". GitHub.
- ^ "Demo of DBpedia Spotlight". Retrieved 8 September 2013.
- ^ "Internationalization of DBpedia Spotlight". Retrieved 8 September 2013.
External links
- Official website
- TED Talks video (Adobe Flash) about the semantic web by Tim Berners-Lee, presenting DBpedia as an example, at TED
- DBpedia - Extracting structured data from Wikipedia and LinkedGeodata, Wikimania 2009 talks about the DBpedia project.
- DBpedia: Querying Wikipedia like a Database - Chris Bizer, World Wide Web Conference Developers Track, 11 May 2007
- W3C SWEO Linking Open Data Community Project