DBpedia Commons: Structured Multimedia Metadata from the Wikimedia Commons

Cited by: 11
Authors
Vaidya, Gaurav [1 ]
Kontokostas, Dimitris [2 ]
Knuth, Magnus [3 ]
Lehmann, Jens [2 ]
Hellmann, Sebastian [2 ]
Affiliations
[1] Univ Colorado Boulder, Boulder, CO USA
[2] Univ Leipzig, Comp Sci, AKSW, D-04109 Leipzig, Germany
[3] Univ Potsdam, Hasso Plattner Inst, Potsdam, Germany
Source
SEMANTIC WEB - ISWC 2015, PT II | 2015, Vol. 9367
Keywords
Wikimedia commons; DBpedia; Multimedia; RDF;
DOI
10.1007/978-3-319-25010-6_17
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Code
140502 [Artificial Intelligence];
Abstract
The Wikimedia Commons is an online repository of over twenty-five million freely usable audio, video, and still image files, including scanned books, historically significant photographs, animal recordings, illustrative figures, and maps. Because these media files are volunteer-contributed, they carry varying amounts of descriptive metadata of varying accuracy. The DBpedia Information Extraction Framework can parse unstructured text from Wikipedia into semi-structured data and transform it into RDF for general use, but so far it has only been applied to encyclopedia-like content. In this paper, we describe the creation of the DBpedia Commons (DBc) dataset, achieved by extending the Extraction Framework to support knowledge extraction from the Wikimedia Commons as a media repository. To our knowledge, this is the first complete RDFization of the Wikimedia Commons and the largest media metadata RDF database in the LOD cloud.
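For readers who want to explore the published dataset, the sketch below shows one way to retrieve a few DBc media resources and their labels over SPARQL from Python. This is a minimal sketch, not code from the paper: the endpoint URL (http://commons.dbpedia.org/sparql) and the use of rdfs:label for file descriptions are assumptions and may differ from the deployed dataset.

    # Minimal sketch: query the DBpedia Commons (DBc) dataset over SPARQL.
    # Assumptions (not stated in this record): the public endpoint URL
    # below and the use of rdfs:label on media-file resources.
    from SPARQLWrapper import SPARQLWrapper, JSON

    endpoint = SPARQLWrapper("http://commons.dbpedia.org/sparql")  # assumed URL
    endpoint.setQuery("""
        PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
        SELECT ?file ?label WHERE {
            ?file rdfs:label ?label .
        }
        LIMIT 10
    """)
    endpoint.setReturnFormat(JSON)

    # Execute the query and print each resource URI with its label.
    results = endpoint.query().convert()
    for row in results["results"]["bindings"]:
        print(row["file"]["value"], "->", row["label"]["value"])

If the dataset is hosted at a different endpoint, only the URL needs to change; the query shape is standard SPARQL 1.1.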
Pages: 281-289
Page count: 9