Type-Aware Question Answering over Knowledge Base with Attention-Based Tree-Structured Neural Networks

Cited by: 17
Authors
Yin, Jun [1 ]
Zhao, Wayne Xin [2 ,3 ]
Li, Xiao-Ming [1 ]
Affiliations
[1] Peking Univ, Sch Elect Engn & Comp Sci, Beijing 100871, Peoples R China
[2] Renmin Univ China, Sch Informat, Beijing 100872, Peoples R China
[3] Guangdong Key Lab Big Data Anal & Proc, Guangzhou 510006, Guangdong, Peoples R China
Funding
Beijing Natural Science Foundation; National Natural Science Foundation of China;
Keywords
question answering; deep neural network; knowledge base;
DOI
10.1007/s11390-017-1761-8
Chinese Library Classification (CLC)
TP3 [Computing Technology, Computer Technology];
Subject Classification Code
080201 [Mechanical Manufacturing and Automation];
Abstract
Question answering (QA) over knowledge base (KB) aims to provide a structured answer from a knowledge base to a natural language question. In this task, a key step is representing and understanding the natural language query. In this paper, we propose to use tree-structured neural networks built over the constituency tree to model natural language queries. We make an interesting observation about the constituency tree: different constituents have their own semantic characteristics and might be suitable for solving different subtasks in a QA system. Based on this observation, we incorporate the type information as an auxiliary supervision signal to improve QA performance. We call our approach type-aware QA. We jointly characterize both the answer and its answer type in a unified neural network model with the attention mechanism. Instead of simply using the root representation, we represent the query by combining the representations of different constituents using task-specific attention weights. Extensive experiments on public datasets have demonstrated the effectiveness of our proposed model. More specifically, the learned attention weights are quite useful in understanding the query. The produced representations for intermediate nodes can be used for analyzing the effectiveness of components in a QA system.
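The abstract's key mechanism, combining the representations of different constituents with task-specific attention weights instead of using only the root representation, can be sketched as follows. This is a minimal illustration, not the paper's exact model: the function and variable names (`attention_pool`, `task_vector`) are hypothetical, the constituent vectors would in practice come from a tree-structured encoder over the constituency parse, and the task vector would be a learned parameter, one per subtask.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_pool(node_reprs, task_vector):
    """Combine constituent representations with task-specific attention.

    node_reprs:  (n_constituents, d) vectors produced by a tree-structured
                 encoder, one per constituency-tree node (hypothetical input).
    task_vector: (d,) learned query vector for one subtask (hypothetical name).
    Returns the pooled query representation and the attention weights.
    """
    scores = node_reprs @ task_vector   # relevance of each constituent to the task
    weights = softmax(scores)           # task-specific attention distribution
    return weights @ node_reprs, weights

# Toy example: three constituent vectors of dimension 4.
H = np.array([[1., 0., 0., 0.],
              [0., 1., 0., 0.],
              [0., 0., 1., 0.]])
w = np.array([2., 0., 0., 0.])          # this task attends mostly to node 0
q, a = attention_pool(H, w)
```

With a separate task vector per subtask (e.g. answer prediction vs. answer-type prediction), each subtask attends to different constituents of the same parsed query, which is how the learned weights become interpretable.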
Pages: 805-813 (9 pages)