Experimental explorations on short text topic mining between LDA and NMF based Schemes

Cited by: 132
Authors
Chen, Yong [1 ,2 ,3 ]
Zhang, Hui [1 ,2 ,3 ]
Liu, Rui [1 ,3 ]
Ye, Zhiwen [1 ,3 ]
Lin, Jianying [1 ,3 ]
Affiliations
[1] Beihang Univ, State Key Lab Software Dev Environm, Beijing 100191, Peoples R China
[2] Beihang Univ, Beijing Adv Innovat Ctr Big Data & Brain Comp, Beijing 100191, Peoples R China
[3] Beihang Univ, Sch Comp Sci & Engn, Beijing 100191, Peoples R China
Keywords
Short text mining; Topic modeling; Latent dirichlet allocation (LDA); Non-negative matrix factorization (NMF); Knowledge-based learning;
DOI: 10.1016/j.knosys.2018.08.011
Chinese Library Classification (CLC): TP18 [Theory of Artificial Intelligence]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Learning topics from short texts has become a critical and fundamental task for understanding widely spread streaming social messages, e.g., tweets, snippets, and questions/answers. To date, there are two distinctive topic-learning schemes: generative probabilistic graphical models and linear-algebraic geometric approaches, with LDA and NMF as the representative works, respectively. Since both methods can uncover the latent topics hidden in unstructured short texts, natural questions arise: which one is better, and why? Are there other, more effective extensions? To gain insight into LDA- and NMF-based learning schemes, we conduct a comprehensive series of experiments in two parts. In the first part, the basic LDA and NMF are compared under different experimental settings on several public short-text datasets, showing that NMF tends to perform better than LDA. In the second part, we propose a novel model called "Knowledge-guided Non-negative Matrix Factorization for Better Short Text Topic Mining" (abbreviated as KGNMF), which leverages external knowledge as a semantic regularizer with low-rank formulations, yielding a time-efficient algorithm. Extensive experiments on three representative corpora against currently typical short-text topic models demonstrate the effectiveness of the proposed KGNMF. Overall, learning with NMF-based schemes is an effective alternative for short-text topic mining, in addition to the popular LDA-based paradigms. (C) 2018 Elsevier B.V. All rights reserved.
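As a minimal, hypothetical illustration of the NMF side of the comparison above (plain Lee-Seung multiplicative updates on a toy document-term matrix; this is a generic sketch, not the paper's KGNMF and not its experimental setup), the factorization X ≈ WH splits documents into a document-topic matrix W and a topic-term matrix H:

```python
import numpy as np

def nmf(X, k, n_iter=500, eps=1e-9, seed=0):
    """Basic NMF via Lee-Seung multiplicative updates,
    minimizing the Frobenius loss ||X - WH||_F^2."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    W = rng.random((n, k)) + eps   # document-topic weights
    H = rng.random((k, m)) + eps   # topic-term weights
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy document-term matrix: rows = 4 short documents, cols = 6 terms.
# Documents 0-1 use only terms 0-2, documents 2-3 only terms 3-5,
# so two latent "topics" should emerge.
X = np.array([
    [3, 2, 4, 0, 0, 0],
    [2, 3, 3, 0, 0, 0],
    [0, 0, 0, 4, 2, 3],
    [0, 0, 0, 3, 4, 2],
], dtype=float)

W, H = nmf(X, k=2)
# W @ H approximately reconstructs the block structure of X
print(np.round(W @ H, 1))
```

The KGNMF model described in the abstract would additionally add a knowledge-derived regularization term to this objective; the details belong to the paper itself.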
Pages: 1-13 (13 pages)