An Inherently Nonnegative Latent Factor Model for High-Dimensional and Sparse Matrices from Industrial Applications

Cited by: 152
Authors
Luo, Xin [1 ]
Zhou, MengChu [2 ,3 ]
Li, Shuai [4 ]
Shang, MingSheng [1 ]
Affiliations
[1] Chinese Acad Sci, Chongqing Inst Green & Intelligent Technol, Chongqing Key Lab Big Data & Intelligent Comp, Chongqing 400714, Peoples R China
[2] Macau Univ Sci & Technol, Inst Syst Engn, Macau 999078, Peoples R China
[3] New Jersey Inst Technol, Dept Elect & Comp Engn, Newark, NJ 07102 USA
[4] Hong Kong Polytech Univ, Dept Comp, Hong Kong, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Big data; high-dimensional and sparse matrix; learning algorithms; missing-data estimation; nonnegative latent factor analysis; optimization methods; recommender system; SUBGRADIENT METHODS; FACTORIZATION; RECOMMENDER; ALGORITHM; CONVERGENCE; SYSTEMS;
DOI
10.1109/TII.2017.2766528
Chinese Library Classification (CLC) number
TP [Automation and Computer Technology];
Discipline classification code
0812;
Abstract
High-dimensional and sparse (HiDS) matrices are commonly encountered in many big-data-related and industrial applications like recommender systems. When acquiring useful patterns from them, nonnegative matrix factorization (NMF) models have proven to be highly effective owing to their fine representativeness of the nonnegative data. However, current NMF techniques suffer from: 1) inefficiency in addressing HiDS matrices; and 2) constraints in their training schemes. To address these issues, this paper proposes to extract nonnegative latent factors (NLFs) from HiDS matrices via a novel inherently NLF (INLF) model. It bridges the output factors and decision variables via a single-element-dependent mapping function, thereby making the parameter training unconstrained and compatible with general training schemes on the premise of maintaining the nonnegativity constraints. Experimental results on six HiDS matrices arising from industrial applications indicate that INLF is able to acquire NLFs from them more efficiently than any existing method does.
Pages: 2011-2022
Number of pages: 12
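
The abstract above outlines the key idea: the nonnegative latent factors are not trained directly; instead they are obtained from unconstrained decision variables through a single-element-dependent nonnegative mapping, so ordinary unconstrained training schemes apply while nonnegativity is preserved. The Python sketch below illustrates that idea under stated assumptions and is not the paper's actual algorithm: the squared mapping x = u^2, the plain per-entry SGD updates, and the name train_inlf_sketch are illustrative choices, not taken from this record.

import numpy as np

def train_inlf_sketch(entries, shape, rank=8, lr=0.01, epochs=50, seed=0):
    # entries: iterable of (row, col, value) triples for the observed cells
    # of a high-dimensional and sparse (HiDS) matrix; shape = (rows, cols).
    rng = np.random.default_rng(seed)
    n_rows, n_cols = shape
    U = rng.standard_normal((n_rows, rank)) * 0.1  # unconstrained decision variables
    V = rng.standard_normal((n_cols, rank)) * 0.1
    for _ in range(epochs):
        for i, j, r in entries:
            x, y = U[i] ** 2, V[j] ** 2   # nonnegative factors via the (assumed) squared mapping
            err = r - x @ y               # residual on this observed entry only
            # Gradient step on 0.5 * err**2, chained through x = U[i]**2 and y = V[j]**2
            U[i] += lr * err * y * 2 * U[i]
            V[j] += lr * err * x * 2 * V[j]
    return U ** 2, V ** 2                 # nonnegative latent factor matrices

# Toy usage: a 4 x 3 matrix with three observed entries.
obs = [(0, 0, 5.0), (1, 2, 3.0), (3, 1, 4.0)]
X, Y = train_inlf_sketch(obs, shape=(4, 3), rank=2)
print(np.round(X @ Y.T, 2))

The only design point the sketch tries to capture is that U and V are updated without any projection or nonnegativity clamp; nonnegativity of the returned factors comes entirely from the element-wise mapping, which is what makes the training unconstrained.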