Unsupervised Learning by Probabilistic Latent Semantic Analysis

Cited by: 198
Author
Thomas Hofmann
Affiliation
[1] Brown University, Department of Computer Science
Source
Machine Learning | 2001 / Vol. 42
Keywords
unsupervised learning; latent class models; mixture models; dimension reduction; EM algorithm; information retrieval; natural language processing; language modeling
DOI
Not available
Abstract
This paper presents a novel statistical method for factor analysis of binary and count data which is closely related to a technique known as Latent Semantic Analysis. In contrast to the latter method which stems from linear algebra and performs a Singular Value Decomposition of co-occurrence tables, the proposed technique uses a generative latent class model to perform a probabilistic mixture decomposition. This results in a more principled approach with a solid foundation in statistical inference. More precisely, we propose to make use of a temperature controlled version of the Expectation Maximization algorithm for model fitting, which has shown excellent performance in practice. Probabilistic Latent Semantic Analysis has many applications, most prominently in information retrieval, natural language processing, machine learning from text, and in related areas. The paper presents perplexity results for different types of text and linguistic data collections and discusses an application in automated document indexing. The experiments indicate substantial and consistent improvements of the probabilistic method over standard Latent Semantic Analysis.
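The generative latent class model and tempered EM fit described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name `plsa_tem`, the toy count matrix, and the fixed inverse temperature `beta` are assumptions; the paper's tempered EM anneals `beta` on held-out data rather than holding it fixed.

```python
import numpy as np

def plsa_tem(n, K=2, beta=0.8, iters=50, seed=0):
    """Sketch of pLSA fitting by tempered EM (hypothetical helper).

    n    : (D, W) term-document count matrix n(d, w)
    K    : number of latent classes z
    beta : inverse temperature; beta = 1 recovers plain EM
    """
    rng = np.random.default_rng(seed)
    D, W = n.shape
    p_z = np.full(K, 1.0 / K)                                          # P(z)
    p_d_z = rng.random((K, D)); p_d_z /= p_d_z.sum(1, keepdims=True)   # P(d|z)
    p_w_z = rng.random((K, W)); p_w_z /= p_w_z.sum(1, keepdims=True)   # P(w|z)
    for _ in range(iters):
        # Tempered E-step: P(z|d,w) proportional to [P(z)P(d|z)P(w|z)]^beta
        joint = (p_z[:, None, None] * p_d_z[:, :, None] * p_w_z[:, None, :]) ** beta
        post = joint / joint.sum(0, keepdims=True)     # (K, D, W)
        # M-step: re-estimate from expected counts n(d,w) * P(z|d,w)
        ec = post * n[None, :, :]
        p_w_z = ec.sum(1); p_w_z /= p_w_z.sum(1, keepdims=True)
        p_d_z = ec.sum(2); p_d_z /= p_d_z.sum(1, keepdims=True)
        p_z = ec.sum((1, 2)); p_z /= p_z.sum()
    return p_z, p_d_z, p_w_z
```

Setting `beta < 1` flattens the posterior in the E-step, which is what gives tempered EM its regularizing, annealing-like behavior relative to maximum-likelihood EM.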
Pages: 177-196
Number of pages: 19