The sample complexity of learning fixed-structure Bayesian networks

Cited by: 34
Authors
Dasgupta, S. [1]
Affiliation
[1] Univ Calif Berkeley, Dept Comp Sci, Berkeley, CA 94720 USA
Keywords
Bayesian networks; PAC learning; sample complexity
DOI
10.1023/A:1007417612269
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
We consider the problem of PAC learning probabilistic networks in the case where the structure of the net is specified beforehand. We allow the conditional probabilities to be represented in any manner (as tables or specialized functions) and obtain sample complexity bounds for learning nets with and without hidden nodes.
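A minimal illustrative sketch, not taken from the paper, of the learning task the abstract describes: the network structure is fixed in advance, and only the conditional probabilities are estimated from fully observed samples. The two-node network, its parameters, and the maximum-likelihood estimator below are assumptions chosen for illustration.

# Sketch: learning the CPTs of a fixed-structure Bayesian network (A -> B)
# from i.i.d. samples. Structure is given; only the probabilities are learned.
import random
from collections import Counter

random.seed(0)

# Hypothetical "true" network used to generate data (both variables binary).
TRUE_P_A = 0.3                       # P(A = 1)
TRUE_P_B_GIVEN_A = {0: 0.2, 1: 0.9}  # P(B = 1 | A = a)

def sample(n):
    """Draw n i.i.d. samples from the true network."""
    data = []
    for _ in range(n):
        a = int(random.random() < TRUE_P_A)
        b = int(random.random() < TRUE_P_B_GIVEN_A[a])
        data.append((a, b))
    return data

def estimate_cpts(data):
    """Maximum-likelihood estimates of the CPTs for the fixed structure A -> B."""
    n = len(data)
    count_a = Counter(a for a, _ in data)
    p_a = count_a[1] / n
    p_b_given_a = {}
    for a_val in (0, 1):
        subset = [b for a, b in data if a == a_val]
        # Guard against a parent configuration that never appears in the sample.
        p_b_given_a[a_val] = sum(subset) / len(subset) if subset else 0.5
    return p_a, p_b_given_a

if __name__ == "__main__":
    p_a, p_b_given_a = estimate_cpts(sample(10_000))
    print("P(A=1) estimate:", round(p_a, 3))
    print("P(B=1|A=a) estimates:", {a: round(p, 3) for a, p in p_b_given_a.items()})

The paper's bounds concern how many such samples suffice, with high probability, for the learned distribution to be close to the true one; the estimator above is only one way to fit the parameters once the structure is given.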
Pages: 165-180
Number of pages: 16