Self-supervised ARTMAP

Cited by: 24
Authors
Amis, Gregory P. [1 ]
Carpenter, Gail A. [1 ]
Affiliations
[1] Boston Univ, Dept Cognit & Neural Syst, Boston, MA 02215 USA
Keywords
Self-supervised learning; Supervised learning; Adaptive Resonance Theory (ART); ARTMAP; Unsupervised learning; Machine learning; NEURAL-NETWORK ARCHITECTURE; PATTERN-CLASSIFICATION; INFORMATION FUSION; SPATIAL-PATTERN; RECOGNITION; SEARCH;
DOI
10.1016/j.neunet.2009.07.026
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Discipline code
140502 [Artificial intelligence];
Abstract
Computational models of learning typically train on labeled input patterns (supervised learning), unlabeled input patterns (unsupervised learning), or a combination of the two (semi-supervised learning). In each case input patterns have a fixed number of features throughout training and testing. Human and machine learning contexts present additional opportunities for expanding incomplete knowledge from formal training, via self-directed learning that incorporates features not previously experienced. This article defines a new self-supervised learning paradigm to address these richer learning contexts, introducing a neural network called self-supervised ARTMAP. Self-supervised learning integrates knowledge from a teacher (labeled patterns with some features), knowledge from the environment (unlabeled patterns with more features), and knowledge from internal model activation (self-labeled patterns). Self-supervised ARTMAP learns about novel features from unlabeled patterns without destroying partial knowledge previously acquired from labeled patterns. A category selection function bases system predictions on known features, and distributed network activation scales unlabeled learning to prediction confidence. Slow distributed learning on unlabeled patterns focuses on novel features and confident predictions, defining classification boundaries that were ambiguous in the labeled patterns. Self-supervised ARTMAP improves test accuracy on illustrative low-dimensional problems and on high-dimensional benchmarks. Model code and benchmark data are available from: http://techlab.bu.edu/SSART/. (C) 2009 Elsevier Ltd. All rights reserved.
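The two-stage scheme the abstract describes — predict from known features only, then slowly learn the novel features of unlabeled patterns in proportion to prediction confidence — can be illustrated with a minimal sketch. This is an assumption-laden toy (nearest-prototype classifier, margin-based confidence, hypothetical function names), not the actual ARTMAP category-choice and match equations from the paper:

```python
# Toy sketch of the self-supervised idea: labeled training uses only the
# first k "known" features; unlabeled patterns carry all features and are
# self-labeled, with novel features learned slowly, scaled by confidence.
# (Illustrative only; not the ARTMAP equations from the paper.)

def train_labeled(patterns, labels, k):
    """Stage 1: average the k known features of each class into a prototype."""
    protos, counts = {}, {}
    for x, y in zip(patterns, labels):
        p = protos.setdefault(y, [0.0] * len(x))
        counts[y] = counts.get(y, 0) + 1
        for i in range(k):
            p[i] += (x[i] - p[i]) / counts[y]   # incremental mean
    return protos

def self_supervised_update(protos, unlabeled, k, rate=0.05):
    """Stage 2: self-label from known features; slow-learn novel features."""
    for x in unlabeled:
        # Predict using known features only (squared distance on first k dims).
        dists = {y: sum((x[i] - p[i]) ** 2 for i in range(k))
                 for y, p in protos.items()}
        y_hat = min(dists, key=dists.get)
        # Confidence: how much closer the winner is than the runner-up.
        others = [d for y, d in dists.items() if y != y_hat]
        conf = max(1.0 - dists[y_hat] / min(others), 0.0) if others else 1.0
        # Slow learning on the novel features only, scaled by confidence,
        # so prior knowledge from labeled training is not overwritten.
        p = protos[y_hat]
        for i in range(k, len(x)):
            p[i] += rate * conf * (x[i] - p[i])
    return protos
```

Low-confidence self-labels barely move the prototypes, which mirrors the abstract's point that unlabeled learning concentrates on confident predictions.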
Pages: 265-282
Page count: 18