DD-HDS: A method for visualization and exploration of high-dimensional data

Cited by: 45
Authors
Lespinats, Sylvain [1 ]
Verleysen, Michel
Giron, Alain
Fertil, Bernard
Affiliations
[1] Univ Paris 06, INSERM, UMR 678, F-75634 Paris, France
[2] Univ Paris 01, F-75634 Paris 13, France
[3] Catholic Univ Louvain, B-1348 Louvain, Belgium
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2007, Vol. 18, No. 5
Keywords
high-dimensional data; multidimensional scaling (MDS); neighborhood visualization; nonlinear mapping;
DOI
10.1109/TNN.2007.891682
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Mapping high-dimensional data into a low-dimensional space, for example for visualization, is a problem of increasing concern in data analysis. This paper presents data-driven high-dimensional scaling (DD-HDS), a nonlinear mapping method in the line of the multidimensional scaling (MDS) approach, based on the preservation of distances between pairs of data. It improves on existing competitors in the representation of high-dimensional data in two ways. It introduces 1) a specific weighting of distances between data that takes the concentration-of-measure phenomenon into account and 2) a symmetric handling of short distances in the original and output spaces, avoiding false-neighbor representations while still allowing some necessary tears in the original distribution. More precisely, the weighting is set according to the effective distribution of distances in the data set, with the exception of a single user-defined parameter setting the tradeoff between local neighborhood preservation and global mapping. The stress criterion designed for the mapping is optimized by "force-directed placement" (FDP). Mappings of low- and high-dimensional data sets are presented as illustrations of the features and advantages of the proposed algorithm. The weighting function specific to high-dimensional data and the symmetric handling of short distances can easily be incorporated into most distance-preservation-based nonlinear dimensionality reduction methods.
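To illustrate the family of methods the abstract describes, here is a minimal sketch of distance-preserving mapping: gradient descent on a plain (unweighted) MDS stress between pairwise distances. This is not DD-HDS itself; the paper's data-driven distance weighting and symmetric handling of short distances are not reproduced, and the function name and all parameters below are illustrative.

```python
import numpy as np

def mds_stress_map(X, n_components=2, n_iter=300, lr=0.05, seed=0):
    """Embed rows of X into n_components dimensions by minimizing the
    plain MDS stress sum_{i,j} (d_ij - D_ij)^2, where D are original
    pairwise distances and d are distances in the embedding. The update
    moves each point along pairwise "forces", in the spirit of
    force-directed placement, but without DD-HDS's weighting."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # Pairwise Euclidean distances in the original space.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # Small random initialization in the output space.
    Y = rng.normal(scale=1e-2, size=(n, n_components))
    for _ in range(n_iter):
        diff = Y[:, None, :] - Y[None, :, :]          # (n, n, k)
        d = np.linalg.norm(diff, axis=-1)             # (n, n)
        np.fill_diagonal(d, 1.0)                      # avoid /0 on diagonal
        coef = (d - D) / d                            # >0 pushes closer, <0 apart
        np.fill_diagonal(coef, 0.0)                   # no self-force
        grad = (coef[:, :, None] * diff).sum(axis=1)  # stress gradient (up to 2x)
        Y -= lr * grad / n
    return Y
```

DD-HDS would additionally weight each pairwise term according to the observed distribution of distances (to counter concentration of measure) and treat short original-space and output-space distances symmetrically; both modifications slot into the `coef` term above.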
Pages: 1265-1279
Page count: 15