Spam detection using Random Boost

Cited by: 31
Authors
DeBarr, Dave [1 ]
Wechsler, Harry [1 ]
Affiliations
[1] George Mason Univ, Dept Comp Sci, Fairfax, VA 22030 USA
Keywords
Spam detection; Robust learning; Random Boost; Random projection; Logit Boost; Random Forest;
DOI
10.1016/j.patrec.2012.03.012
CLC number
TP18 (Artificial Intelligence Theory)
Discipline code
140502 (Artificial Intelligence)
Abstract
This paper proposes two alternative methods of random projections and compares their performance for robust and efficient spam detection when trained on a small number of examples. Robustness refers to learning and adaptation that maintain a high level of performance despite data variability, while efficiency concerns (i) the complexity of the detection method employed and (ii) the amount of resources used for training and retraining. The first method, Random Projection, employs a random projection matrix to produce linear combinations of the input features, while the second method, Random Boost, employs random feature selection to enhance the performance of the Logit Boost algorithm; Random Boost is, in effect, a combination of Logit Boost and Random Forest. Experimental results on the challenging TREC and CEAS spam benchmark sets show that Random Boost significantly improves the performance of the spam filter compared to Logit Boost (e.g., a 5% increase in AUC, the area under the Receiver Operating Characteristic curve) and achieves classification accuracy similar to Random Forest at only one fourth of its runtime. Random Boost also reduces training time by two orders of magnitude compared to Logit Boost, which matters when retraining on ever-changing data streams, including adapting to adversarial tactics and "noise" injected by spammers. (C) 2012 Elsevier B.V. All rights reserved.
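As a rough illustration of the two ideas described in the abstract, the sketch below shows (a) a Johnson-Lindenstrauss-style random projection mapping d input features to k random linear combinations, and (b) the per-round random feature sampling that Random Boost borrows from Random Forest. This is a minimal, stdlib-only Python sketch under stated assumptions; all function names, parameters, and the toy dimensions are illustrative and not taken from the paper.

```python
import random

def random_projection_matrix(d, k, seed=0):
    """k x d Gaussian matrix scaled by 1/sqrt(k) (Johnson-Lindenstrauss style).
    Illustrative only; the paper's actual construction may differ."""
    rng = random.Random(seed)
    return [[rng.gauss(0.0, 1.0) / k ** 0.5 for _ in range(d)] for _ in range(k)]

def project(x, R):
    """Map a d-dimensional feature vector to k random linear combinations."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in R]

def random_feature_subset(d, m, rng):
    """Random-Forest-style sampling: m of d features for one boosting round."""
    return rng.sample(range(d), m)

# Toy usage: project a 1000-feature message vector down to 50 dimensions,
# then pick sqrt(d) features for a single (hypothetical) boosting round.
d, k = 1000, 50
R = random_projection_matrix(d, k)
x = [1.0] * d
y = project(x, R)

rng = random.Random(1)
subset = random_feature_subset(d, int(d ** 0.5), rng)
```

Each boosting round of a Random-Boost-like learner would fit its weak learner on such a random subset rather than on all d features, which is where the reported runtime savings over Random Forest and Logit Boost would come from.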
Pages: 1237-1244 (8 pages)