| Algorithm | Accuracy | Specificity | Precision | Recall | F-measure | AUC |
|---|---|---|---|---|---|---|
| LR | 0.901 | 0.970 | 0.685 | 0.459 | 0.623 | 0.912 |
| CT | 0.899 | 0.966 | 0.647 | 0.438 | 0.603 | 0.856 |
| JRip | 0.899 | 0.962 | 0.638 | 0.462 | 0.624 | 0.730 |
| BN | 0.894 | 0.955 | 0.603 | 0.469 | 0.630 | 0.915 |
| NN | 0.889 | 0.952 | 0.576 | 0.451 | 0.612 | 0.890 |
| SMO | 0.901 | 0.978 | 0.710 | 0.366 | 0.533 | 0.672 |
| ADABOOST | 0.892 | 0.971 | 0.630 | 0.337 | 0.500 | 0.891 |
| BAGGING | 0.902 | 0.968 | 0.668 | 0.444 | 0.609 | 0.910 |
| RFOREST | 0.901 | 0.964 | 0.650 | 0.460 | 0.623 | 0.905 |
- LR Logistic regression model, CT Classification tree, JRip Repeated Incremental Pruning to Produce Error Reduction, BN Bayesian network, NN Neural network, SMO Sequential Minimal Optimization, ADABOOST Adaptive boosting, BAGGING Bootstrap aggregating, RFOREST Random forest, AUC Area under the ROC curve. Values with statistically significant differences are shown in bold.
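For reference, the metrics reported in the table can be recomputed from any classifier's predictions on the test set. The sketch below is a minimal Python/scikit-learn illustration; scikit-learn and the helper name `report_metrics` are assumptions for illustration only, not the toolkit used in the study.

```python
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, roc_auc_score, confusion_matrix)

def report_metrics(y_true, y_pred, y_score):
    """Compute the table's metrics for one classifier on a binary task.

    y_true  -- ground-truth labels (0/1)
    y_pred  -- hard class predictions (0/1)
    y_score -- predicted probability of the positive class (needed for AUC)
    """
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
    return {
        "Accuracy":    accuracy_score(y_true, y_pred),
        "Specificity": tn / (tn + fp),                # true-negative rate
        "Precision":   precision_score(y_true, y_pred),
        "Recall":      recall_score(y_true, y_pred),  # sensitivity
        "F-measure":   f1_score(y_true, y_pred),
        "AUC":         roc_auc_score(y_true, y_score),
    }
```

Note that specificity is derived from the confusion matrix rather than a dedicated scikit-learn function, and AUC requires the predicted positive-class probabilities rather than the hard labels.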