Table 1 ML algorithms used in the studies and corresponding featuring studies. (N = 43 studies)

From: Application of machine learning in predicting hospital readmissions: a scoping review of the literature

| Type of ML Algorithm | Number of Studies^f (Percent) | Featuring Studies |
| --- | --- | --- |
| **Tree-based methods** | 23 (53%) | |
| Decision Tree | 9 | [46, 52, 54–60] |
| Random Forest | 16 | [48–50, 59–71] |
| Boosted tree methods^a | 18 | [47, 49–51, 53, 54, 59, 64–67, 71–77] |
| **Regularized Logistic Regression (penalized methods)** | 12 (28%) | |
| Lasso (L1 regularization) | 9 | [53, 64, 65, 67, 70, 71, 78–80] |
| Ridge Regression (L2 regularization) | 4 | [64, 70, 71, 80] |
| Elastic Net | 3 | [49, 72, 81] |
| **Support Vector Machine** | 10 (23%) | [54, 60, 63, 65, 66, 70, 71, 82–84] |
| **Neural Networks** | 14 (33%) | |
| NN (with multiple hidden layers, e.g. deep learning)^b | 10 | [60, 69–71, 77, 79, 80, 85–87] |
| CNN | 3 | [60, 71, 80] |
| RNN | 5 | [70, 71, 79, 80, 86] |
| Deep stacking network | 1 | [69] |
| Deep neural networks | 2 | [77, 85] |
| Ensemble of DL methods | 1 | [87] |
| NN (with a single hidden layer, or an unspecified number of hidden layers) | 5 | [58]^d, [60]^d, [68]^d, [49]^e, [66]^e |
| **Other algorithms** | 10 (23%) | |
| Naïve Bayes network | 4 | [49, 54, 70, 84] |
| KNN | 2 | [54, 65] |
| Ensemble of methods^c | 3 | [50, 67, 84] |
| Bayesian Model Averaging | 1 | [49] |
  1. Abbreviations: ML, machine learning; Lasso, least absolute shrinkage and selection operator; NN, neural network; CNN, convolutional neural network; RNN, recurrent neural network; DL, deep learning; KNN, k-nearest neighbors. ^a Includes AdaBoost, gradient boosting, gradient descent boosting, boosting, and XGBoost. ^b Includes CNN, RNN, DNN, deep stacking networks, and ensembles of DL methods. ^c DT ensembled with SVM, RF combined with SVM, and tree-augmented naïve Bayesian network. ^d One hidden layer. ^e Did not specify the number of layers.
  2. ^f Because most studies applied more than one machine learning algorithm, the sum of the number of studies by machine learning method is greater than 43.
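For orientation, most of the algorithm families tabulated above have off-the-shelf implementations in scikit-learn. The sketch below is purely illustrative and is not drawn from any of the featured studies: it fits one representative of each family (decision tree, random forest, boosted trees, L1- and L2-penalized logistic regression, SVM, KNN, and naïve Bayes) on a synthetic stand-in for an EHR-derived feature matrix with a binary readmission label, and reports a discrimination metric (AUC) commonly used in this literature. All data and parameter choices here are assumptions for demonstration only.

```python
# Illustrative sketch only: synthetic data standing in for a readmission
# prediction task; not the setup of any study cited in the table.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import roc_auc_score

# Synthetic feature matrix (500 "patients", 20 features) with a binary
# outcome playing the role of 30-day readmission.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# One representative per algorithm family from the table above.
models = {
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Random Forest": RandomForestClassifier(random_state=0),
    "Boosted trees": GradientBoostingClassifier(random_state=0),
    "Lasso (L1) logistic regression": LogisticRegression(
        penalty="l1", solver="liblinear"
    ),
    "Ridge (L2) logistic regression": LogisticRegression(penalty="l2"),
    "SVM": SVC(probability=True, random_state=0),
    "KNN": KNeighborsClassifier(),
    "Naive Bayes": GaussianNB(),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    print(f"{name}: AUC = {auc:.2f}")
```

Deep learning variants (CNN, RNN, deep stacking networks) would typically be built in a dedicated framework rather than scikit-learn and are omitted here for brevity.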