TY - CHAP
T1 - Decision Trees and Random Forests
AU - Nandi, Asoke
AU - Ahmed, Hosameldin
PY - 2019/12/6
Y1 - 2019/12/6
N2 - The literature on classification algorithms has highlighted various techniques for dealing with classification problems, e.g. k-nearest neighbours; hierarchical-based models such as decision trees (DTs) and random forests (RFs); probability-based models, including naive Bayes classification and logistic regression classification; support vector machines; and layered models. This chapter introduces DT and RF classifiers by presenting the basic theory of the DT diagnosis tool, its data structure, the ensemble model that combines DTs into a decision forest model, and their applications in machine fault diagnosis. The algorithms most commonly used by DTs to make splitting decisions are described. These include univariate splitting criteria, e.g. the Gini index, information gain, distance measure, and orthogonal criterion; and multivariate splitting criteria, e.g. greedy learning, linear programming, linear discriminant analysis, perceptron learning, and hill-climbing search. The chapter also presents DT pruning methods, including error-complexity pruning, minimum-error pruning, and critical-value pruning.
UR - https://onlinelibrary.wiley.com/doi/10.1002/9781119544678.ch10
U2 - 10.1002/9781119544678.ch10
DO - 10.1002/9781119544678.ch10
M3 - Chapter
SN - 9781119544623
SN - 9781119544678
SP - 199
EP - 224
BT - Condition Monitoring with Vibration Signals
ER -