A decision forest is a generic term to describe models made of multiple
decision trees. The prediction of a decision forest is the aggregation of the
predictions of its decision trees. The implementation of this aggregation
depends on the algorithm used to train the decision forest. For example, in a
multi-class classification random forest (a type of decision forest), each tree
votes for a single class, and the random forest prediction is the most
represented class. In binary classification gradient boosted trees (GBT)
(another type of decision forest), each tree outputs a logit (a floating-point
value), and the gradient boosted trees prediction is the sum of those values
followed by an activation function (e.g., sigmoid).
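The two aggregation schemes above can be sketched as follows. This is a minimal illustration, not a library implementation; the function names and the example tree outputs are hypothetical.

```python
import math
from collections import Counter

def random_forest_predict(tree_votes):
    """Multi-class random forest aggregation: each tree votes for one
    class, and the forest predicts the most represented class."""
    return Counter(tree_votes).most_common(1)[0][0]

def gbt_predict(tree_logits):
    """Binary-classification GBT aggregation: sum the trees' logits,
    then apply a sigmoid activation to obtain a probability."""
    logit = sum(tree_logits)
    return 1.0 / (1.0 + math.exp(-logit))

# Hypothetical per-tree outputs:
print(random_forest_predict(["cat", "dog", "cat"]))  # majority vote -> "cat"
print(gbt_predict([0.2, -0.1, 0.4]))                 # sigmoid(0.5) ~ 0.622
```

Note that the random forest aggregates *class labels*, while the GBT aggregates *numeric contributions*; this difference follows directly from how each algorithm trains its trees.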
The next two chapters detail these two decision forest algorithms.