package tree
This package contains the default implementation of the decision tree algorithm, which supports:
- binary classification,
- regression,
- information gain calculation with entropy and Gini for classification and variance for regression,
- both continuous and categorical features.
Type Members
- class DecisionTree extends Serializable with Logging
  A class which implements a decision tree learning algorithm for classification and regression. It supports both continuous and categorical features.
  Annotations: @Since("1.0.0")
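  As context for this entry, here is a minimal usage sketch built on the companion object's trainClassifier method. The data path, SparkConf settings, and hyperparameter values (impurity, depth, bins) are illustrative assumptions, not part of this API listing.

  ```scala
  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.mllib.tree.DecisionTree
  import org.apache.spark.mllib.util.MLUtils

  object DecisionTreeExample {
    def main(args: Array[String]): Unit = {
      val sc = new SparkContext(
        new SparkConf().setAppName("DecisionTreeExample").setMaster("local[*]"))

      // LIBSVM-formatted training data; the path is a placeholder.
      val data = MLUtils.loadLibSVMFile(sc, "data/sample_libsvm_data.txt")

      val numClasses = 2
      // Empty map: all features treated as continuous. An entry
      // (featureIndex -> arity) would mark that feature as categorical.
      val categoricalFeaturesInfo = Map[Int, Int]()
      val impurity = "gini"   // "entropy" also works; regression uses "variance"
      val maxDepth = 5
      val maxBins = 32

      val model = DecisionTree.trainClassifier(
        data, numClasses, categoricalFeaturesInfo, impurity, maxDepth, maxBins)

      // Predict on a single feature vector from the training set.
      println(s"Prediction for first point: ${model.predict(data.first().features)}")

      sc.stop()
    }
  }
  ```

  trainRegressor follows the same shape, dropping numClasses and using the "variance" impurity.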
- class GradientBoostedTrees extends Serializable with Logging
  A class that implements Stochastic Gradient Boosting for regression and binary classification.
  The implementation is based upon: J.H. Friedman, "Stochastic Gradient Boosting," 1999.
  Notes on Gradient Boosting vs. TreeBoost:
  - This implementation is for Stochastic Gradient Boosting, not for TreeBoost.
  - Both algorithms learn tree ensembles by minimizing loss functions.
  - TreeBoost (Friedman, 1999) additionally modifies the outputs at tree leaf nodes based on the loss function, whereas the original gradient boosting method does not.
  - When the loss is SquaredError, these methods give the same result, but they could differ for other loss functions.
  Annotations: @Since("1.2.0")
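  Below is a hedged sketch of training a boosted ensemble through the companion object's train method with a default BoostingStrategy. The data path and the chosen numIterations/maxDepth values are assumptions made for illustration.

  ```scala
  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.mllib.tree.GradientBoostedTrees
  import org.apache.spark.mllib.tree.configuration.BoostingStrategy
  import org.apache.spark.mllib.util.MLUtils

  object GradientBoostedTreesExample {
    def main(args: Array[String]): Unit = {
      val sc = new SparkContext(
        new SparkConf().setAppName("GradientBoostedTreesExample").setMaster("local[*]"))

      // LIBSVM-formatted training data; the path is a placeholder.
      val data = MLUtils.loadLibSVMFile(sc, "data/sample_libsvm_data.txt")

      // Default boosting strategy for binary classification;
      // use "Regression" for squared-error regression.
      val boostingStrategy = BoostingStrategy.defaultParams("Classification")
      boostingStrategy.numIterations = 10              // number of trees in the ensemble
      boostingStrategy.treeStrategy.numClasses = 2
      boostingStrategy.treeStrategy.maxDepth = 3       // depth of each weak learner
      boostingStrategy.treeStrategy.categoricalFeaturesInfo = Map[Int, Int]()

      val model = GradientBoostedTrees.train(data, boostingStrategy)

      println(s"Prediction for first point: ${model.predict(data.first().features)}")

      sc.stop()
    }
  }
  ```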
Value Members
- object DecisionTree extends Serializable with Logging
  Annotations: @Since("1.0.0")
- object GradientBoostedTrees extends Logging with Serializable
  Annotations: @Since("1.2.0")
- object RandomForest extends Serializable with Logging
  Annotations: @Since("1.2.0")
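  A brief illustrative sketch of the RandomForest object's trainClassifier method follows; the data path and all hyperparameter values (numTrees, featureSubsetStrategy, impurity, depth, bins, seed) are assumed for the example and are not prescribed by this listing.

  ```scala
  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.mllib.tree.RandomForest
  import org.apache.spark.mllib.util.MLUtils

  object RandomForestExample {
    def main(args: Array[String]): Unit = {
      val sc = new SparkContext(
        new SparkConf().setAppName("RandomForestExample").setMaster("local[*]"))

      // LIBSVM-formatted training data; the path is a placeholder.
      val data = MLUtils.loadLibSVMFile(sc, "data/sample_libsvm_data.txt")

      val numClasses = 2
      val categoricalFeaturesInfo = Map[Int, Int]()   // all features continuous
      val numTrees = 10
      val featureSubsetStrategy = "auto"              // let the algorithm choose
      val impurity = "gini"
      val maxDepth = 4
      val maxBins = 32
      val seed = 12345

      val model = RandomForest.trainClassifier(
        data, numClasses, categoricalFeaturesInfo, numTrees,
        featureSubsetStrategy, impurity, maxDepth, maxBins, seed)

      println(s"Prediction for first point: ${model.predict(data.first().features)}")

      sc.stop()
    }
  }
  ```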