object LogLoss extends ClassificationLoss
:: DeveloperApi ::
Class for log loss calculation (for classification). This uses twice the binomial negative log likelihood, called "deviance" in Friedman (1999).
The log loss is defined as: 2 log(1 + exp(-2 y F(x))), where y is a label in {-1, 1} and F(x) is the model prediction for features x.
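For intuition, a minimal standalone sketch of this formula in plain Scala (the helper name pointwiseLogLoss is illustrative, not part of this API):

```scala
// Pointwise log loss as defined above: y in {-1, 1}, F(x) = raw model score.
// Bare formula only; a robust implementation would additionally guard exp()
// against overflow for large negative margins.
def pointwiseLogLoss(prediction: Double, label: Double): Double =
  2.0 * math.log1p(math.exp(-2.0 * label * prediction))

// A confident correct prediction costs little; an uninformative score of 0
// costs 2 * log(2):
//   pointwiseLogLoss(2.0, 1.0)  ≈ 0.0363
//   pointwiseLogLoss(0.0, 1.0)  ≈ 1.3863  (= 2 * log 2)
```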
- Annotations: @Since("1.2.0") @DeveloperApi()
- Inheritance: LogLoss → ClassificationLoss → Loss → Serializable → Serializable → AnyRef → Any
Value Members
- final def !=(arg0: Any): Boolean
  Definition Classes: AnyRef → Any
- final def ##(): Int
  Definition Classes: AnyRef → Any
- final def ==(arg0: Any): Boolean
  Definition Classes: AnyRef → Any
- final def asInstanceOf[T0]: T0
  Definition Classes: Any
- def clone(): AnyRef
  Attributes: protected[java.lang]
  Definition Classes: AnyRef
  Annotations: @native() @throws(...)
- def computeError(model: TreeEnsembleModel, data: RDD[LabeledPoint]): Double
  Method to calculate the error of the base learner for the gradient boosting calculation.
  model: Model of the weak learner.
  data: Training dataset: RDD of org.apache.spark.mllib.regression.LabeledPoint.
  returns: Measure of model error on data.
  Definition Classes: Loss
  Annotations: @Since("1.2.0")
  Note: This method is not used by the gradient boosting algorithm but is useful for debugging purposes.
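A hedged usage sketch of this debugging aid, assuming an existing SparkContext `sc`; the toy dataset, the numIterations setting, and variable names are illustrative only:

```scala
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.tree.GradientBoostedTrees
import org.apache.spark.mllib.tree.configuration.BoostingStrategy
import org.apache.spark.mllib.tree.loss.LogLoss

// Toy binary-classification data; at this API level labels are {0, 1}.
val data = sc.parallelize(Seq(
  LabeledPoint(0.0, Vectors.dense(0.0)),
  LabeledPoint(1.0, Vectors.dense(1.0))
))

val strategy = BoostingStrategy.defaultParams("Classification")
strategy.loss = LogLoss        // boost with this loss
strategy.numIterations = 3     // keep the toy run short

val model = GradientBoostedTrees.train(data, strategy)

// Debugging aid (see the note above): mean log loss of the trained
// ensemble over `data`.
val error: Double = LogLoss.computeError(model, data)
```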
- final def eq(arg0: AnyRef): Boolean
  Definition Classes: AnyRef
- def equals(arg0: Any): Boolean
  Definition Classes: AnyRef → Any
- def finalize(): Unit
  Attributes: protected[java.lang]
  Definition Classes: AnyRef
  Annotations: @throws(classOf[java.lang.Throwable])
- final def getClass(): Class[_]
  Definition Classes: AnyRef → Any
  Annotations: @native()
- def gradient(prediction: Double, label: Double): Double
  Method to calculate the loss gradients for the gradient boosting calculation for binary classification. The gradient with respect to F(x) is: -4 y / (1 + exp(2 y F(x))).
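A quick numeric check of this formula (a hedged sketch; since LogLoss is a Scala object, the method can be called directly):

```scala
import org.apache.spark.mllib.tree.loss.LogLoss

// At F(x) = 0 with y = 1: -4 * 1 / (1 + exp(0)) = -4 / 2 = -2.0
val g = LogLoss.gradient(prediction = 0.0, label = 1.0)  // -2.0
```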
- def hashCode(): Int
  Definition Classes: AnyRef → Any
  Annotations: @native()
- final def isInstanceOf[T0]: Boolean
  Definition Classes: Any
- final def ne(arg0: AnyRef): Boolean
  Definition Classes: AnyRef
- final def notify(): Unit
  Definition Classes: AnyRef
  Annotations: @native()
- final def notifyAll(): Unit
  Definition Classes: AnyRef
  Annotations: @native()
- final def synchronized[T0](arg0: ⇒ T0): T0
  Definition Classes: AnyRef
- def toString(): String
  Definition Classes: AnyRef → Any
- final def wait(): Unit
  Definition Classes: AnyRef
  Annotations: @throws(...)
- final def wait(arg0: Long, arg1: Int): Unit
  Definition Classes: AnyRef
  Annotations: @throws(...)
- final def wait(arg0: Long): Unit
  Definition Classes: AnyRef
  Annotations: @native() @throws(...)