object LogisticRegressionWithSGD extends Serializable

Top-level methods for calling Logistic Regression using Stochastic Gradient Descent.

Annotations
@Since( "0.8.0" ) @deprecated
Deprecated

(Since version 2.0.0) Use ml.classification.LogisticRegression or LogisticRegressionWithLBFGS

Note

Labels used in Logistic Regression should be {0, 1}
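Since this object is deprecated, new code should move to the spark.ml API named in the deprecation message. A minimal migration sketch — the local master, toy data, and `maxIter`/`regParam` values are illustrative choices, not anything this page prescribes:

```scala
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.linalg.Vectors
import org.apache.spark.sql.SparkSession

object LrMigrationExample {
  def main(args: Array[String]): Unit = {
    // Local session for illustration only
    val spark = SparkSession.builder().master("local[*]").appName("lr-migration").getOrCreate()
    import spark.implicits._

    // Labels must be {0, 1}; features are ml (not mllib) Vectors
    val training = Seq(
      (0.0, Vectors.dense(0.0, 1.1)),
      (1.0, Vectors.dense(2.0, 1.0)),
      (0.0, Vectors.dense(0.1, 1.2)),
      (1.0, Vectors.dense(1.9, 0.8))
    ).toDF("label", "features")

    // Hyperparameter values here are arbitrary examples
    val lr = new LogisticRegression().setMaxIter(10).setRegParam(0.01)
    val model = lr.fit(training)
    println(s"coefficients: ${model.coefficients} intercept: ${model.intercept}")
    spark.stop()
  }
}
```

The other replacement the deprecation message names, `LogisticRegressionWithLBFGS`, stays within spark.mllib and generally converges in fewer iterations than SGD.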

Linear Supertypes
Serializable, Serializable, AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clone(): AnyRef
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @native() @throws( ... )
  6. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  7. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  8. def finalize(): Unit
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  9. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  10. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  11. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  12. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  13. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  14. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  15. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  16. def toString(): String
    Definition Classes
    AnyRef → Any
  17. def train(input: RDD[LabeledPoint], numIterations: Int): LogisticRegressionModel

    Train a logistic regression model given an RDD of (label, features) pairs. We run a fixed number of iterations of gradient descent using a step size of 1.0. We use the entire data set to update the gradient in each iteration.

    input

    RDD of (label, array of features) pairs.

    numIterations

    Number of iterations of gradient descent to run.

    returns

    a LogisticRegressionModel which has the weights and offset from training.

    Annotations
    @Since( "1.0.0" )
    Note

    Labels used in Logistic Regression should be {0, 1}
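A minimal usage sketch of this overload; the local SparkContext setup and toy data are illustrative, and `numIterations = 100` is an arbitrary example value:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.classification.LogisticRegressionWithSGD
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint

object LrSgdTrainExample {
  def main(args: Array[String]): Unit = {
    // Local context for illustration only
    val sc = new SparkContext(new SparkConf().setMaster("local[*]").setAppName("lr-sgd"))

    // Labels must be {0, 1}
    val data = sc.parallelize(Seq(
      LabeledPoint(0.0, Vectors.dense(0.0, 1.0)),
      LabeledPoint(1.0, Vectors.dense(1.0, 0.0)),
      LabeledPoint(0.0, Vectors.dense(0.1, 0.9)),
      LabeledPoint(1.0, Vectors.dense(0.9, 0.1))
    ))

    // This overload uses a fixed step size of 1.0 and the full data set per iteration
    val model = LogisticRegressionWithSGD.train(data, numIterations = 100)
    println(s"weights: ${model.weights} intercept: ${model.intercept}")
    sc.stop()
  }
}
```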

  18. def train(input: RDD[LabeledPoint], numIterations: Int, stepSize: Double): LogisticRegressionModel

    Train a logistic regression model given an RDD of (label, features) pairs. We run a fixed number of iterations of gradient descent using the specified step size. We use the entire data set to update the gradient in each iteration.

    input

    RDD of (label, array of features) pairs.

    numIterations

    Number of iterations of gradient descent to run.

    stepSize

    Step size to be used for each iteration of gradient descent.

    returns

    a LogisticRegressionModel which has the weights and offset from training.

    Annotations
    @Since( "1.0.0" )
    Note

    Labels used in Logistic Regression should be {0, 1}

  19. def train(input: RDD[LabeledPoint], numIterations: Int, stepSize: Double, miniBatchFraction: Double): LogisticRegressionModel

    Train a logistic regression model given an RDD of (label, features) pairs. We run a fixed number of iterations of gradient descent using the specified step size. Each iteration uses miniBatchFraction fraction of the data to calculate the gradient.

    input

    RDD of (label, array of features) pairs.

    numIterations

    Number of iterations of gradient descent to run.

    stepSize

    Step size to be used for each iteration of gradient descent.

    miniBatchFraction

    Fraction of data to be used per iteration.

    Annotations
    @Since( "1.0.0" )
    Note

    Labels used in Logistic Regression should be {0, 1}

  20. def train(input: RDD[LabeledPoint], numIterations: Int, stepSize: Double, miniBatchFraction: Double, initialWeights: Vector): LogisticRegressionModel

    Train a logistic regression model given an RDD of (label, features) pairs. We run a fixed number of iterations of gradient descent using the specified step size. Each iteration uses miniBatchFraction fraction of the data to calculate the gradient. The weights used in gradient descent are initialized using the initial weights provided.

    input

    RDD of (label, array of features) pairs.

    numIterations

    Number of iterations of gradient descent to run.

    stepSize

    Step size to be used for each iteration of gradient descent.

    miniBatchFraction

    Fraction of data to be used per iteration.

    initialWeights

    Initial set of weights to be used. Array should be equal in size to the number of features in the data.

    Annotations
    @Since( "1.0.0" )
    Note

    Labels used in Logistic Regression should be {0, 1}
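A hedged sketch of this fullest overload; the step size, mini-batch fraction, and zero initial weights below are illustrative choices, not defaults of the method:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.classification.LogisticRegressionWithSGD
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint

object LrSgdFullExample {
  def main(args: Array[String]): Unit = {
    // Local context for illustration only
    val sc = new SparkContext(new SparkConf().setMaster("local[*]").setAppName("lr-sgd-full"))

    // Labels must be {0, 1}
    val data = sc.parallelize(Seq(
      LabeledPoint(0.0, Vectors.dense(0.0, 1.0)),
      LabeledPoint(1.0, Vectors.dense(1.0, 0.0))
    ))

    // initialWeights must match the feature count (2 here); zeros is a common start
    val model = LogisticRegressionWithSGD.train(
      data,
      numIterations = 200,
      stepSize = 0.5,
      miniBatchFraction = 0.5,
      initialWeights = Vectors.zeros(2)
    )
    println(s"weights: ${model.weights}")
    sc.stop()
  }
}
```

With `miniBatchFraction < 1.0`, each iteration samples that fraction of the RDD to estimate the gradient, trading gradient accuracy for per-iteration cost.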

  21. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  22. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  23. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @throws( ... )

Inherited from Serializable

Inherited from AnyRef

Inherited from Any