Classifier
Related methods
def logreg(data: Rep[TS[Double,Boolean]], stochastic: Rep[Boolean] = true, initLearningRate: Rep[Double] = unit(1.0), maxIter: Rep[Int] = 30, lambda: Rep[Double] = unit(0.0), verbose: Rep[Boolean] = false, callback: (Rep[DenseVector[Double]],Rep[Int]) => Rep[Unit] = (m,i) => unit(()))(implicit ev0: TrainingSetLike[Double,Boolean,TS], ev1: Manifest[TS[Double,Boolean]]): Rep[DenseVector[Double]]
Logistic regression with dense parameters. The training set can be dense or sparse.
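The computation behind logreg can be sketched in plain Scala as batch gradient descent on the logistic loss with an L2 penalty (lambda). This is an illustrative stand-in using plain arrays, not the staged OptiML API; the parameter names mirror the signature above.

```scala
object LogRegSketch {
  def sigmoid(z: Double): Double = 1.0 / (1.0 + math.exp(-z))

  // data: rows of features; labels: class of each row; returns the weights
  def logreg(data: Array[Array[Double]], labels: Array[Boolean],
             initLearningRate: Double = 1.0, maxIter: Int = 30,
             lambda: Double = 0.0): Array[Double] = {
    val n = data.length
    val d = data(0).length
    val theta = Array.fill(d)(0.0)
    for (_ <- 0 until maxIter) {
      val grad = Array.fill(d)(0.0)
      for (i <- 0 until n) {
        val y = if (labels(i)) 1.0 else 0.0
        val pred = sigmoid((0 until d).map(j => theta(j) * data(i)(j)).sum)
        val err = pred - y
        for (j <- 0 until d) grad(j) += err * data(i)(j)
      }
      // the L2 penalty adds lambda * theta(j) to each gradient component
      for (j <- 0 until d)
        theta(j) -= initLearningRate / n * (grad(j) + lambda * theta(j))
    }
    theta
  }
}
```

With stochastic = true the real method updates the parameters per example rather than per pass; the callback hook above corresponds to observing the model after each iteration.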
def rforest(trainingSet: Rep[DenseTrainingSet[Double,Boolean]], numTrees: Rep[Int] = 10, samplingRate: Rep[Double] = unit(0.66), maxDepth: Rep[Int] = unit(-1), maxNumFeatures: Rep[Int] = unit(-1), minSamplesSplit: Rep[Int] = 2, minSamplesLeaf: Rep[Int] = 1, verbose: Rep[Boolean] = false): Rep[RandomForest]
def sparseLogreg(data: Rep[SparseTrainingSet[Double,Boolean]], stochastic: Rep[Boolean] = true, initLearningRate: Rep[Double] = unit(1.0), maxIter: Rep[Int] = 30, lambda: Rep[Double] = unit(0.0), verbose: Rep[Boolean] = false, callback: (Rep[SparseVector[Double]],Rep[Int]) => Rep[Unit] = (m,i) => unit(())): Rep[SparseVector[Double]]
Logistic regression with sparse parameters. The training set must be sparse. FIXME: With regularization turned on, this does not produce exactly the same results as the dense logreg, although it still appears to work reasonably well.
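One plausible source of the regularization discrepancy noted above (an assumption on our part, not confirmed by this documentation) is that sparse implementations often apply the L2 shrinkage only to the coordinates that are nonzero in the current example, whereas a dense update shrinks every coordinate on every step. A plain-Scala sketch of one such sparse SGD step, using a Map as a stand-in for SparseVector:

```scala
object SparseLogregStep {
  def sigmoid(z: Double): Double = 1.0 / (1.0 + math.exp(-z))

  // theta and x are sparse: index -> value; y is the label as 0.0 or 1.0.
  // NOTE: only indices present in x are updated and regularized, so the
  // effective penalty differs from the dense update when x has zeros.
  def step(theta: Map[Int, Double], x: Map[Int, Double], y: Double,
           lr: Double, lambda: Double): Map[Int, Double] = {
    val pred = sigmoid(x.map { case (j, v) => theta.getOrElse(j, 0.0) * v }.sum)
    val err = pred - y
    x.foldLeft(theta) { case (th, (j, v)) =>
      val tj = th.getOrElse(j, 0.0)
      th.updated(j, tj - lr * (err * v + lambda * tj))
    }
  }
}
```

With lambda = 0.0 the two update rules coincide, which is consistent with the discrepancy appearing only when regularization is turned on.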