Can you give us a bit more information?
Such as the Spark release you're using, the Scala version, etc.
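
In the meantime, one likely cause, judging from the error: of the two Strategy alternatives, the one taking a Scala Map[Int,Int] has quantileCalculationStrategy as its sixth parameter, so your Map is landing in the wrong positional slot; the other alternative is the Java-friendly one, which wants a java.util.Map[Integer,Integer]. Passing categoricalFeaturesInfo by name should select the Scala constructor and let the remaining parameters take their defaults. A sketch only, not tested against your exact Spark version:

```scala
import org.apache.spark.mllib.tree.configuration.{Algo, Strategy}
import org.apache.spark.mllib.tree.impurity.Gini

val dTreeStrategy = new Strategy(
  Algo.Classification,
  Gini,
  maxDepth = 4,
  numClasses = 7,
  maxBins = 32,
  // Named argument: skips the quantileCalculationStrategy slot,
  // which then falls back to its default (Sort).
  categoricalFeaturesInfo = Map[Int, Int]()
)
```

Alternatively, Strategy.defaultStrategy(Algo.Classification) followed by setters may be available in your release and sidesteps the overload ambiguity entirely.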

Thanks

On Tue, May 5, 2015 at 6:37 PM, xweb <ashish8...@gmail.com> wrote:

> I am getting the following error on this code:
> Error:(164, 25) overloaded method constructor Strategy with alternatives:
>   (algo: org.apache.spark.mllib.tree.configuration.Algo.Algo, impurity: org.apache.spark.mllib.tree.impurity.Impurity, maxDepth: Int, numClasses: Int, maxBins: Int, categoricalFeaturesInfo: java.util.Map[Integer,Integer])org.apache.spark.mllib.tree.configuration.Strategy
>   <and>
>   (algo: org.apache.spark.mllib.tree.configuration.Algo.Algo, impurity: org.apache.spark.mllib.tree.impurity.Impurity, maxDepth: Int, numClasses: Int, maxBins: Int, quantileCalculationStrategy: org.apache.spark.mllib.tree.configuration.QuantileStrategy.QuantileStrategy, categoricalFeaturesInfo: scala.collection.immutable.Map[Int,Int], minInstancesPerNode: Int, minInfoGain: Double, maxMemoryInMB: Int, subsamplingRate: Double, useNodeIdCache: Boolean, checkpointInterval: Int)org.apache.spark.mllib.tree.configuration.Strategy
>  cannot be applied to (org.apache.spark.mllib.tree.configuration.Algo.Value, org.apache.spark.mllib.tree.impurity.Gini.type, Int, Int, Int, scala.collection.immutable.Map[Int,Int])
>     val dTreeStrategy = new Strategy(algo, impurity, maxDepth, numClasses, maxBins, categoricalFeaturesInfo)
>                         ^
> <code>
> val categoricalFeaturesInfo = Map[Int, Int]()
> val impurity = Gini
> val maxDepth = 4
> val maxBins = 32
> val algo = Algo.Classification
> val numClasses = 7
>
> val dTreeStrategy = new Strategy(algo, impurity, maxDepth, numClasses, maxBins, categoricalFeaturesInfo)
> </code>
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/overloaded-method-constructor-Strategy-with-alternatives-tp22777.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
