Could you also print the length of featureSet? I suspect it is less than
63: the stack trace shows an out-of-bounds access at index 62, which
requires a vector length of at least 63. The first argument of
Vectors.sparse() is the total length (dimension) of the sparse vector,
not the number of non-zero elements.
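
For example, a minimal sketch (the dimension, indices, and values below
are made up for illustration):

    import org.apache.spark.mllib.linalg.Vectors

    // A vector of dimension 63 with three non-zero entries.
    // The first argument is the full length (63), not the entry count (3).
    val v = Vectors.sparse(63, Array(0, 10, 62), Array(1.0, 2.0, 3.0))

    // The same overload your code uses, taking (index, value) pairs:
    val w = Vectors.sparse(63, Seq((0, 1.0), (10, 2.0), (62, 3.0)))

    println(v.size)  // 63

If the declared length here were 62, the entry at index 62 would fall
outside the vector, which matches the ArrayIndexOutOfBoundsException: 62
in your trace.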

Yanbo

2015-12-03 22:30 GMT+08:00 nabegh <nab...@gmail.com>:

> I'm trying to run an SVM classifier on unlabeled data. I followed this
> thread
> (http://apache-spark-user-list.1001560.n3.nabble.com/MLLib-sparse-vector-td14273.html)
> to build the vectors and checked this code
> (https://github.com/apache/spark/blob/master/mllib/src/main/scala/org/apache/spark/mllib/linalg/Vectors.scala#L313).
>
> Now when I make the call to predict, I receive the following error. Any
> hints?
>
>     val v = featureRDD.map(f => Vectors.sparse(featureSet.length, f)) // length = 63
>     val predictions = model.predict(v)
>     println(s"predictions length = ${predictions.collect.length}")
>
>
> Exception in thread "main" org.apache.spark.SparkException: Job aborted
> due to stage failure: Task 1 in stage 121.0 failed 4 times, most recent
> failure: Lost task 1.3 in stage 121.0 (TID 233, 10.1.1.63):
> java.lang.ArrayIndexOutOfBoundsException: 62
>         at breeze.linalg.operators.DenseVector_SparseVector_Ops$$anon$98.apply(SparseVectorOps.scala:297)
>         at breeze.linalg.operators.DenseVector_SparseVector_Ops$$anon$98.apply(SparseVectorOps.scala:282)
>         at breeze.linalg.operators.BinaryRegistry$class.apply(BinaryOp.scala:60)
>         at breeze.linalg.VectorOps$$anon$171.apply(Vector.scala:528)
>         at breeze.linalg.ImmutableNumericOps$class.dot(NumericOps.scala:98)
>         at breeze.linalg.DenseVector.dot(DenseVector.scala:50)
>         at org.apache.spark.mllib.classification.SVMModel.predictPoint(SVM.scala:81)
>         at org.apache.spark.mllib.regression.GeneralizedLinearModel$$anonfun$predict$1$$anonfun$apply$1.apply(GeneralizedLinearAlgorithm.scala:71)
>         at org.apache.spark.mllib.regression.GeneralizedLinearModel$$anonfun$predict$1$$anonfun$apply$1.apply(GeneralizedLinearAlgorithm.scala:71)
>         at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
>         at scala.collection.Iterator$class.foreach(Iterator.scala:727)
>         at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
>         at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
>         at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
>         at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
>         at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
>         at scala.collection.AbstractIterator.to(Iterator.scala:1157)
>         at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
>         at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
>         at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
>         at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
>         at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:909)
>         at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:909)
>         at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1850)
>         at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1850)
>         at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
>         at org.apache.spark.scheduler.Task.run(Task.scala:88)
>         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         at java.lang.Thread.run(Thread.java:745)
>
