Hi Eyal,

what you're seeing is not a Spark issue; it is caused by boxed types.

I assume 'b' in your code is some kind of Java buffer, where b.getDouble()
returns an instance of java.lang.Double and not a scala.Double. Hence
muCouch is an Array[java.lang.Double], i.e. an array of boxed doubles,
which is not the same JVM type as a primitive double[].
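Here is a minimal, Spark-free sketch of the mismatch; the buffer is replaced by a plain boxed array, so everything except the error message itself is an assumption for illustration:

```scala
object BoxedMismatch {
  def main(args: Array[String]): Unit = {
    // An array of boxed doubles, as produced by a Java API
    // that returns java.lang.Double
    val boxed: Array[java.lang.Double] =
      Array(java.lang.Double.valueOf(1.0), java.lang.Double.valueOf(2.0))

    // The next line would not compile -- the same error as in your post:
    //   found   : Array[java.lang.Double]
    //   required: Array[scala.Double]
    // val primitive: Array[Double] = boxed

    // Unboxing each element yields a primitive double[] on the JVM
    val primitive: Array[Double] = boxed.map(_.doubleValue)
    println(primitive.sum) // 3.0
  }
}
```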

To fix your problem, change 'yield b.getDouble(i)' to 'yield
b.getDouble(i).doubleValue', which unboxes each value.
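Applied to your snippet, the fix looks roughly like this; since I don't have your input object, a hypothetical java.util.List stands in for b = e.getArray("feature_mean"):

```scala
object UnboxFix {
  def main(args: Array[String]): Unit = {
    // Hypothetical stand-in for the Java buffer 'b' in the original code
    val b: java.util.List[java.lang.Double] =
      java.util.Arrays.asList[java.lang.Double](0.1, 0.2, 0.3)

    // .doubleValue unboxes each element, so the for-comprehension
    // yields scala.Double and toArray produces an Array[Double]
    val muCouch: Array[Double] =
      (for (i <- 0 until b.size) yield b.get(i).doubleValue).toArray

    println(muCouch.mkString(", "))
  }
}
```

The resulting Array[Double] can then be passed to new DenseVector(muCouch) without a type mismatch.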

You might want to have a look at these as well:
-
http://stackoverflow.com/questions/23821576/efficient-conversion-of-java-util-listjava-lang-double-to-scala-listdouble
- https://docs.oracle.com/javase/7/docs/api/java/lang/Double.html
- http://www.scala-lang.org/api/current/index.html#scala.Double

On 30 November 2015 at 10:13, Eyal Sharon <e...@scene53.com> wrote:

> Hi ,
>
> I have a problem figuring out a type bug here.
>
> I have this code fragment; it parses JSON into an Array[Double]:
>
>
> val muCouch = {
>   val e = input.filter( _.id=="mu")(0).content()
>   val b  = e.getArray("feature_mean")
>   for (i <- 0 to e.getInt("features") ) yield b.getDouble(i)
> }.toArray
>
> Now the problem occurs when I want to create a dense vector:
>
> new DenseVector(muCouch)
>
>
> I get the following error:
>
>
> Error:(111, 21) type mismatch;
>  found   : Array[java.lang.Double]
>  required: Array[scala.Double]
>
>
> Now, I can probably find a workaround for that, but I want to get a deeper
> understanding of why it occurs.
>
> P.S. I do use collection.JavaConversions._
>
> Thanks !
>
