Hi,
I'm having a problem with type inference here; I think there is a bug.
I have this code fragment. It parses JSON into an Array[Double]:

    val muCouch = {
      val e = input.filter(_.id == "mu")(0).content()
      val b = e.getArray("feature_mean")
      for (i <- 0 to e.getInt("features"))
        yield b.getDouble(i)
    }.toArray
Hi Eyal,
what you're seeing is not a Spark issue, it is related to boxed types.
I assume 'b' in your code is some kind of Java buffer, where b.getDouble()
returns an instance of java.lang.Double and not a scala.Double. Hence
muCouch is an Array[java.lang.Double], an array containing boxed doubles.
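To make the distinction concrete, here is a minimal, self-contained sketch of the same boxing issue, independent of Spark and of the JSON library in the original snippet (the names here are illustrative, not from Eyal's code). Collecting java.lang.Double values gives an Array[java.lang.Double]; unboxing each element with doubleValue restores a plain Array[Double]:

```scala
object BoxingDemo extends App {
  // A Java API typically hands back boxed java.lang.Double values,
  // simulated here by explicitly boxing a few literals:
  val boxed: Seq[java.lang.Double] =
    Seq(1.0, 2.0, 3.0).map(d => java.lang.Double.valueOf(d))

  // Calling .toArray directly keeps the boxed element type:
  val boxedArr: Array[java.lang.Double] = boxed.toArray

  // Unboxing each element first yields an Array[Double],
  // which the JVM stores as a primitive double[]:
  val unboxed: Array[Double] = boxed.map(_.doubleValue).toArray

  println(unboxed.sum)  // prints 6.0
}
```

The same `.map(_.doubleValue)` step (or an explicit type ascription at the yield) would turn the Array[java.lang.Double] in the original fragment into the intended Array[Double].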
Great, that works perfectly!
Also, thanks for the links - very helpful.
On Tue, Dec 1, 2015 at 12:13 AM, Jakob Odersky wrote: