I corrected the type to RDD<Double[]>, but it's still giving
me the error.
I believe I have found the reason though. The vals variable is created
using the map procedure on some other RDD. Although it is declared as a
JavaRDD<Double>, the classTag it returns is Object. I think
that's because the Java map() has no way of knowing the element type.
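
A minimal sketch of that symptom, assuming the element type is Double
(the variable names and setup below are illustrative): a JavaRDD built
with map() carries only the generic Object ClassTag, because the Java
API cannot recover the lambda's result type.

    import java.util.Arrays;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public class ClassTagDemo {
      public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext("local[2]", "classtag-demo");
        // Declared as JavaRDD<Double>, but map() can only attach the
        // generic Object ClassTag, since Java erases the element type:
        JavaRDD<Double> vals = sc.parallelize(Arrays.asList(1, 2, 3, 4, 5))
                                 .map(Integer::doubleValue);
        System.out.println(vals.classTag()); // prints Object, not Double
        sc.stop();
      }
    }
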
The Java docs won't help since they only show "Object", yes. Have a
look at the Scala docs:
https://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.mllib.rdd.RDDFunctions
An RDD of T produces an RDD of T[].
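
For reference, the signature in those Scala docs is:

    def sliding(windowSize: Int, step: Int): RDD[Array[T]]

Array[T] has no exact Java analog, which is why the Javadoc can only
show Object.
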
On Fri, May 13, 2016 at 12:10 PM, Tom Godden wrote:
I assumed the "fixed size blocks" mentioned in the documentation
(https://spark.apache.org/docs/1.6.0/api/java/org/apache/spark/mllib/rdd/RDDFunctions.html#sliding%28int,%20int%29)
were RDDs, but I guess they're arrays? Even when I change the RDD to
arrays (so it looks like RDD<Double[]>), it still doesn't work.
I'm not sure what you're trying there. The return type is an RDD of
arrays, not of RDDs or of ArrayLists. There may be another catch but
that is not it.
On Fri, May 13, 2016 at 11:50 AM, Tom Godden wrote:
> I believe it's an illegal cast. This is the line of code:
> RDD<ArrayList<Double>> windowed =
>     RDDFunctions.fromRDD(vals.rdd(), vals.classTag()).sliding(20, 1);
> with vals being a JavaRDD<Double>. Explicitly casting
> doesn't work either:
> RDD<ArrayList<Double>> windowed = (RDD<ArrayList<Double>>)
>     RDDFunctions.fromRDD(vals.rdd(), vals.classTag()).sliding(20, 1);
The problem is there's no Java-friendly version of this, and the Scala
API return type actually has no analog in Java (an array of any type,
not just of objects) so it becomes Object. You can just cast it to the
type you know it will be -- RDD<Double[]> or RDD<double[]> or whatever.
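
A sketch of that workaround (vals and the Double element type are
assumptions carried over from the code above): since the ClassTag is
Object, the arrays sliding() builds are really Object[] at runtime, so
it is safer to cast to RDD<Object[]> and convert each window than to
cast straight to RDD<Double[]>.

    import java.util.Arrays;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.mllib.rdd.RDDFunctions;
    import org.apache.spark.rdd.RDD;

    public class SlidingDemo {
      public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext("local[2]", "sliding-demo");
        JavaRDD<Double> vals = sc.parallelize(Arrays.asList(1, 2, 3, 4, 5, 6))
                                 .map(Integer::doubleValue);

        // sliding() lives on the Scala RDD, hence fromRDD. Java sees its
        // return type as RDD<Object>, so cast through the wildcard to the
        // type the elements will actually have at runtime:
        @SuppressWarnings("unchecked")
        RDD<Object[]> windowed = (RDD<Object[]>) (RDD<?>)
            RDDFunctions.fromRDD(vals.rdd(), vals.classTag()).sliding(3, 1);

        // Convert each Object[] window back to a typed array:
        JavaRDD<double[]> windows = windowed.toJavaRDD().map(arr -> {
          double[] w = new double[arr.length];
          for (int i = 0; i < arr.length; i++) w[i] = (Double) arr[i];
          return w;
        });

        for (double[] w : windows.collect()) {
          System.out.println(Arrays.toString(w)); // e.g. [1.0, 2.0, 3.0]
        }
        sc.stop();
      }
    }
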