I'm not sure what you're trying to do there. The return type is an RDD of
arrays, not of RDDs or of ArrayLists. There may be another catch, but that
isn't it.
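
For what it's worth, something like this ought to compile (untested sketch;
the extra hop through RDD<?> is only there because Java won't cast directly
between the two parameterized types, and you'll still get an unchecked
warning):

    // vals is a JavaRDD<ArrayList<Integer>>, as in your snippet below
    RDD<ArrayList<Integer>[]> windowed =
        (RDD<ArrayList<Integer>[]>) (RDD<?>)
            RDDFunctions.fromRDD(vals.rdd(), vals.classTag()).sliding(20, 1);

Each element of windowed is then an ArrayList<Integer>[] holding one window
of 20 values.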

On Fri, May 13, 2016 at 11:50 AM, Tom Godden <tgod...@vub.ac.be> wrote:
> I believe it's an illegal cast. This is the line of code:
>> RDD<RDD<ArrayList<Integer>>> windowed =
>> RDDFunctions.fromRDD(vals.rdd(), vals.classTag()).sliding(20, 1);
> with vals being a JavaRDD<ArrayList<Integer>>.  Explicitly casting
> doesn't work either:
>> RDD<RDD<ArrayList<Integer>>> windowed = (RDD<RDD<ArrayList<Integer>>>)
>> RDDFunctions.fromRDD(vals.rdd(), vals.classTag()).sliding(20, 1);
> Did I miss something?
>
> On 13-05-16 09:44, Sean Owen wrote:
>> The problem is that there's no Java-friendly version of this, and the Scala
>> API return type actually has no analog in Java (an array of any type,
>> not just of objects), so it becomes Object. You can just cast it to the
>> type you know it will be -- RDD<String[]>, RDD<long[]>, or whatever.
>>
>> On Fri, May 13, 2016 at 8:40 AM, tgodden <tgod...@vub.ac.be> wrote:
>>> Hello,
>>>
>>> We're trying to use PrefixSpan on sequential data, by passing a sliding
>>> window over it. Spark Streaming is not an option.
>>> RDDFunctions.sliding() returns an RDD<java.lang.Object>, regardless of
>>> the original element type of the RDD. Because of this, the returned
>>> value seems to be pretty much unusable.
>>> Is this a bug, or not yet implemented? Is there a way to work around it?
>>>
>>> Official docs:
>>> https://spark.apache.org/docs/1.6.0/api/java/org/apache/spark/mllib/rdd/RDDFunctions.html
>>>
>>> Thanks
>>>
>
