Hello,

I have an issue with the cartesian method. When I use it with the standard
Java types everything is fine, but when I use it with an RDD made of objects
I defined myself, it shows very strange behavior that depends on whether the
RDD is cached or not (you can see what happens here:
http://stackoverflow.com/questions/28727823/creating-a-matrix-of-neighbors-with-spark-cartesian-issue).

Is this due to a bug in its implementation, or are there requirements on the
objects passed to it (e.g. immutability, or implementing equals/hashCode)?
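For context, the objects I pass to cartesian look roughly like the sketch
below (plain Java, no Spark dependency; the class name and fields are
placeholders, not my actual code). The nested-loop product at the bottom is
the result I would expect rdd.cartesian(rdd) to produce:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Objects;

public class CartesianSketch {

    // Placeholder value class standing in for my user-defined type:
    // immutable, with equals/hashCode defined over its fields.
    static final class Point {
        final double x;
        final double y;

        Point(double x, double y) {
            this.x = x;
            this.y = y;
        }

        @Override
        public boolean equals(Object o) {
            if (this == o) return true;
            if (!(o instanceof Point)) return false;
            Point p = (Point) o;
            return x == p.x && y == p.y;
        }

        @Override
        public int hashCode() {
            return Objects.hash(x, y);
        }

        @Override
        public String toString() {
            return "(" + x + "," + y + ")";
        }
    }

    // Plain-Java cartesian product: every pair (p, q) with p from a, q from b.
    static List<Point[]> cartesian(List<Point> a, List<Point> b) {
        List<Point[]> pairs = new ArrayList<>();
        for (Point p : a) {
            for (Point q : b) {
                pairs.add(new Point[]{p, q});
            }
        }
        return pairs;
    }

    public static void main(String[] args) {
        List<Point> pts = List.of(new Point(0, 0), new Point(1, 0), new Point(0, 1));
        List<Point[]> pairs = cartesian(pts, pts);
        // 3 elements crossed with themselves give 3 * 3 = 9 pairs.
        System.out.println(pairs.size());
    }
}
```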
Thanks.
Best regards.
Marco



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Cartesian-issue-with-user-defined-objects-tp21826.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.