PySpark RDD search

2015-06-17 Thread bhavyateja
I have an RDD which is a list of lists, and another RDD which is a list of pairs. There are no duplicates in the inner lists of the first RDD, and no duplicates among the pairs of the second RDD. I am trying to check whether any pair from the second RDD is present in any list of the first RDD.
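
A minimal sketch of one way to express this in PySpark, assuming the two RDDs look like the samples below (the names lists_rdd and pairs_rdd, the sample data, and the reading that "present in a list" means both elements of a pair occur in the same inner list are my assumptions, not from the post): flatten each inner list into the pairs it contains, normalise pair order on both sides, and intersect.

    from itertools import combinations
    from pyspark import SparkContext

    sc = SparkContext(appName="pair-search")

    # Hypothetical stand-ins for the two RDDs described in the post.
    lists_rdd = sc.parallelize([[1, 2, 3], [4, 5]])   # RDD of lists
    pairs_rdd = sc.parallelize([(1, 3), (2, 5)])      # RDD of pairs

    # Expand each inner list into every pair it contains, sorting each
    # pair so that element order inside a pair does not matter.
    list_pairs = lists_rdd.flatMap(
        lambda xs: [tuple(sorted(p)) for p in combinations(xs, 2)]
    ).distinct()

    # Normalise the search pairs the same way, then intersect.
    matches = pairs_rdd.map(lambda p: tuple(sorted(p))).intersection(list_pairs)
    print(matches.collect())   # [(1, 3)] for the sample data above

If only a yes/no answer is needed, matches.count() > 0 avoids collecting the matching pairs to the driver.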

PySpark combinations

2015-06-17 Thread bhavyateja
I have an RDD with more than 1000 elements and need to form all combinations of its elements. I tried the cartesian transformation and then filtering, but it fails with an EOF error. Is there another way to do this using partitions? I am using PySpark.
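
For reference, a sketch of how the cartesian-then-filter approach can be made to yield each unordered combination exactly once (the variable names and sample size are mine, not from the post): zip each element with an index, take the cartesian product, and keep only pairs whose left index is smaller than the right one, which drops self-pairs and mirror-image duplicates.

    from pyspark import SparkContext

    sc = SparkContext(appName="combinations")

    # Hypothetical stand-in for the poster's 1000+ element RDD.
    elems = sc.parallelize(range(1000))

    # Attach a unique index to each element: (value, index).
    indexed = elems.zipWithIndex()

    # cartesian() emits every ordered pair; keeping left_index < right_index
    # yields each unordered combination exactly once, with no self-pairs.
    combos = (indexed.cartesian(indexed)
                     .filter(lambda pair: pair[0][1] < pair[1][1])
                     .map(lambda pair: (pair[0][0], pair[1][0])))

    print(combos.count())   # n*(n-1)/2 = 499500 for n = 1000

Since cartesian() materialises n^2 intermediate pairs, repartitioning the input first can keep individual tasks small; the EOF error mentioned above is often how a dying Python worker surfaces, though that is a guess without the full traceback.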

Re: Spark 1.2 compatibility

2015-01-17 Thread bhavyateja
…executed using Spark 0.9.1.

On Sat, Jan 17, 2015 at 5:55 AM, Chitturi Padma wrote:
> Yes. I built Spark 1.2 with Apache Hadoop 2.2. No compatibility issues.

Re: Spark 1.2 compatibility

2015-01-17 Thread bhavyateja
Yes, it works with Hadoop 2.2, but we are trying to use Spark 1.2 on HDP 2.1.

On Sat, Jan 17, 2015, 11:18 AM Chitturi Padma wrote:
> It worked for me: Spark 1.2.0 with Hadoop 2.2.0.
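
For context, the Spark 1.2 build documentation selects the Hadoop version through Maven profiles, roughly as below; HDP 2.1 ships Hadoop 2.4, so the second invocation is the one relevant to the original question. Treat the exact profile names and flags as an approximation of the docs for that release rather than a tested recipe.

    # Against Apache Hadoop 2.2.x (the combination reported to work above):
    mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean package

    # Against Hadoop 2.4.x, the version HDP 2.1 ships:
    mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package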

Spark 1.2 compatibility

2015-01-16 Thread bhavyateja
Is Spark 1.2 compatible with HDP 2.1?