I have an RDD which is a list of lists,
and another RDD which is a list of pairs.
There are no duplicates within the inner lists of the first RDD, and
no duplicates among the pairs of the second RDD.
I am trying to check whether any pair from the second RDD is present in any list of the first RDD.
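One way to do this without comparing every pair against every list is to normalize each pair, then explode each inner list into its 2-element combinations and test membership. The names below (`lists_rdd`, `pairs_rdd`) and the sample data are hypothetical; this is a minimal local sketch of the logic in plain Python — in PySpark the loops would become a `flatMap` over the lists RDD followed by a join or filter against the pairs:

```python
from itertools import combinations

# Hypothetical stand-ins for the two RDDs.
lists_rdd = [[1, 2, 3], [4, 5]]   # RDD of lists, no duplicates inside a list
pairs_rdd = [(2, 3), (5, 6)]      # RDD of pairs, no duplicate pairs

# Normalize pairs so (a, b) and (b, a) compare equal.
wanted = {frozenset(p) for p in pairs_rdd}

# Explode every list into its 2-element combinations and test membership.
# In PySpark this would be lists_rdd.flatMap(...) joined/filtered against pairs.
found = any(
    frozenset(c) in wanted
    for lst in lists_rdd
    for c in combinations(lst, 2)
)
```

Here `found` is true because the pair (2, 3) occurs inside the list [1, 2, 3]. If the pairs RDD is small, the same idea works in Spark by broadcasting `wanted` and filtering the lists RDD.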
I have an RDD with more than 1000 elements, and I have to form all combinations
of its elements.
I tried using the cartesian transformation and then filtering the result, but
it fails with an EOF error. Is there any other way to do the same using
partitions?
I am using PySpark.
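If only unordered pairs are needed, the usual trick is to index the elements and keep just the half of the cartesian product where the first index is smaller — this removes self-pairs and duplicates before they are materialized. Below is a local plain-Python sketch of the result being computed; the commented PySpark equivalent (using `zipWithIndex` and `cartesian`, both standard RDD operations) is an assumed translation, not tested against Spark 1.2:

```python
from itertools import combinations

# Hypothetical stand-in for the RDD's elements.
elements = list(range(5))

# cartesian() produces n*n pairs; keeping only i < j removes self-pairs
# and (b, a) duplicates. A PySpark sketch of the same idea:
#   indexed = rdd.zipWithIndex()
#   pairs = (indexed.cartesian(indexed)
#                   .filter(lambda ab: ab[0][1] < ab[1][1])
#                   .map(lambda ab: (ab[0][0], ab[1][0])))
pairs = list(combinations(elements, 2))
```

For n elements this yields n*(n-1)/2 pairs instead of n*n, which also shrinks the shuffle that may be triggering the EOF error.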
Yes, it works with Hadoop 2.2, but we are trying to use Spark 1.2 on HDP 2.1.
On Sat, Jan 17, 2015, 11:18 AM Chitturi Padma [via Apache Spark User List] <
ml-node+s1001560n21208...@n3.nabble.com> wrote:
> It worked for me. spark 1.2.0 with hadoop 2.2.0
>
> On Sat, Jan 17, 2015 at 5:55 AM, Chitturi Padma [via Apache Spark User
> List] wrote:
>
>> Yes. I built spark 1.2 with apache hadoop 2.2. No compatibility issues.
>>
>> On Sat, Jan 17, 2015 at 4:47 AM, bhavyateja [via Apache Spark User List]
>> <[hidden email]> wrote:
executed using spark 0.9.1
Is Spark 1.2 compatible with HDP 2.1?
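For reference, Spark 1.x is built against a specific Hadoop version via Maven properties, so compatibility usually comes down to building with the right `-Dhadoop.version`. A build-command sketch per the Spark 1.2 build documentation (the exact Hadoop version to match your HDP cluster is an assumption you should verify against your installation):

```shell
# Build Spark 1.2 against a Hadoop 2.4.x cluster with YARN support.
# Adjust hadoop.version to whatever your HDP release actually ships.
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
```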
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/spark-1-2-compatibility-tp21197.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
---