Re: How to work with a joined rdd in pyspark?

2015-11-30 Thread arnalone

Re: How to work with a joined rdd in pyspark?

2015-11-29 Thread Gylfi
… is needed to understand what's going on.

Re: How to work with a joined rdd in pyspark?

2015-11-29 Thread arnalone
…', 100)), (u'PostModern_Cooking', (u'DEF', 597))]

I would like to sum views per show for channel = "ABC".
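Records of the form (show, (channel, views)) are exactly what PySpark's join() produces from a (show, channel) RDD and a (show, views) RDD. A minimal sketch reproducing that shape, purely for illustration (the show name 'Some_Show' and its 'ABC' channel are hypothetical placeholders; only PostModern_Cooking/DEF/597 and the 100 value appear in the thread):

from pyspark import SparkContext

sc = SparkContext(appName="joined-rdd-shape")

# Hypothetical inputs: (show, channel) pairs and (show, views) pairs.
show_channel = sc.parallelize([(u'PostModern_Cooking', u'DEF'),
                               (u'Some_Show', u'ABC')])
show_views = sc.parallelize([(u'PostModern_Cooking', 597),
                             (u'Some_Show', 100)])

# join() keys on the show name and yields (show, (channel, views)).
joined = show_channel.join(show_views)
print(joined.collect())
# e.g. [(u'Some_Show', (u'ABC', 100)), (u'PostModern_Cooking', (u'DEF', 597))]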

Re: How to work with a joined rdd in pyspark?

2015-11-29 Thread Gylfi
t;ABC" ).map( (_._1, _._2._2) .reduceByKey( (a,b) => a + b ) This should give you an RDD of one record per show and the summed view count but only for shows on ABC right? -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/How-to-work-with-a-joined-r

Re: How to work with a joined rdd in pyspark?

2015-11-29 Thread arnalone
… to do the same :(

Re: How to work with a joined rdd in pyspark?

2015-11-29 Thread Gylfi
Can't you just access it by element, like with [0] and [1]? http://www.tutorialspoint.com/python/python_tuples.htm
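Putting the suggestions in this thread together, a hedged PySpark sketch of the earlier Scala chain, indexing into the (channel, views) tuple with [0] and [1] (continuing from the 'joined' RDD sketched earlier; the variable names are placeholders, not from the thread):

# joined holds records of the form (show, (channel, views))
abc_views = (joined
             .filter(lambda rec: rec[1][0] == u'ABC')   # keep only channel "ABC"
             .map(lambda rec: (rec[0], rec[1][1]))      # (show, views)
             .reduceByKey(lambda a, b: a + b))          # sum views per show

print(abc_views.collect())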

Re: Joined RDD

2015-04-17 Thread Archit Thakur

Re: Joined RDD

2014-11-13 Thread Mayur Rustagi
> … action on C (say, collect), the action is served without reading any data from disk. Since no data is cached in Spark, how is the action on C served without reading data from disk? Thanks, --Ajay

Re: Joined RDD

2014-11-13 Thread ajay garg

Joined RDD

2014-11-12 Thread ajay garg
… action on C (say, collect), the action is served without reading any data from disk. Since no data is cached in Spark, how is the action on C served without reading data from disk? Thanks, --Ajay
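One likely explanation is Spark's shuffle-file reuse: a join writes its map-side shuffle output to the executors' local disks, and a later job that needs the same shuffle reuses those files (the corresponding stages appear as "skipped" in the web UI) instead of re-reading the source data. A minimal sketch of the scenario, with hypothetical RDDs standing in for A and B:

from pyspark import SparkContext

sc = SparkContext(appName="joined-rdd-reuse")

# Hypothetical pair RDDs standing in for A and B.
A = sc.parallelize([(k, k * 2) for k in range(1000)])
B = sc.parallelize([(k, k * 3) for k in range(1000)])

C = A.join(B)    # nothing is cached or persisted

C.collect()      # first action: reads the inputs and shuffles them for the join
C.collect()      # second action: the shuffle files from the first job are still on
                 # the executors' local disks, so Spark reuses them and the input
                 # stages are skipped rather than re-read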

Re: Joined RDD

2014-11-12 Thread qinwei
> … action on C (say, collect), the action is served without reading any data from disk. Since no data is cached in Spark, how is the action on C served without reading data from disk? Thanks, --Ajay