spark RDD join Error

2014-09-04 Thread Veeranagouda Mukkanagoudar
I am planning to use the RDD join operation. To test it out, I was trying to
compile some test code, but I am getting the following compilation error:

*value join is not a member of org.apache.spark.rdd.RDD[(String, Int)]*
*[error] rddA.join(rddB).map { case (k, (a, b)) => (k, a+b) }*

Code:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD

def joinTest(rddA: RDD[(String, Int)], rddB: RDD[(String, Int)]): RDD[(String, Int)] = {
  rddA.join(rddB).map { case (k, (a, b)) => (k, a+b) }
}

Any help would be great.

Veera


Re: spark RDD join Error

2014-09-04 Thread Veeranagouda Mukkanagoudar
Thanks a lot, that fixed the issue :)


On Thu, Sep 4, 2014 at 4:51 PM, Zhan Zhang zzh...@hortonworks.com wrote:

 Try this:
 import org.apache.spark.SparkContext._
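
 For reference, a minimal sketch with that import in place (assuming Spark 1.x,
 where SparkContext._ pulls in the implicit conversion to PairRDDFunctions,
 which is where join is defined):

 import org.apache.spark.SparkContext._
 import org.apache.spark.rdd.RDD

 // join pairs up values by key; the map then sums the two counts per key
 def joinTest(rddA: RDD[(String, Int)], rddB: RDD[(String, Int)]): RDD[(String, Int)] = {
   rddA.join(rddB).map { case (k, (a, b)) => (k, a + b) }
 }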

 Thanks.

 Zhan Zhang

