I am planning to use the RDD join operation. To test it out, I was trying to compile some test code, but I am getting the following compilation error:

value join is not a member of org.apache.spark.rdd.RDD[(String, Int)]
[error] rddA.join(rddB).map { case (k, (a, b)) => (k, a + b) }
Code:
import
Thanks a lot, that fixed the issue :)
On Thu, Sep 4, 2014 at 4:51 PM, Zhan Zhang zzh...@hortonworks.com wrote:
Try this:
import org.apache.spark.SparkContext._
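For context on why that import fixes it: in Spark 1.2 and earlier, `join` is not defined on `RDD` itself but on `PairRDDFunctions`, and the implicit conversion that adds it to an `RDD[(K, V)]` lives in `SparkContext._`, so it must be imported into scope. (Since Spark 1.3 these implicits moved to the `RDD` companion object, so the import is no longer required.) A minimal sketch of the same pattern in plain Scala, with made-up names `MiniRDD` and `MiniContext` standing in for Spark's classes:

```scala
// Sketch of the implicit-conversion pattern Spark <= 1.2 uses: the join
// method lives in an extension class, and the implicit that attaches it
// must be imported -- otherwise "join is not a member" of MiniRDD.

class MiniRDD[T](val data: Seq[T])

object MiniContext {
  // Analogous to the implicits brought in by `import SparkContext._`
  implicit class PairFunctions[K, V](rdd: MiniRDD[(K, V)]) {
    def join[W](other: MiniRDD[(K, W)]): MiniRDD[(K, (V, W))] = {
      val byKey = other.data.groupBy(_._1)
      new MiniRDD(for {
        (k, v) <- rdd.data
        (_, w) <- byKey.getOrElse(k, Nil)
      } yield (k, (v, w)))
    }
  }
}

object Demo extends App {
  import MiniContext._  // remove this line and `a.join(b)` no longer compiles
  val a = new MiniRDD(Seq(("x", 1), ("y", 2)))
  val b = new MiniRDD(Seq(("x", 10)))
  println(a.join(b).data)
}
```

Without the `import MiniContext._` line the compiler cannot find the implicit conversion and reports the same kind of "not a member" error as in the original post.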
Thanks.
Zhan Zhang
On Sep 4, 2014, at 4:36 PM, Veeranagouda Mukkanagoudar veera...@gmail.com
wrote:
I am planning to use