> > On Dec 5, 2016, at 9:12 PM, Jakob Odersky <[hidden email]> wrote:
> >
> > m rdds in an "org.apache.spark" package as well
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: [hidden email]
Yes, I think changing the version property (line 29) in Spark's root
pom.xml should be sufficient. However, keep in mind that you'll also
need to publish Spark locally before you can access it in your test
application.
On Tue, Dec 6, 2016 at 2:50 AM, Teng Long wrote:
> Thank you Jakob for clearing things up for me.
>
> Before, I thought my application was compiled against my local build since I
> can get all the logs I just added in spark-core. But it was all along using
> Spark downloaded from the remote Maven repository, and that's why I "cannot"
> add new RDD methods.
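The root cause here — the application silently resolving the stock artifact from the remote repository instead of the local build — can be confirmed at runtime by asking where a class was actually loaded from. A minimal sketch (the object and method names are illustrative; in a Spark application you would pass `classOf[org.apache.spark.rdd.RDD[_]]`):

```scala
// Sketch: report the jar or directory a class was loaded from. JDK bootstrap
// classes have no CodeSource, hence the Option. Applying this to
// classOf[org.apache.spark.rdd.RDD[_]] shows whether the application picked
// up the locally published Spark build or a remote-repository jar.
object WhichJar {
  def locationOf(cls: Class[_]): Option[String] =
    Option(cls.getProtectionDomain.getCodeSource)
      .map(_.getLocation.toString)
}
```

For example, `WhichJar.locationOf(classOf[String])` yields `None` (bootstrap class), while an application or library class reports the jar or classes directory it was resolved from.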
It looks like you're having issues with including your custom Spark
version (with the extensions) in your test project. To use your local
Spark version:

1) make sure it has a custom version (let's call it 2.1.0-CUSTOM)
2) publish it to your local machine with `sbt publishLocal`
3) include the custom version in your test application's dependencies
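For step 3, the test application's build definition might look like the following sbt sketch (the project name and Scala version are illustrative assumptions; `sbt publishLocal` in step 2 puts the artifacts in `~/.ivy2/local`, which sbt consults before remote repositories):

```scala
// build.sbt of the test application (sketch; assumes the local Spark build
// was given version 2.1.0-CUSTOM and published with `sbt publishLocal`)
name := "spark-extension-test"

scalaVersion := "2.11.8"

// Resolves from ~/.ivy2/local first, so the custom build wins over
// the stock artifact from Maven Central
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0-CUSTOM"
```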
>> And here you will see an example of "extending" RDD -
>> https://github.com/datastax/spark-cassandra-connector/blob/master/doc/5_saving.md
>>
>> case class WordCount(word: String, count: Long)
>>
>> val collection = sc.parallelize(Seq(WordCount("dog", 50),
>> WordCount("cow", 60)))
>>
>> collection.saveToCassandra("test", "words", SomeColumns("word", "count"))
>
> Hope that helps…
>
> Jayesh
>
> From: Teng Long <longteng...@gmail.com>
> Date: Monday, December 5, 2016 at 3:04 PM
> To: Holde
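`saveToCassandra` in the connector example above is not a method on `RDD` itself; the connector bolts it on with an implicit class, and the same pattern provides an `rdd.foo()` without rebuilding Spark at all. A self-contained sketch of the pattern on `Seq` (substitute `org.apache.spark.rdd.RDD[T]` for `Seq[T]` in a real project; all names below are illustrative):

```scala
// The extension-method ("pimp my library") pattern: an implicit class adds
// a method to an existing type without touching its source. The Cassandra
// connector uses this same trick to attach saveToCassandra to RDD.
object Extensions {
  implicit class RichSeq[T](val xs: Seq[T]) extends AnyVal {
    // hypothetical added method, analogous to rdd.foo()
    def secondOrElse(default: T): T =
      if (xs.lengthCompare(2) >= 0) xs(1) else default
  }
}
```

With `import Extensions._` in scope, `Seq(1, 2, 3).secondOrElse(-1)` compiles as if `Seq` declared the method. This also bears on the compile error discussed in this thread: unless the application depends on the build that defines the new method (or, for the implicit variant, imports the wrapper), the compiler only sees the stock `RDD` and rejects `rdd.foo()`.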
Apparently, I've already tried adding a new method to RDD,
for example,

class RDD {
  def foo() // this is the one I added
  def map()
  def collect()
}

I can build Spark successfully, but I can't compile my application code
which calls rdd.foo(), and the error message says

value foo is not a member of org.apache.spark.rdd.RDD[String]

So I am wondering if there is any mechanism that prevents me from doing this
or something I'm doing wrong?
>>>>> How does your application fetch the spark dependency? Perhaps list your
>>>>> project dependencies and check it's using your dev build.
>>>>
>>>> On Mon, 5 Dec 2016, 08:47 tenglong, <[hidden email]> wrote:
>> Hi,
>>
>> Apparently, I've already tried adding a new method to RDD,
>> for example, [...]
>>
>> So I am wondering if there is any mechanism that prevents me from doing this
>> or something I'm doing wrong?
>>
>> --
>> View this message in context:
>> http://apache-spark-developers-list.1001551.n3.nabble.com/Can-I-add-a-new-method-to-RDD-class-tp20100.html
>> Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
I can get all the logs I added into the spark source
code, so I'm pretty sure the application is using the one I just built.

Thanks!

--
View this message in context:
http://apache-spark-developers-list.1001551.n3.nabble.com/Can-I-add-a-new-method-to-RDD-class-tp20100p20103.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.