unsubscribe

2018-08-08 Thread Tarun Kumar



Unsubscribe

2018-06-22 Thread Tarun Kumar
Unsubscribe


Re: Can I add a new method to RDD class?

2016-12-05 Thread Tarun Kumar
Teng,

Can you please share the details of transformation that you want to
implement in your method foo?

I have created a gist with one dummy transformation for your method foo;
this foo method transforms an RDD[T] into an RDD[(T,T)]. Many more such
transformations can easily be achieved.

https://gist.github.com/fidato13/3b46fe1c96b37ae0dd80c275fbe90e92

Thanks
Tarun Kumar
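A hedged sketch of what such a transformation could look like (this is not
the gist itself; the names FooTransformation and RichRDD are illustrative,
and spark-core is assumed on the classpath):

```scala
import org.apache.spark.rdd.RDD

// Hypothetical implicit class adding a foo transformation that turns an
// RDD[T] into an RDD[(T, T)] by pairing each element with itself,
// without modifying Spark's own source.
object FooTransformation {
  implicit class RichRDD[T](rdd: RDD[T]) {
    def foo(): RDD[(T, T)] = rdd.map(x => (x, x))
  }
}
```

With `import FooTransformation._` in scope, `sc.parallelize(Seq(1, 2)).foo()`
would yield an RDD containing `(1, 1)` and `(2, 2)`.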

On 5 December 2016 at 22:33, Thakrar, Jayesh <jthak...@conversantmedia.com>
wrote:

> Teng,
>
>
>
> Before you go down creating your own custom Spark system, do give some
> thought to what Holden and others are suggesting, viz. using implicit
> methods.
>
>
>
> If you want real concrete examples, have a look at the Spark Cassandra
> Connector -
>
>
>
> Here you will see an example of "extending" SparkContext -
> https://github.com/datastax/spark-cassandra-connector/blob/master/doc/2_loading.md
>
>
>
> // validation is deferred, so it is not triggered during rdd creation
> val rdd = sc.cassandraTable[SomeType]("ks", "not_existing_table")
> val emptyRDD = rdd.toEmptyCassandraRDD
>
> val emptyRDD2 = sc.emptyCassandraTable[SomeType]("ks", "not_existing_table")
>
>
>
>
>
> And here you will see an example of "extending" RDD -
> https://github.com/datastax/spark-cassandra-connector/blob/master/doc/5_saving.md
>
>
>
> case class WordCount(word: String, count: Long)
> val collection = sc.parallelize(Seq(WordCount("dog", 50), WordCount("cow", 60)))
> collection.saveToCassandra("test", "words", SomeColumns("word", "count"))
>
>
>
> Hope that helps…
>
> Jayesh
>
>
>
>
>
> *From: *Teng Long <longteng...@gmail.com>
> *Date: *Monday, December 5, 2016 at 3:04 PM
> *To: *Holden Karau <hol...@pigscanfly.ca>, <dev@spark.apache.org>
> *Subject: *Re: Can I add a new method to RDD class?
>
>
>
> Thank you for providing another answer, Holden.
>
>
>
> So I did what Tarun and Michal suggested, and it didn't work out, as I want
> to have a new transformation method on the RDD class itself, and I need to
> use that RDD's SparkContext, which is private. So I guess the only thing I
> can do now is sbt publishLocal?
>
>
>
> On Dec 5, 2016, at 9:19 AM, Holden Karau <hol...@pigscanfly.ca> wrote:
>
>
>
> Doing that requires publishing a custom version of Spark; you can edit the
> version number and do a publishLocal - but maintaining that change is going
> to be difficult. The other approaches suggested are probably better, but
> also does your method need to be defined on the RDD class? Could you
> instead make a helper object or class to expose whatever functionality you
> need?
>
>
>
> On Mon, Dec 5, 2016 at 6:06 PM long <longteng...@gmail.com> wrote:
>
> Thank you very much! But why can’t I just add new methods in to the source
> code of RDD?
>
>
>
> On Dec 5, 2016, at 3:15 AM, Michal Šenkýř [via Apache Spark Developers
> List] <[hidden email]> wrote:
>
>
>
> A simple Scala example of implicit classes:
>
> implicit class EnhancedString(str: String) {
>   def prefix(prefix: String) = prefix + str
> }
>
> println("World".prefix("Hello "))
>
> As Tarun said, you have to import it if it's not in the same class where
> you use it.
>
> Hope this makes it clearer,
>
> Michal Senkyr
>
>
>
> On 5.12.2016 07:43, Tarun Kumar wrote:
>
> Not sure if that's documented in terms of Spark, but this is a fairly
> common pattern in Scala known as the "pimp my library" pattern; you can
> easily find many generic examples of it. If you want, I can quickly cook up
> a short complete example with an RDD (although there is nothing really more
> to it than my example in the earlier mail)? Thanks Tarun Kumar
>
>
>
> On Mon, 5 Dec 2016 at 7:15 AM, long <[hidden email]> wrote:
>
> So is there documentation of this I can refer to?
>
>
>
> On Dec 5, 2016, at 1:07 AM, Tarun Kumar [via Apache Spark Developers List]
> <[hidden email]> wrote:
>
>
>
> Hi Tenglong, In addition to trsell's reply, you can add any method to an
> rdd without making changes to spark code. This can be achieved by using an
> implicit class in your own client code: implicit class extendRDD[T](rdd:
> RDD[T]) { def foo() } Then you basically need to import this implicit class
> in scope 

Re: Can I add a new method to RDD class?

2016-12-04 Thread Tarun Kumar
Not sure if that's documented in terms of Spark, but this is a fairly common
pattern in Scala known as the "pimp my library" pattern; you can easily find
many generic examples of it.

If you want, I can quickly cook up a short complete example with an RDD
(although there is nothing really more to it than my example in the earlier
mail)?

Thanks
Tarun Kumar
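For reference, a short complete example of the "pimp my library" pattern in
plain Scala (no Spark needed; the names StringExtensions, RichString, and
shout are illustrative, not from any library):

```scala
// "Pimp my library": add a method to a type you don't own. The implicit
// class wraps the target type, and the compiler applies the wrapper
// automatically when the new method is called on that type.
object StringExtensions {
  implicit class RichString(val str: String) extends AnyVal {
    def shout: String = str.toUpperCase + "!"
  }
}

object Demo extends App {
  import StringExtensions._   // bring the implicit conversion into scope
  println("hello".shout)      // prints "HELLO!"
}
```

The same shape works for RDD: replace String with RDD[T] and shout with
whatever transformation you need.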

On Mon, 5 Dec 2016 at 7:15 AM, long <longteng...@gmail.com> wrote:

> So is there documentation of this I can refer to?
>
> On Dec 5, 2016, at 1:07 AM, Tarun Kumar [via Apache Spark Developers List]
> <[hidden email]> wrote:
>
> Hi Tenglong,
>
> In addition to trsell's reply, you can add any method to an rdd without
> making changes to spark code.
>
> This can be achieved by using implicit class in your own client code:
>
> implicit class extendRDD[T](rdd: RDD[T]) {
>   def foo() = ()  // stub: fill in your method body
> }
>
> Then you basically need to import this implicit class in scope where you
> want to use the new foo method.
>
> Thanks
> Tarun Kumar
>
> On Mon, 5 Dec 2016 at 6:59 AM, <[hidden email]> wrote:
>
> How does your application fetch the spark dependency? Perhaps list your
> project dependencies and check it's using your dev build.
>
> On Mon, 5 Dec 2016, 08:47 tenglong, <[hidden email]> wrote:
>
> Hi,
>
> Apparently, I've already tried adding a new method to RDD,
>
> for example,
>
> class RDD {
>   def foo() // this is the one I added
>
>   def map()
>
>   def collect()
> }
>
> I can build Spark successfully, but I can't compile my application code
> which calls rdd.foo(), and the error message says
>
> value foo is not a member of org.apache.spark.rdd.RDD[String]
>
> So I am wondering if there is any mechanism that prevents me from doing
> this, or whether there is something I'm doing wrong?
>
>
>
>
> --
> View this message in context:
> http://apache-spark-developers-list.1001551.n3.nabble.com/Can-I-add-a-new-method-to-RDD-class-tp20100.html
> Sent from the Apache Spark Developers List mailing list archive at
> Nabble.com.
>
> -
>
> To unsubscribe e-mail: [hidden email]
>
>
>
> --
> If you reply to this email, your message will be added to the discussion
> below:
>
> http://apache-spark-developers-list.1001551.n3.nabble.com/Can-I-add-a-new-method-to-RDD-class-tp20100p20102.html
> To unsubscribe from Can I add a new method to RDD class?, click here.
>
>
>
> --
> View this message in context: Re: Can I add a new method to RDD class?
> <http://apache-spark-developers-list.1001551.n3.nabble.com/Can-I-add-a-new-method-to-RDD-class-tp20100p20104.html>
> Sent from the Apache Spark Developers List mailing list archive
> <http://apache-spark-developers-list.1001551.n3.nabble.com/> at
> Nabble.com.
>


Re: Can I add a new method to RDD class?

2016-12-04 Thread Tarun Kumar
Hi Tenglong,

In addition to trsell's reply, you can add any method to an rdd without
making changes to spark code.

This can be achieved by using implicit class in your own client code:

implicit class extendRDD[T](rdd: RDD[T]) {
  def foo() = ()  // stub: fill in your method body
}

Then you basically need to import this implicit class in scope where you
want to use the new foo method.

Thanks
Tarun Kumar
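A hedged sketch of how that import and call could look end to end (the
container object RddExtensions and the foo body are hypothetical, and
spark-core is assumed on the classpath):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD

// Hypothetical object holding the implicit class from the mail above.
object RddExtensions {
  implicit class extendRDD[T](rdd: RDD[T]) {
    def foo(): Long = rdd.count()  // stub logic for illustration
  }
}

object FooDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("foo-demo").setMaster("local[*]"))
    import RddExtensions._  // foo is now available on any RDD in this scope
    val n = sc.parallelize(1 to 10).foo()
    println(s"foo returned $n")
    sc.stop()
  }
}
```

Without the import, the compiler reports the same "value foo is not a
member of org.apache.spark.rdd.RDD" error seen earlier in the thread.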

On Mon, 5 Dec 2016 at 6:59 AM, <trs...@gmail.com> wrote:

> How does your application fetch the spark dependency? Perhaps list your
> project dependencies and check it's using your dev build.
>
> On Mon, 5 Dec 2016, 08:47 tenglong, <longteng...@gmail.com> wrote:
>
> Hi,
>
> Apparently, I've already tried adding a new method to RDD,
>
> for example,
>
> class RDD {
>   def foo() // this is the one I added
>
>   def map()
>
>   def collect()
> }
>
> I can build Spark successfully, but I can't compile my application code
> which calls rdd.foo(), and the error message says
>
> value foo is not a member of org.apache.spark.rdd.RDD[String]
>
> So I am wondering if there is any mechanism that prevents me from doing
> this, or whether there is something I'm doing wrong?
>
>
>
>
> --
> View this message in context:
> http://apache-spark-developers-list.1001551.n3.nabble.com/Can-I-add-a-new-method-to-RDD-class-tp20100.html
> Sent from the Apache Spark Developers List mailing list archive at
> Nabble.com.
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>


Re: welcoming Xiao Li as a committer

2016-10-04 Thread Tarun Kumar
Congrats Xiao.

Thanks
Tarun
On Tue, 4 Oct 2016 at 12:57 PM, Cheng Lian  wrote:

> Congratulations!!!
>
>
> Cheng
>
> On Tue, Oct 4, 2016 at 1:46 PM, Reynold Xin  wrote:
>
> Hi all,
>
> Xiao Li, aka gatorsmile, has recently been elected as an Apache Spark
> committer. Xiao has been a super active contributor to Spark SQL. Congrats
> and welcome, Xiao!
>
> - Reynold
>
>
>


Re: Welcoming Felix Cheung as a committer

2016-08-08 Thread Tarun Kumar
Congrats Felix!

Tarun
On Tue, Aug 9, 2016 at 12:57 AM, Timothy Chen  wrote:

> Congrats Felix!
>
> Tim
>
> On Mon, Aug 8, 2016 at 11:15 AM, Matei Zaharia 
> wrote:
> > Hi all,
> >
> > The PMC recently voted to add Felix Cheung as a committer. Felix has
> been a major contributor to SparkR and we're excited to have him join
> officially. Congrats and welcome, Felix!
> >
> > Matei
> > -
> > To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> >
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>