Re: Welcoming Yanbo Liang as a committer

2016-06-03 Thread Bhupendra Mishra
Congratulations, Yanbo!




Re: Welcoming Yanbo Liang as a committer

2016-06-03 Thread Dongjoon Hyun
Wow, Congratulations, Yanbo!

Dongjoon.



Re: Welcoming Yanbo Liang as a committer

2016-06-03 Thread Mridul Muralidharan
Congratulations, Yanbo!

Regards
Mridul



Re: Welcoming Yanbo Liang as a committer

2016-06-03 Thread Xiao Li
Congratulations, Yanbo!



Re: Welcoming Yanbo Liang as a committer

2016-06-03 Thread Nan Zhu
Congratulations!

-- 
Nan Zhu




Re: Welcoming Yanbo Liang as a committer

2016-06-03 Thread Takeshi Yamamuro
congrats!

// maropu



-- 
---
Takeshi Yamamuro


Re: Welcoming Yanbo Liang as a committer

2016-06-03 Thread Ted Yu
Congratulations, Yanbo.



Welcoming Yanbo Liang as a committer

2016-06-03 Thread Matei Zaharia
Hi all,

The PMC recently voted to add Yanbo Liang as a committer. Yanbo has been a 
super active contributor in many areas of MLlib. Please join me in welcoming 
Yanbo!

Matei



Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-03 Thread Mark Hamstra
It's not a question of whether the preview artifacts can be made available
on Maven Central, but rather whether they must be or should be. I've got
no problem leaving these unstable, transitory artifacts out of the more
permanent, canonical repository.



RE: Where is DataFrame.scala in 2.0?

2016-06-03 Thread Gerhard Fiedler
Thanks!




Re: Where is DataFrame.scala in 2.0?

2016-06-03 Thread Michael Malak
It's been reduced to a single line of code.
http://technicaltidbit.blogspot.com/2016/03/dataframedataset-swap-places-in-spark-20.html





Re: Where is DataFrame.scala in 2.0?

2016-06-03 Thread Herman van Hövell tot Westerflier
Hi Gerhard,

DataFrame and Dataset have been merged in Spark 2.0. A DataFrame is now a
Dataset that contains Row objects. We still maintain a type alias for
DataFrame:
https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/package.scala#L45
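
In practice the alias is a single line, so code written against either name
compiles unchanged. A minimal sketch of the type relationship against the
2.0 API (not a complete program):

import org.apache.spark.sql.{DataFrame, Dataset, Row}

// DataFrame is just another name for Dataset[Row], so the two types are
// interchangeable; this identity method compiles against both.
def asDataset(df: DataFrame): Dataset[Row] = df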

HTH

Kind regards,

Herman van Hövell tot Westerflier



Where is DataFrame.scala in 2.0?

2016-06-03 Thread Gerhard Fiedler
When I look at the sources on GitHub, I see DataFrame.scala at
https://github.com/apache/spark/blob/branch-1.6/sql/core/src/main/scala/org/apache/spark/sql/DataFrame.scala
in the 1.6 branch. But when I change the branch to branch-2.0 or master, I
get a 404 error. I also can't find the file in the directory listings, for
example
https://github.com/apache/spark/tree/branch-2.0/sql/core/src/main/scala/org/apache/spark/sql
(for branch-2.0).

It seems that quite a few APIs use the DataFrame class, even in 2.0. Can
someone please point me to its location, or otherwise explain why it is not
there?

Thanks,
Gerhard



Can anyone help merge this pull request about Spark Thrift Server HA?

2016-06-03 Thread 王晓雨
Hi developers!
I submitted a pull request quite a while ago.
It is about the Spark Thrift Server HA issue:
https://github.com/apache/spark/pull/9113
Could someone help review and merge it?
Thanks!


Implementing linear algebra operations in the distributed linalg package

2016-06-03 Thread José Manuel Abuín Mosquera

Hello,

I would like to add some linear algebra operations to all the
DistributedMatrix classes that Spark currently handles (CoordinateMatrix,
BlockMatrix, IndexedRowMatrix and RowMatrix), but first I would like to
ask whether you consider this useful. (For me, it is.)


Of course, these operations will be distributed, but they will rely on
the local implementation of mllib linalg. For example, when multiplying
an IndexedRowMatrix by a DenseVector, the multiplication of each matrix
row by the vector would be performed using the local implementation, as
in the sketch below.
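
To make this concrete, here is a rough sketch of what such an operation
could look like against the current mllib.linalg API. multiplyByVector is
a hypothetical helper, not an existing Spark method, and the hand-rolled
dot product stands in for whatever local kernel would actually be used:

import org.apache.spark.mllib.linalg.DenseVector
import org.apache.spark.mllib.linalg.distributed.{IndexedRow, IndexedRowMatrix}

def multiplyByVector(mat: IndexedRowMatrix, v: DenseVector): DenseVector = {
  // Ship the local vector to every executor once.
  val bv = mat.rows.sparkContext.broadcast(v)
  // Each row's dot product with the vector is computed locally, on the
  // executor holding that row; only (rowIndex, value) pairs come back.
  val entries = mat.rows.map { case IndexedRow(i, row) =>
    val dot = row.toArray.zip(bv.value.toArray).map { case (a, b) => a * b }.sum
    (i, dot)
  }.collect()
  // Assemble the local result vector in row order.
  val result = new Array[Double](mat.numRows().toInt)
  entries.foreach { case (i, x) => result(i.toInt) = x }
  new DenseVector(result)
}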


What is your opinion about it?

Thank you

--
José Manuel Abuín Mosquera
Pre-doctoral researcher
Centro de Investigación en Tecnoloxías da Información (CiTIUS)
University of Santiago de Compostela
15782 Santiago de Compostela, Spain

http://citius.usc.es/equipo/investigadores-en-formacion/josemanuel.abuin
http://jmabuin.github.io





Re: Spark 2.0.0-preview artifacts still not available in Maven

2016-06-03 Thread Steve Loughran

It's been voted on by the project, so it can go up on Central.

There are already some JIRAs being filed against it; that's a metric of
success for a pre-beta artifact.

The risk of exercising the m2 central option is that people may come to
expect that they can point their code at the 2.0.0-preview and then, when
a release comes out, simply update their dependency; that may or may not
be the case. But is it harmful if people do start building and testing
against the preview? If it finds problems early, it can only be a good
thing.


> On 1 Jun 2016, at 23:10, Sean Owen  wrote:
> 
> I'll be more specific about the issue that I think trumps all this,
> which I realize not everyone may have been aware of.
> 
> There was a long and contentious discussion on the PMC about, among
> other things, advertising a "Spark 2.0 preview" from Databricks, such
> as at 
> https://databricks.com/blog/2016/05/11/apache-spark-2-0-technical-preview-easier-faster-and-smarter.html
> 
> That post has already been updated/fixed from an earlier version, but
> part of the resolution was to make a full "2.0.0 preview" release in
> order to continue to be able to advertise it as such. Without it, I
> believe the PMC's conclusion remains that this blog post / product
> announcement is not allowed by ASF policy. Hence, either the product
> announcements need to be taken down and a bunch of wording changed in
> the Databricks product, or, this needs to be a normal release.
> 
> Obviously, it seems far easier to just finish the release per usual. I
> actually didn't realize this had not been offered for download at
> http://spark.apache.org/downloads.html either. It needs to be
> accessible there too.
> 
> 
> We can get back in the weeds about what a "preview" release means,
> but normal voted releases can and even should be alpha/beta
> (http://www.apache.org/dev/release.html). The culture is, in theory, to
> release early and often. I don't buy the argument that it's too old at
> two weeks, when the alternative is having nothing at all to test
> against.
> 
> On Wed, Jun 1, 2016 at 5:02 PM, Michael Armbrust  
> wrote:
>>> I'd think we want less effort, not more, to let people test it? for
>>> example, right now I can't easily try my product build against
>>> 2.0.0-preview.
>> 
>> 
>> I don't feel super strongly one way or the other, so if we need to publish
>> it permanently we can.
>> 
>> However, either way you can still test against this release.  You just need
>> to add a resolver as well (which is how I have always tested packages
>> against RCs). One concern with making it permanent is that this preview release
>> is already fairly far behind branch-2.0, so many of the issues that people
>> might report have already been fixed and that might continue even after the
>> release is made.  I'd rather be able to force upgrades eventually when we
>> vote on the final 2.0 release.
>>
>
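
For concreteness, testing an application build against a staged release
from sbt looks roughly like the following. This is a sketch: the numbered
staging-repository URL is an assumption (each staging repo gets its own
number on repository.apache.org), so substitute the one announced with the
vote.

// build.sbt -- resolve 2.0.0-preview from a staging repo instead of Central
resolvers += "Apache staging" at
  "https://repository.apache.org/content/repositories/orgapachespark-1080/"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.0.0-preview"

Publishing to Central would make the resolver line unnecessary, which is
the convenience being debated above.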

