Re: Upgrade to spark 1.0.x

2014-08-11 Thread Dmitriy Lyubimov
OK. Merging, and dropping the spark-1.0.x branch from apache as well.
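For anyone tracking what the upgrade touches in the build, the core of it is bumping the Spark dependency version. A minimal pom.xml sketch follows; the Maven coordinates are the ones Spark publishes, but the property names and Mahout's actual module layout here are assumptions, not the real pom:

```xml
<properties>
  <!-- was a 0.9.x version before MAHOUT-1603 -->
  <spark.version>1.0.2</spark.version>
  <scala.compat.version>2.10</scala.compat.version>
</properties>

<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.compat.version}</artifactId>
    <version>${spark.version}</version>
  </dependency>
  <!-- only needed if the Spark SQL bits mentioned in the thread are used -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_${scala.compat.version}</artifactId>
    <version>${spark.version}</version>
  </dependency>
</dependencies>
```

Note that spark-sql only exists from Spark 1.0.0 on, which is why it can be picked up "by extension" with this upgrade.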



Re: Upgrade to spark 1.0.x

2014-08-09 Thread Pat Ferrel
+1

Seems like we ought to keep up with the bleeding edge until the next Mahout
release; that's when the pain of upgrading gets spread much wider. In fact,
if Spark moves to Scala 2.11 before our release, we should probably consider
upgrading Scala too.

Re: Upgrade to spark 1.0.x

2014-08-09 Thread Ted Dunning
+1

Until we release a version that uses Spark, we should stay with whatever
helps us. Once a release goes out, tracking whichever version of Spark the
big distros ship becomes more important.





Re: Upgrade to spark 1.0.x

2014-08-09 Thread Peng Cheng

+1

1.0.0 is recommended. Many releases after 1.0.1 have had short test cycles,
and 1.0.2 apparently reverted many fixes because they caused more serious
problems.






Re: Upgrade to spark 1.0.x

2014-08-08 Thread Gokhan Capan
+1 to merging spark-1.0.x to master


 On Aug 8, 2014, at 22:06, Dmitriy Lyubimov dlie...@gmail.com wrote:

 Current master is still at Spark 0.9.x. MAHOUT-1603 (PR #40) makes a
 number of valuable tweaks to enable Spark 1.0.x (and, by extension, Spark
 SQL; in a quick test, SQL seems to work for my simple tests in the Mahout
 environment).

 This squashed PR is pushed to the apache/mahout branch spark-1.0.x rather
 than master. Whenever (if) folks are ready, I can merge it into master.

 An alternative approach would be to maintain both the 1.0.x and 0.9.x
 branches for some time. I don't see that as worthwhile, since the costs
 would likely outweigh any benefit, but if anyone still depends on the
 Spark 0.9.x dependency, please let me know in this thread.

 thanks.
 -d
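The merge flow Dmitriy describes (a squashed feature branch fast-forwarded into the mainline, then dropped) can be exercised end to end in a throwaway repository. Everything below is illustrative: the file name, commit messages, and version strings are assumptions, not Mahout's actual layout.

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email dev@example.invalid
git config user.name dev
main=$(git symbolic-ref --short HEAD)   # "master" or "main", depending on git version

# Mainline pinned to the old Spark dependency.
echo 0.9.1 > spark.version
git add spark.version
git commit -qm 'mainline at Spark 0.9.x'

# The squashed PR: a single commit on a side branch.
git checkout -qb spark-1.0.x
echo 1.0.2 > spark.version
git commit -qam 'MAHOUT-1603: Spark 1.0.x tweaks (squashed)'

# Merge it into the mainline; a single commit on top fast-forwards cleanly.
git checkout -q "$main"
git merge -q --ff-only spark-1.0.x
git branch -qd spark-1.0.x              # drop the branch after merging
cat spark.version                       # prints 1.0.2
```

Because the PR was squashed to one commit, `--ff-only` succeeds with no merge commit; deleting the branch afterwards is safe since its commit is now reachable from the mainline.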


Re: Upgrade to spark 1.0.x

2014-08-08 Thread Ted Dunning
+1 to merge







Re: Upgrade to spark 1.0.x

2014-08-08 Thread Suneel Marthi
+1





Re: Upgrade to spark 1.0.x

2014-08-08 Thread Shannon Quinn

+1
