Re: Spark 1.6.1: Unexpected partition behavior?

2016-06-26 Thread Randy Gelhausen
Sorry, please ignore the above. I now see I called coalesce on a different reference than the one I used to register the table. On Sun, Jun 26, 2016 at 6:34 PM, Randy Gelhausen wrote: > > val enriched_web_logs = sqlContext.sql(""" > select web_logs.datetime, web_logs.node as

Spark 1.6.1: Unexpected partition behavior?

2016-06-26 Thread Randy Gelhausen
val enriched_web_logs = sqlContext.sql(""" select web_logs.datetime, web_logs.node as app_host, source_ip, b.node as source_host, log from web_logs left outer join (select distinct node, address from nodes) b on source_ip = address """)

Re: Spark 1.6.1 packages on S3 corrupt?

2016-04-12 Thread Nicholas Chammas
Yes, this is a known issue. The core devs are already aware of it. [CC dev] FWIW, I believe the Spark 1.6.1 / Hadoop 2.6 package on S3 is not corrupt. It may be the only 1.6.1 package that is not corrupt, though. :/ Nick On Tue, Apr 12, 2016 at 9:00 PM Augustus Hong <augus...@branchmetrics

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-11 Thread Ted Yu
Gentle ping: spark-1.6.1-bin-hadoop2.4.tgz from S3 is still corrupt. On Wed, Apr 6, 2016 at 12:55 PM, Josh Rosen <joshro...@databricks.com> wrote: > Sure, I'll take a look. Planning to do full verification in a bit. > > On Wed, Apr 6, 2016 at 12:54 PM Ted Yu <yuzhih..

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-06 Thread Josh Rosen
Sure, I'll take a look. Planning to do full verification in a bit. On Wed, Apr 6, 2016 at 12:54 PM Ted Yu <yuzhih...@gmail.com> wrote: > Josh: > Can you check spark-1.6.1-bin-hadoop2.4.tgz ? > > $ tar zxf spark-1.6.1-bin-hadoop2.4.tgz > > gzip: stdin: not in gzip form

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-06 Thread Ted Yu
Josh: Can you check spark-1.6.1-bin-hadoop2.4.tgz ? $ tar zxf spark-1.6.1-bin-hadoop2.4.tgz gzip: stdin: not in gzip format tar: Child returned status 1 tar: Error is not recoverable: exiting now $ ls -l !$ ls -l spark-1.6.1-bin-hadoop2.4.tgz -rw-r--r--. 1 hbase hadoop 323614720 Apr 5 19:25

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-06 Thread Nicholas Chammas
Thank you Josh! I confirmed that the Spark 1.6.1 / Hadoop 2.6 package on S3 is now working, and the SHA512 checks out. On Wed, Apr 6, 2016 at 3:19 PM Josh Rosen <joshro...@databricks.com> wrote: > I downloaded the Spark 1.6.1 artifacts from the Apache mirror network and > re-u

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-06 Thread Josh Rosen
I downloaded the Spark 1.6.1 artifacts from the Apache mirror network and re-uploaded them to the spark-related-packages S3 bucket, so hopefully these packages should be fixed now. On Mon, Apr 4, 2016 at 3:37 PM Nicholas Chammas <nicholas.cham...@gmail.com> wrote: > Thanks, that was th

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-04 Thread Nicholas Chammas
gmail.com> wrote: > >> An additional note: The Spark packages being served off of CloudFront > (i.e. > >> the “direct download” option on spark.apache.org) are also corrupt. > >> > >> Btw what’s the correct way to verify the SHA of a Spark package? I’ve > trie

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-04 Thread Jakob Odersky
s the correct way to verify the SHA of a Spark package? I’ve tried >> a few commands on working packages downloaded from Apache mirrors, but I >> can’t seem to reproduce the published SHA for spark-1.6.1-bin-hadoop2.6.tgz. >> >> >> On Mon, Apr 4, 2016 at 11:45 AM Ted Yu <

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-04 Thread Jakob Odersky
’s the correct way to verify the SHA of a Spark package? I’ve tried > a few commands on working packages downloaded from Apache mirrors, but I > can’t seem to reproduce the published SHA for spark-1.6.1-bin-hadoop2.6.tgz. > > > On Mon, Apr 4, 2016 at 11:45 AM Ted Yu <yuzhih...@gmail.com&

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-04 Thread Nicholas Chammas
can’t seem to reproduce the published SHA for spark-1.6.1-bin-hadoop2.6.tgz <http://www.apache.org/dist/spark/spark-1.6.1/spark-1.6.1-bin-hadoop2.6.tgz.sha> . ​ On Mon, Apr 4, 2016 at 11:45 AM Ted Yu <yuzhih...@gmail.com> wrote: > Maybe temporarily take out the artifacts on S3 before
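
Several messages in this thread ask how to verify the published SHA. A minimal, hedged sketch of one way to do it — the filenames and file contents below are stand-ins fabricated for a self-contained demo (in practice the `.tgz` and `.sha` would be downloaded from the mirrors), and whitespace in the published digest is normalized first, since older Apache `.sha` files sometimes wrapped the digest across lines:

```shell
# Sketch: verify a tarball against a published SHA-512 digest file.
# Demo only: we fabricate both the "tarball" and its published checksum.
set -eu

tarball=spark-1.6.1-bin-hadoop2.6.tgz
printf 'stand-in archive contents' > "$tarball"
sha512sum "$tarball" | awk '{print $1}' > "$tarball.sha"

# Normalize the published digest (strip whitespace, lowercase) and compare.
published=$(tr -d ' \t\n' < "$tarball.sha" | tr '[:upper:]' '[:lower:]')
actual=$(sha512sum "$tarball" | awk '{print $1}')

if [ "$published" = "$actual" ]; then
  echo "checksum OK"
else
  echo "checksum MISMATCH" >&2
  exit 1
fi
```

If the locally computed digest does not match the published one even after normalization, the download itself is the likely culprit — which is consistent with what this thread eventually found.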

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-04 Thread Ted Yu
e packages? It's still a problem. >> >> Also, it would be good to understand why this is happening. >> >> On Fri, Mar 18, 2016 at 6:49 PM Jakob Odersky <ja...@odersky.com> wrote: >> >>> I just realized you're using a different download site. Sorry for

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-04 Thread Kousuke Saruta
>> wrote: Oh, I overlooked that. Thanks. Kousuke On 2016/04/04 22:58, Nicholas Chammas wrote: This is still an issue. The Spark 1.6.1 packages on S3 are corrupt. Is anyone looking into this issue? Is there anything contributors can do to help solve this problem? Nick

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-04 Thread Jitendra Shelar
We can think of using checksums for this kind of issue. On Mon, Apr 4, 2016 at 8:32 PM, Kousuke Saruta <saru...@oss.nttdata.co.jp> wrote: > Oh, I overlooked that. Thanks. > > Kousuke > > > On 2016/04/04 22:58, Nicholas Chammas wrote: > > This is still an issue.

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-04 Thread Kousuke Saruta
Oh, I overlooked that. Thanks. Kousuke On 2016/04/04 22:58, Nicholas Chammas wrote: This is still an issue. The Spark 1.6.1 packages on S3 are corrupt. Is anyone looking into this issue? Is there anything contributors can do to help solve this problem? Nick On Sun, Mar 27, 2016 at 8:49

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-04-04 Thread Nicholas Chammas
This is still an issue. The Spark 1.6.1 packages on S3 are corrupt. Is anyone looking into this issue? Is there anything contributors can do to help solve this problem? Nick On Sun, Mar 27, 2016 at 8:49 PM Nicholas Chammas <nicholas.cham...@gmail.com> wrote: > Pingity-ping-p

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-27 Thread Nicholas Chammas
On Fri, Mar 18, 2016 at 6:49 PM Jakob Odersky <ja...@odersky.com> wrote: >>> >>>> I just realized you're using a different download site. Sorry for the >>>> confusion, the link I get for a direct download of Spark 1.6.1 / >>>> Hadoop 2.6 is >>>>

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-24 Thread Michael Armbrust
it would be good to understand why this is happening. >> >> On Fri, Mar 18, 2016 at 6:49 PM Jakob Odersky <ja...@odersky.com> wrote: >> >>> I just realized you're using a different download site. Sorry for the >>> confusion, the link I get for a direct downlo

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-24 Thread Nicholas Chammas
're using a different download site. Sorry for the >> confusion, the link I get for a direct download of Spark 1.6.1 / >> Hadoop 2.6 is >> http://d3kbcqa49mib13.cloudfront.net/spark-1.6.1-bin-hadoop2.6.tgz >> >> On Fri, Mar 18, 2016 at 3:20 PM, Nicholas Chammas >> <ni

Re: error occurs to compile spark 1.6.1 using scala 2.11.8

2016-03-22 Thread Ted Yu
the > attached. I fired the build process by clicking "Rebuild Project" in > "Build" menu in IDEA IDE. > > more info here: > Spark 1.6.1 + scala 2.11.8 + IDEA 15.0.3 + Maven 3.3.3 > > I can build spark 1.6.1 with scala 2.10.4 su

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-21 Thread Nicholas Chammas
> confusion, the link I get for a direct download of Spark 1.6.1 / > Hadoop 2.6 is > http://d3kbcqa49mib13.cloudfront.net/spark-1.6.1-bin-hadoop2.6.tgz > > On Fri, Mar 18, 2016 at 3:20 PM, Nicholas Chammas > <nicholas.cham...@gmail.com> wrote: > > I just retried the Spark 1.6.1

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-20 Thread Nicholas Chammas
I'm seeing the same. :( On Fri, Mar 18, 2016 at 10:57 AM Ted Yu <yuzhih...@gmail.com> wrote: > I tried again this morning : > > $ wget > https://s3.amazonaws.com/spark-related-packages/spark-1.6.1-bin-hadoop2.6.tgz > --2016-03-18 07:55:30-- > https://s3.amazonaws.com/spar

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-19 Thread Ted Yu
I tried again this morning : $ wget https://s3.amazonaws.com/spark-related-packages/spark-1.6.1-bin-hadoop2.6.tgz --2016-03-18 07:55:30-- https://s3.amazonaws.com/spark-related-packages/spark-1.6.1-bin-hadoop2.6.tgz Resolving s3.amazonaws.com... 54.231.19.163 ... $ tar zxf spark-1.6.1-bin

Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-19 Thread Nicholas Chammas
https://s3.amazonaws.com/spark-related-packages/spark-1.6.1-bin-hadoop2.6.tgz Does anyone else have trouble unzipping this? How did this happen? What I get is: $ gzip -t spark-1.6.1-bin-hadoop2.6.tgz gzip: spark-1.6.1-bin-hadoop2.6.tgz: unexpected end of file gzip: spark-1.6.1-bin-hadoop2.6.tgz
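
The failure mode reported above (a truncated gzip stream) can be reproduced and detected locally before extraction. A minimal sketch, using stand-in files rather than the real S3 artifacts:

```shell
# Sketch: detect a truncated .tgz before extracting it.
set -eu

printf 'hello\n' > payload.txt
tar czf good.tgz payload.txt

# Simulate the corrupt S3 artifact by cutting the archive off mid-stream.
head -c 20 good.tgz > corrupt.tgz

gzip -t good.tgz && echo "good.tgz: OK"
if gzip -t corrupt.tgz 2>/dev/null; then
  echo "corrupt.tgz unexpectedly passed" >&2
  exit 1
else
  echo "corrupt.tgz: failed integrity test (truncated stream)"
fi
```

`gzip -t` exits nonzero without writing any output file, so it is a cheap sanity check to run on a download before `tar zxf`.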

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-19 Thread Ted Yu
On Linux, I got: $ tar zxf spark-1.6.1-bin-hadoop2.6.tgz gzip: stdin: unexpected end of file tar: Unexpected EOF in archive tar: Unexpected EOF in archive tar: Error is not recoverable: exiting now On Wed, Mar 16, 2016 at 5:15 PM, Nicholas Chammas < nicholas.cham...@gmail.com> wrote: >

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-19 Thread Nicholas Chammas
euploaded the artifacts, so it should be fixed now. > On Mar 16, 2016 5:48 PM, "Nicholas Chammas" <nicholas.cham...@gmail.com> > wrote: > >> Looks like the other packages may also be corrupt. I’m getting the same >> error for the Spark 1.6.1 / Hadoop 2.4 package.

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-19 Thread Nicholas Chammas
Looks like the other packages may also be corrupt. I’m getting the same error for the Spark 1.6.1 / Hadoop 2.4 package. https://s3.amazonaws.com/spark-related-packages/spark-1.6.1-bin-hadoop2.4.tgz Nick ​ On Wed, Mar 16, 2016 at 8:28 PM Ted Yu <yuzhih...@gmail.com> wrote: > On Lin

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-19 Thread Ted Yu
Same with hadoop 2.3 tar ball: $ tar zxf spark-1.6.1-bin-hadoop2.3.tgz gzip: stdin: unexpected end of file tar: Unexpected EOF in archive tar: Unexpected EOF in archive tar: Error is not recoverable: exiting now On Wed, Mar 16, 2016 at 5:47 PM, Nicholas Chammas < nicholas.cham...@gmail.

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-18 Thread Michael Armbrust
Patrick reuploaded the artifacts, so it should be fixed now. On Mar 16, 2016 5:48 PM, "Nicholas Chammas" <nicholas.cham...@gmail.com> wrote: > Looks like the other packages may also be corrupt. I’m getting the same > error for the Spark 1.6.1 / Hadoop 2.4 package. > >

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-18 Thread Jakob Odersky
> I'm seeing the same. :( > > On Fri, Mar 18, 2016 at 10:57 AM Ted Yu <yuzhih...@gmail.com> wrote: >> >> I tried again this morning : >> >> $ wget >> https://s3.amazonaws.com/spark-related-packages/spark-1.6.1-bin-hadoop2.6.tgz >> --2016-03-18 07:5

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-18 Thread Nicholas Chammas
I just retried the Spark 1.6.1 / Hadoop 2.6 download and got a corrupt ZIP file. Jakob, are you sure the ZIP unpacks correctly for you? Is it the same Spark 1.6.1/Hadoop 2.6 package you had a success with? On Fri, Mar 18, 2016 at 6:11 PM Jakob Odersky <ja...@odersky.com> wrote:

Re: Spark 1.6.1 Hadoop 2.6 package on S3 corrupt?

2016-03-18 Thread Jakob Odersky
I just realized you're using a different download site. Sorry for the confusion, the link I get for a direct download of Spark 1.6.1 / Hadoop 2.6 is http://d3kbcqa49mib13.cloudfront.net/spark-1.6.1-bin-hadoop2.6.tgz On Fri, Mar 18, 2016 at 3:20 PM, Nicholas Chammas <nicholas.cham...@gmail.

[ANNOUNCE] Announcing Spark 1.6.1

2016-03-10 Thread Michael Armbrust
Spark 1.6.1 is a maintenance release containing stability fixes. This release is based on the branch-1.6 maintenance branch of Spark. We *strongly recommend* all 1.6.0 users to upgrade to this release. Notable fixes include: - Workaround for OOM when writing large partitioned tables SPARK-12546

[RESULT] [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-09 Thread Michael Armbrust
scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253) at org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.run(DockerJDBCIntegrationSuite.scala:58) ...

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-09 Thread Michael Armbrust
... Cause: com.spotify.docker.client.shaded.javax.ws.rs.ProcessingException: java.io.IOException: No such file or directory at

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-09 Thread Kousuke Saruta
3+1 PMC votes are cast. [ ] +1 Release this package as Apache Spark 1.6.1 [ ] -1 Do not release this package because ... To learn more about Apache Spark, please s

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-08 Thread Burak Yavuz
at org.glassfish.jersey.apache.connector.ApacheConnector.apply(ApacheConnector.java:481) at org.glassfish.jersey.apache.connector.ApacheConnector$1.run(ApacheConnector.java:491)

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-08 Thread Andrew Or
ava:262) at jersey.repackaged.com.google.common.util.concurrent.MoreExecutors$DirectExecutorService.execute(MoreExecutors.java:299) at java.util.concurrent.AbstractExecutorService.submit(

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-08 Thread Yin Huai
java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:110) at jersey.repackaged.com.google.common.util.concurrent.AbstractListeningExecutorService.submit(AbstractListeningExecutorService.java:50)

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-07 Thread Reynold Xin
isteningExecutorService.submit(AbstractListeningExecutorService.java:50) at jersey.repackaged.com.google.common.util.concurrent.AbstractListeningExecutorService.submit(AbstractListeningExecutorService.java:37) at org.gla

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-06 Thread Egor Pahomov
ubmit(AbstractListeningExecutorService.java:37) at org.glassfish.jersey.apache.connector.ApacheConnector.apply(ApacheConnector.java:487) at org.glassfish.jersey.client.ClientRuntime$2.run(ClientRuntime.java:177) ...

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-03 Thread Sean Owen
FWIW I was running this with OpenJDK 1.8.0_66 On Thu, Mar 3, 2016 at 7:43 PM, Tim Preece wrote: > Regarding the failure in > org.apache.spark.streaming.kafka.DirectKafkaStreamSuite, "offset recovery" > > We have been seeing the very same problem with the IBM JDK for quite a

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-03 Thread Tim Preece
Regarding the failure in org.apache.spark.streaming.kafka.DirectKafkaStreamSuite, "offset recovery": We have been seeing the very same problem with the IBM JDK for quite a long time (since at least July 2015). It is intermittent and we had dismissed it as a testcase problem. -- View this

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-03 Thread Sean Owen
1 Release this package as Apache Spark 1.6.1 >> [ ] -1 Do not release this package because ... >> >> To learn more about Apache Spark, please see http://spark.apache.org/ >> >> The tag to be voted on is v1.6.1-rc1 >> (15de51c238a7340fa81cb0b80d029a05d97bfc5c)

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-03 Thread Yin Yang
il Saturday, March 5, 2016 at 20:00 UTC and passes if > a majority of at least 3 +1 PMC votes are cast. > > [ ] +1 Release this package as Apache Spark 1.6.1 > [ ] -1 Do not release this package because ... > > To learn more about Apache Spark, please see http://spark.apache.org/

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-03 Thread Tim Preece
I just created the following pull request (against master, but I would like it on 1.6.1) for the isolated classloader fix (SPARK-13648): https://github.com/apache/spark/pull/11495 -- View this message in context:

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-03 Thread Tim Preece
I have been testing 1.6.1 RC1 using the IBM Java SDK. I notice a problem (with the org.apache.spark.sql.hive.client.VersionsSuite tests) after a recent Spark 1.6.1 change. Pull request - https://github.com/apache/spark/commit/f7898f9e2df131fa78200f6034508e74a78c2a44 The change introduced

Re: [VOTE] Release Apache Spark 1.6.1 (RC1)

2016-03-02 Thread Mark Hamstra
+1 PMC votes are cast. > > [ ] +1 Release this package as Apache Spark 1.6.1 > [ ] -1 Do not release this package because ... > > To learn more about Apache Spark, please see http://spark.apache.org/ > > The tag to be voted on is *v1.6.1-rc1 > (15de51c238a7340fa81cb0b80d029a0

Re: Spark 1.6.1

2016-02-26 Thread Josh Rosen
I updated the release packaging scripts to use SFTP via the *lftp* client: https://github.com/apache/spark/pull/11350 I'm starting the process of cutting a 1.6.1-RC1 tag and release artifacts right now, so please be extra careful about merging into branch-1.6 until after the release. Once the RC

Re: Spark 1.6.1

2016-02-24 Thread Yin Yang
Have you tried using scp? scp file i...@people.apache.org Thanks On Wed, Feb 24, 2016 at 5:04 PM, Michael Armbrust wrote: > Unfortunately I don't think that's sufficient as they don't seem to support > sftp in the same way they did before. We'll still need to update

Re: Spark 1.6.1

2016-02-24 Thread Michael Armbrust
Unfortunately I don't think that's sufficient as they don't seem to support sftp in the same way they did before. We'll still need to update our release scripts. On Wed, Feb 24, 2016 at 2:09 AM, Yin Yang wrote: > Looks like access to people.apache.org has been restored. > >

Re: Spark 1.6.1

2016-02-24 Thread Yin Yang
Looks like access to people.apache.org has been restored. FYI On Mon, Feb 22, 2016 at 10:07 PM, Luciano Resende wrote: > > > On Mon, Feb 22, 2016 at 9:08 PM, Michael Armbrust > wrote: > >> An update: people.apache.org has been shut down so the

Re: Spark 1.6.1

2016-02-22 Thread Reynold Xin
Yes, we don't want to clutter maven central. The staging repo is included in the release candidate voting thread. See the following for an example: http://apache-spark-developers-list.1001551.n3.nabble.com/VOTE-Release-Apache-Spark-1-6-0-RC1-td15424.html On Mon, Feb 22, 2016 at 11:37 PM, Romi

Re: Spark 1.6.1

2016-02-22 Thread Romi Kuntsman
Sounds fair. Is it to avoid cluttering maven central with too many intermediate versions? What do I need to add in my pom.xml section to make it work? *Romi Kuntsman*, *Big Data Engineer* http://www.totango.com On Tue, Feb 23, 2016 at 9:34 AM, Reynold Xin wrote: > We
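
For the pom.xml question above: Reynold's reply in this digest notes that the staging repo is hosted by the ASF and linked from each RC vote thread. As a hedged illustration only — the repository id and URL below are placeholders, since the actual per-RC staging URL (of the form `orgapachespark-NNNN`) is posted in the vote email — the consumer-side addition might look like:

```xml
<!-- Hypothetical: point Maven at the per-RC staging repository.
     Replace the URL with the one from the RC vote thread. -->
<repositories>
  <repository>
    <id>spark-rc-staging</id>
    <url>https://repository.apache.org/content/repositories/orgapachespark-NNNN/</url>
  </repository>
</repositories>
```

Because the staging repo is transient, builds pinned to an RC this way stop resolving once the repo is dropped after the vote — one reason RCs are kept out of Maven Central.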

Re: Spark 1.6.1

2016-02-22 Thread Reynold Xin
We usually publish to a staging maven repo hosted by the ASF (not maven central). On Mon, Feb 22, 2016 at 11:32 PM, Romi Kuntsman wrote: > Is it possible to make RC versions available via Maven? (many projects do > that) > That will make integration much easier, so many more

Re: Spark 1.6.1

2016-02-22 Thread Romi Kuntsman
Is it possible to make RC versions available via Maven? (many projects do that) That will make integration much easier, so many more people can test the version before the final release. Thanks! *Romi Kuntsman*, *Big Data Engineer* http://www.totango.com On Tue, Feb 23, 2016 at 8:07 AM, Luciano

Re: Spark 1.6.1

2016-02-22 Thread Luciano Resende
On Mon, Feb 22, 2016 at 9:08 PM, Michael Armbrust wrote: > An update: people.apache.org has been shut down so the release scripts > are broken. Will try again after we fix them. > > If you skip uploading to people.a.o, it should still be available in nexus for review.

Re: Spark 1.6.1

2016-02-22 Thread Michael Armbrust
An update: people.apache.org has been shut down so the release scripts are broken. Will try again after we fix them. On Mon, Feb 22, 2016 at 6:28 PM, Michael Armbrust wrote: > I've kicked off the build. Please be extra careful about merging into > branch-1.6 until after

Re: Spark 1.6.1

2016-02-22 Thread Michael Armbrust
I've kicked off the build. Please be extra careful about merging into branch-1.6 until after the release. On Mon, Feb 22, 2016 at 10:24 AM, Michael Armbrust wrote: > I will cut the RC today. Sorry for the delay! > > On Mon, Feb 22, 2016 at 5:19 AM, Patrick Woody

Re: Spark 1.6.1

2016-02-22 Thread Michael Armbrust
I will cut the RC today. Sorry for the delay! On Mon, Feb 22, 2016 at 5:19 AM, Patrick Woody wrote: > Hey Michael, > > Any update on a first cut of the RC? > > Thanks! > -Pat > > On Mon, Feb 15, 2016 at 6:50 PM, Michael Armbrust > wrote: > >>

Re: Spark 1.6.1

2016-02-22 Thread Patrick Woody
Hey Michael, Any update on a first cut of the RC? Thanks! -Pat On Mon, Feb 15, 2016 at 6:50 PM, Michael Armbrust wrote: > I'm not going to be able to do anything until after the Spark Summit, but > I will kick off RC1 after that (end of week). Get your patches in

Re: Spark 1.6.1

2016-02-15 Thread Michael Armbrust
I'm not going to be able to do anything until after the Spark Summit, but I will kick off RC1 after that (end of week). Get your patches in before then! On Sat, Feb 13, 2016 at 4:57 PM, Jong Wook Kim wrote: > Is 1.6.1 going to be ready this week? I see that the two last

Re: Spark 1.6.1

2016-02-13 Thread Jong Wook Kim
Is 1.6.1 going to be ready this week? I see that the two last unresolved issues targeting 1.6.1 are fixed now. On 3 February 2016 at 08:16, Daniel Darabos <daniel.dara...@lynxanalytics.com> wrote: > > On

Re: Spark 1.6.1

2016-02-03 Thread Steve Loughran
abricks.com> Cc: Hamel Kothari <hamelkoth...@gmail.com>, Ted Yu <yuzhih...@gmail.com>, "dev@spark.apache.org" <dev@spark.apache.org>

Re: Spark 1.6.1

2016-02-03 Thread Daniel Darabos
On Tue, Feb 2, 2016 at 7:10 PM, Michael Armbrust wrote: > What about the memory leak bug? >> https://issues.apache.org/jira/browse/SPARK-11293 >> Even after the memory rewrite in 1.6.0, it still happens in some cases. >> Will it be fixed for 1.6.1? >> > > I think we have

Re: Spark 1.6.1

2016-02-02 Thread Mingyu Kim
"dev@spark.apache.org" <dev@spark.apache.org>, Punya Biswal <pbis...@palantir.com>, Robert Kruszewski <robe...@palantir.com> Subject: Re: Spark 1.6.1 I'm waiting for a few last fixes to be merged. Hoping to cut an RC in the next few days. On Tue, Feb 2, 2016 a

Re: Spark 1.6.1

2016-02-02 Thread Mingyu Kim
i <hamelkoth...@gmail.com>, Ted Yu <yuzhih...@gmail.com>, "dev@spark.apache.org" <dev@spark.apache.org> Subject: Re: Spark 1.6.1 Hi Michael, What about the memory leak bug? https://issues.apache.org/jira/browse/SPARK-11293

Re: Spark 1.6.1

2016-02-02 Thread Michael Armbrust
ming along. Thanks! > > Mingyu > > From: Romi Kuntsman <r...@totango.com> > Date: Tuesday, February 2, 2016 at 3:16 AM > To: Michael Armbrust <mich...@databricks.com> > Cc: Hamel Kothari <hamelkoth...@gmail.com>, Ted Yu <yuzhih...@gmail.com>, > "dev@spa

Re: Spark 1.6.1

2016-02-02 Thread Michael Armbrust
> > What about the memory leak bug? > https://issues.apache.org/jira/browse/SPARK-11293 > Even after the memory rewrite in 1.6.0, it still happens in some cases. > Will it be fixed for 1.6.1? > I think we have enough issues queued up that I would not hold the release for that, but if there is a

Re: Spark 1.6.1

2016-02-02 Thread Romi Kuntsman
>>> SPARK-12624 has been resolved. >>> According to Wenchen, SPARK-12783 is fixed in 1.6.0 release. >>> >>> Are there other blockers for Spark 1.6.1 ? >>> >>> Thanks >>> >>> On Wed, Jan 13, 2016 at 5:39 PM, Michael Armbrust

Re: Spark 1.6.1

2016-02-01 Thread Michael Armbrust
. It should be fully backwards > compatible according to the Jackson folks. > > On Mon, Feb 1, 2016 at 10:29 AM Ted Yu <yuzhih...@gmail.com> wrote: > >> SPARK-12624 has been resolved. >> According to Wenchen, SPARK-12783 is fixed in 1.6.0 release. >> >> Are

Re: Spark 1.6.1

2016-02-01 Thread Hamel Kothari
g to Wenchen, SPARK-12783 is fixed in 1.6.0 release. > > Are there other blockers for Spark 1.6.1 ? > > Thanks > > On Wed, Jan 13, 2016 at 5:39 PM, Michael Armbrust <mich...@databricks.com> > wrote: > >> Hey All, >> >> While I'm not aware of any c

Re: Spark 1.6.1

2016-01-29 Thread Michael Armbrust
I think this is fixed in branch-1.6 already. If you can reproduce it there, can you please open a JIRA and ping me? On Fri, Jan 29, 2016 at 12:16 PM, deenar <deenar.toras...@thinkreactive.co.uk> wrote: > Hi Michael > > The Dataset aggregators do not appear to support complex Spark-SQL types. I

RE: Spark 1.6.1

2016-01-25 Thread Ewan Leith
that, and you can embed the same code in your own packages, outside of the main Spark releases. Thanks, Ewan -Original Message- From: BrandonBradley [mailto:bradleytas...@gmail.com] Sent: 22 January 2016 14:29 To: dev@spark.apache.org Subject: Re: Spark 1.6.1 I'd like more complete

Re: Spark 1.6.1

2016-01-22 Thread BrandonBradley
I'd like more complete Postgres JDBC support for ArrayType before the next release. Some of them are still broken in 1.6.0. It would save me much time. Please see SPARK-12747 @ https://issues.apache.org/jira/browse/SPARK-12747 Cheers! Brandon Bradley -- View this message in context:

Spark 1.6.1

2016-01-13 Thread Michael Armbrust
Hey All, While I'm not aware of any critical issues with 1.6.0, there are several corner cases that users are hitting with the Dataset API that are fixed in branch-1.6. As such I'm considering a 1.6.1 release. At the moment there are only two critical issues targeted for 1.6.1: - SPARK-12624 -