Re: [VOTE] Apache Spark 2.1.0 (RC2)

2016-12-13 Thread Reynold Xin
I'm going to -1 this myself: https://issues.apache.org/jira/browse/SPARK-18856 On Thu, Dec 8, 2016 at 12:39 AM, Reynold Xin wrote: > Please vote on releasing the following candidate as Apache Spark version > 2.1.0. The

Re: [VOTE] Apache Spark 2.1.0 (RC2)

2016-12-13 Thread Adam Roberts
d in RC3 we'd also have the [SPARK-18091] fix included to prevent a test's generated code exceeding the 64k constant pool size limit. From: akchin <akc...@us.ibm.com> To: dev@spark.apache.org Date: 13/12/2016 19:51 Subject: Re: [VOTE] Apache Spark 2.1.0 (RC2) Hello, I a

Re: [VOTE] Apache Spark 2.1.0 (RC2)

2016-12-13 Thread akchin
Hello, I am seeing this error as well, except during "define case class and create Dataset together with paste mode *** FAILED ***". Starts throwing OOM and GC errors after running for several minutes. - Alan Chin IBM Spark Technology Center

Re: [VOTE] Apache Spark 2.1.0 (RC2)

2016-12-12 Thread Marcelo Vanzin
Another failing test is "ReplSuite:should clone and clean line object in ClosureCleaner". It never passes for me, just keeps spinning until the JVM eventually starts throwing OOM errors. Anyone seeing that? On Thu, Dec 8, 2016 at 12:39 AM, Reynold Xin wrote: > Please vote on

Re: [VOTE] Apache Spark 2.1.0 (RC2)

2016-12-12 Thread Yin Huai
-1 I hit https://issues.apache.org/jira/browse/SPARK-18816, which prevents the executor page from showing the log links if an application does not have executors initially. On Mon, Dec 12, 2016 at 3:02 PM, Marcelo Vanzin wrote: > Actually this is not a simple pom change. The

Re: [VOTE] Apache Spark 2.1.0 (RC2)

2016-12-12 Thread Marcelo Vanzin
Actually this is not a simple pom change. The code in UDFRegistration.scala calls this method: if (returnType == null) { returnType = JavaTypeInference.inferDataType(TypeToken.of(udfReturnType))._1 } Because we shade guava, it's generally not very safe to call
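The shading hazard described above can be illustrated with a small, self-contained Java sketch. The names here (`Lib`, `Original`, `Shaded`, `inferDataType`) are hypothetical stand-ins, not Spark's actual classes: after relocation, the method's signature refers to the shaded class, so a caller compiled against the original guava type can no longer find the method. Reflection's `NoSuchMethodException` is used as an analog of the linker's `NoSuchMethodError` seen in JavaUDFSuite.

```java
public class ShadingDemo {
    // Stand-in for the original guava type (e.g. com.google.common.reflect.TypeToken).
    static class Original {}
    // Stand-in for the same type after shade-plugin relocation.
    static class Shaded {}

    // Stand-in for a library compiled with shading applied: its method
    // signature mentions Shaded, not Original.
    static class Lib {
        public static String inferDataType(Shaded t) { return "inferred"; }
    }

    // A caller built against the *unshaded* API asks for inferDataType(Original)
    // and fails, mirroring the NoSuchMethodError in the test run above.
    public static String lookupResult() {
        try {
            Lib.class.getMethod("inferDataType", Original.class);
            return "found";
        } catch (NoSuchMethodException e) {
            return "NoSuchMethodException";
        }
    }

    public static void main(String[] args) {
        System.out.println(lookupResult());
    }
}
```

This is why calling shaded-guava-typed methods from code compiled against public guava is generally unsafe: the two signatures are different methods at the bytecode level.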

Re: [VOTE] Apache Spark 2.1.0 (RC2)

2016-12-12 Thread Marcelo Vanzin
I'm running into this when building / testing on 1.7 (haven't tried 1.8): udf3Test(test.org.apache.spark.sql.JavaUDFSuite) Time elapsed: 0.079 sec <<< ERROR! java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.JavaTypeInference$.inferDataType(Lcom/google/common/reflect/TypeToken;)Lsc

Re: [VOTE] Apache Spark 2.1.0 (RC2)

2016-12-09 Thread Sean Owen
Sure, it's only an issue insofar as it may be a flaky test. If it's fixable or disable-able for a possible next RC that could be helpful. On Sat, Dec 10, 2016 at 2:09 AM Shixiong(Ryan) Zhu wrote: > Sean, "stress test for failOnDataLoss=false" is because Kafka consumer >

Re: [VOTE] Apache Spark 2.1.0 (RC2)

2016-12-09 Thread Cody Koeninger
Agree that frequent topic deletion is not a very Kafka-esque thing to do On Fri, Dec 9, 2016 at 12:09 PM, Shixiong(Ryan) Zhu wrote: > Sean, "stress test for failOnDataLoss=false" is because Kafka consumer may > be thrown NPE when a topic is deleted. I added some logic to

Re: [VOTE] Apache Spark 2.1.0 (RC2)

2016-12-09 Thread Shixiong(Ryan) Zhu
Sean, "stress test for failOnDataLoss=false" is because the Kafka consumer may throw an NPE when a topic is deleted. I added some logic to retry on such a failure; however, it may still fail when topic deletion is too frequent (the stress test). Just reopened
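The retry idea described above can be sketched as a small Java helper (a minimal illustration, not Spark's actual Kafka source code; `withRetries` and `MAX_RETRIES` are invented names): retry the fetch a bounded number of times when an NPE surfaces, then give up, which is why very frequent topic deletion can still fail the stress test.

```java
import java.util.concurrent.Callable;

public class RetryOnNpe {
    static final int MAX_RETRIES = 3;

    // Retry fetchOnce up to MAX_RETRIES times when it throws an NPE
    // (e.g. the topic was deleted mid-poll); rethrow once retries run out.
    static <T> T withRetries(Callable<T> fetchOnce) throws Exception {
        NullPointerException last = null;
        for (int i = 0; i < MAX_RETRIES; i++) {
            try {
                return fetchOnce.call();
            } catch (NullPointerException npe) {
                last = npe; // transient: topic deleted; try again
            }
        }
        throw last; // deletions too frequent; surface the failure
    }

    public static void main(String[] args) throws Exception {
        final int[] attempts = {0};
        // Simulated consumer: fails twice with an NPE, then succeeds.
        String result = withRetries(() -> {
            if (++attempts[0] < 3) throw new NullPointerException("topic deleted");
            return "records";
        });
        System.out.println(result + " after " + attempts[0] + " attempts");
    }
}
```

With deletion happening on every attempt, all MAX_RETRIES calls throw and the NPE propagates, which is the flaky behavior the stress test exercises.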

Re: [VOTE] Apache Spark 2.1.0 (RC2)

2016-12-09 Thread Sean Owen
As usual, the sigs / hashes are fine and licenses look fine. I am still seeing some test failures. A few I've seen over time and aren't repeatable, but a few seem persistent. Anyone else observed these? I'm on Ubuntu 16 / Java 8 building for -Pyarn -Phadoop-2.7 -Phive If anyone can confirm I'll

Re: [VOTE] Apache Spark 2.1.0 (RC2)

2016-12-09 Thread Reynold Xin
I uploaded a new one: https://repository.apache.org/content/repositories/orgapachespark-1219/ On Thu, Dec 8, 2016 at 11:42 PM, Prashant Sharma wrote: > I am getting 404 for Link https://repository.apache.org/content/repositories/orgapachespark-1217. > > --Prashant > >

Re: [VOTE] Apache Spark 2.1.0 (RC2)

2016-12-08 Thread Prashant Sharma
I am getting 404 for Link https://repository.apache.org/content/repositories/orgapachespark-1217. --Prashant On Fri, Dec 9, 2016 at 10:43 AM, Michael Allman wrote: > I believe https://github.com/apache/spark/pull/16122 needs to be included > in Spark 2.1. It's a simple

Re: [VOTE] Apache Spark 2.1.0 (RC2)

2016-12-08 Thread Michael Allman
I believe https://github.com/apache/spark/pull/16122 needs to be included in Spark 2.1. It's a simple bug fix to some functionality that is introduced in 2.1. Unfortunately, it's been manually verified only. There's no unit test that covers it, and

Re: [VOTE] Apache Spark 2.1.0 (RC2)

2016-12-08 Thread Shivaram Venkataraman
+0 I am not sure how much of a problem this is, but the pip packaging seems to have changed the size of the hadoop-2.7 artifact. As you can see in http://people.apache.org/~pwendell/spark-releases/spark-2.1.0-rc2-bin/, the Hadoop 2.7 build is 359M, almost double the size of the other Hadoop