Regarding the failure in
org.apache.spark.streaming.kafka.DirectKafkaStreamSuite "offset recovery":
We have been seeing the very same problem with the IBM JDK for quite a long
time (since at least July 2015).
It is intermittent, and we had dismissed it as a test-case problem.
I just created the following pull request (against master, but I would like
it on 1.6.1) for the isolated classloader fix (SPARK-13648):
https://github.com/apache/spark/pull/11495
I have been testing 1.6.1 RC1 using the IBM Java SDK.
I noticed a problem (with the org.apache.spark.sql.hive.client.VersionsSuite
tests) after a recent Spark 1.6.1 change.
Pull request -
https://github.com/apache/spark/commit/f7898f9e2df131fa78200f6034508e74a78c2a44
The change introduced a
So if Spark does not support heterogeneous-endianness clusters, should Spark
at least always support homogeneous-endianness clusters?
I ask because I just noticed
https://issues.apache.org/jira/browse/SPARK-12785, which appears to
introduce a new feature designed for little-endian only.
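To illustrate why a little-endian-only feature is a concern, here is a minimal sketch (the class and method names are mine, not from SPARK-12785) showing how the same int is laid out differently under each byte order via the standard java.nio API:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class EndianCheck {
    // Serialise 0x01020304 under the given byte order and return the first
    // byte of the resulting layout.
    static byte firstByte(ByteOrder order) {
        ByteBuffer buf = ByteBuffer.allocate(4).order(order);
        buf.putInt(0x01020304);
        return buf.get(0);
    }

    public static void main(String[] args) {
        // The order the underlying hardware uses (e.g. little-endian on x86,
        // big-endian on s390x).
        System.out.println("Native order: " + ByteOrder.nativeOrder());
        // First byte differs: 0x01 under big-endian, 0x04 under little-endian.
        System.out.printf("BE first byte: 0x%02x%n", firstByte(ByteOrder.BIG_ENDIAN));
        System.out.printf("LE first byte: 0x%02x%n", firstByte(ByteOrder.LITTLE_ENDIAN));
    }
}
```

Code that reads raw bytes written on a machine with the other native order will see the bytes in the opposite arrangement, which is exactly the hazard for off-heap or serialised data on a mixed- (or big-) endian platform.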
Considering Spark 2.x will run for two years, would moving up to Scala 2.12
(pencilled in for Jan 2016) make any sense? Although that would then
require Java 8.
So it appears the tests fail because of an SSLHandshakeException.
Tracing the failure I see:
3,0001,Using SSLEngineImpl.
3,0001,Is initial handshake: true
3,0001,Ignoring unsupported cipher suite: SSL_RSA_WITH_DES_CBC_SHA for TLSv1.2
3,0001,No available cipher suite for TLSv1.2
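For anyone wanting to reproduce this outside the test suite, a small sketch (using only the standard JSSE API; the class name is mine) that lists which cipher suites the running JDK actually enables for TLSv1.2. The trace above says the IBM JDK rejected the one suite the test configured (SSL_RSA_WITH_DES_CBC_SHA), leaving no usable suite, hence the SSLHandshakeException:

```java
import javax.net.ssl.SSLContext;
import javax.net.ssl.SSLEngine;
import java.util.Arrays;

public class CipherCheck {
    // True if the weak legacy suite named in the trace is enabled by default
    // in this JDK's TLSv1.2 provider.
    static boolean desSuiteEnabled() throws Exception {
        SSLContext ctx = SSLContext.getInstance("TLSv1.2");
        ctx.init(null, null, null); // default key and trust managers
        SSLEngine engine = ctx.createSSLEngine();
        return Arrays.asList(engine.getEnabledCipherSuites())
                     .contains("SSL_RSA_WITH_DES_CBC_SHA");
    }

    public static void main(String[] args) throws Exception {
        SSLContext ctx = SSLContext.getInstance("TLSv1.2");
        ctx.init(null, null, null);
        SSLEngine engine = ctx.createSSLEngine();
        System.out.println("Enabled TLSv1.2 suites:");
        for (String suite : engine.getEnabledCipherSuites()) {
            System.out.println("  " + suite);
        }
        System.out.println("DES suite enabled: " + desSuiteEnabled());
    }
}
```

On modern JDKs (IBM and OpenJDK alike) single-DES suites are disabled or absent entirely, so a test that hard-codes that suite will find no common cipher and the handshake fails.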
N.B. I did notice some test failures when I ran a quick test on the pull
request (not sure if they are related; I haven't looked at the cause in any
detail).
Failed tests:
SslChunkFetchIntegrationSuite>ChunkFetchIntegrationSuite.fetchBothChunks:201
expected:<[]> but was:<[0, 1]>
Searching shows several people hit this same NPE at AppClient.scala line 160
(perhaps because appId was null; could the application have been stopped
before it registered?)
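If the null-appId theory is right, the fix is to guard the dereference for an id that is only set once the master acknowledges registration. A minimal defensive sketch (hypothetical names throughout; this is not the actual AppClient code):

```java
import java.util.Optional;

public class AppClientGuard {
    // Guard the dereference: if the application was stopped before it
    // registered, appId is still null, and any direct use of it would NPE.
    static String describe(String appId) {
        return Optional.ofNullable(appId)
                       .map(id -> "registered as " + id)
                       .orElse("not yet registered");
    }

    public static void main(String[] args) {
        // Before the master has acknowledged registration, the id is null.
        System.out.println(describe(null));
        // After registration the id is available.
        System.out.println(describe("app-20160301"));
    }
}
```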
I've had success building with Maven (3.3.3) with:
IntelliJ 14.1.5
Scala 2.10.4
OpenJDK 7 (1.7.0_79)
What OS/platform are you on?
(e.g. https://issues.apache.org/jira/browse/SPARK-7973) may be a result of this Scala issue.
I am new to the Spark community. Is there a preferred way to track the fact that the Spark test case CliSuite has a dependency on the above Scala issue?
Tim Preece