[ 
https://issues.apache.org/jira/browse/SPARK-10909?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14942994#comment-14942994
 ] 

Kostas papageorgopoulos commented on SPARK-10909:
-------------------------------------------------

I will test it at the first opportunity and let you know whether the patch works OK.

> Spark sql jdbc fails for Oracle NUMBER type columns
> ---------------------------------------------------
>
>                 Key: SPARK-10909
>                 URL: https://issues.apache.org/jira/browse/SPARK-10909
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.5.1
>         Environment: Dev
>            Reporter: Kostas papageorgopoulos
>            Priority: Minor
>              Labels: jdbc, newbie, sql
>
> When using Spark SQL to connect to Oracle and run a Spark SQL query, I get 
> the following exception: "requirement failed: Overflowed precision". This is 
> triggered when the dbtable definition includes an Oracle NUMBER column.
> {code}
>    SQLContext sqlContext = new SQLContext(sc);
>         Map<String, String> options = new HashMap<>();
>         options.put("driver", "oracle.jdbc.OracleDriver");
>         options.put("user", "USER");
>         options.put("password", "PASS");
>         options.put("url", "ORACLE CONNECTION URL");
>         options.put("dbtable", "(select VARCHAR_COLUMN 
> ,TIMESTAMP_COLUMN,NUMBER_COLUMN from lsc_subscription_profiles)");
>         DataFrame jdbcDF = 
> sqlContext.read().format("jdbc").options(options).load();
>         jdbcDF.toJavaRDD().saveAsTextFile("hdfs://hdfshost:8020" + 
> "/path/to/write.bz2", BZip2Codec.class);
> {code}
> using driver 
> {code}
>         <dependency>
>             <groupId>com.oracle</groupId>
>             <artifactId>ojdbc6</artifactId>
>             <version>11.2.0.3.0</version>
>             <!--<scope>runtime</scope>-->
>         </dependency>
> {code}
> Using Sun Java JDK 1.8.0_51 along with Spring 4.
> The classpath of the JUnit run is 
> {code}
> /home/kostas/dev2/tools/jdk1.8.0_51/bin/java 
> -agentlib:jdwp=transport=dt_socket,address=127.0.0.1:42901,suspend=y,server=n 
> -ea -Duser.timezone=Africa/Cairo -Dfile.encoding=UTF-8 -classpath 
> /home/kostas/dev2/tools/idea-IU-141.178.9/lib/idea_rt.jar:/home/kostas/dev2/tools/idea-IU-141.178.9/plugins/junit/lib/junit-rt.jar:/home/kostas/dev2/tools/jdk1.8.0_51/jre/lib/jfxswt.jar:/home/kostas/dev2/tools/jdk1.8.0_51/jre/lib/deploy.jar:/home/kostas/dev2/tools/jdk1.8.0_51/jre/lib/charsets.jar:/home/kostas/dev2/tools/jdk1.8.0_51/jre/lib/rt.jar:/home/kostas/dev2/tools/jdk1.8.0_51/jre/lib/javaws.jar:/home/kostas/dev2/tools/jdk1.8.0_51/jre/lib/jce.jar:/home/kostas/dev2/tools/jdk1.8.0_51/jre/lib/resources.jar:/home/kostas/dev2/tools/jdk1.8.0_51/jre/lib/plugin.jar:/home/kostas/dev2/tools/jdk1.8.0_51/jre/lib/jfr.jar:/home/kostas/dev2/tools/jdk1.8.0_51/jre/lib/jsse.jar:/home/kostas/dev2/tools/jdk1.8.0_51/jre/lib/management-agent.jar:/home/kostas/dev2/tools/jdk1.8.0_51/jre/lib/ext/sunjce_provider.jar:/home/kostas/dev2/tools/jdk1.8.0_51/jre/lib/ext/sunec.jar:/home/kostas/dev2/tools/jdk1.8.0_51/jre/lib/ext/localedata.jar:/home/kostas/dev2/tools/jdk1.8.0_51/jre/lib/ext/jfxrt.jar:/home/kostas/dev2/tools/jdk1.8.0_51/jre/lib/ext/cldrdata.jar:/home/kostas/dev2/tools/jdk1.8.0_51/jre/lib/ext/nashorn.jar:/home/kostas/dev2/tools/jdk1.8.0_51/jre/lib/ext/zipfs.jar:/home/kostas/dev2/tools/jdk1.8.0_51/jre/lib/ext/sunpkcs11.jar:/home/kostas/dev2/tools/jdk1.8.0_51/jre/lib/ext/dnsns.jar:/home/kostas/dev2/projects/atlas_reporting/atlas-core/target/test-classes:/home/kostas/dev2/projects/atlas_reporting/atlas-core/target/classes:/home/kostas/.m2/repository/org/apache/spark/spark-core_2.10/1.5.0/spark-core_2.10-1.5.0.jar:/home/kostas/.m2/repository/org/apache/avro/avro-mapred/1.7.7/avro-mapred-1.7.7-hadoop2.jar:/home/kostas/.m2/repository/org/apache/avro/avro-ipc/1.7.7/avro-ipc-1.7.7.jar:/home/kostas/.m2/repository/org/apache/avro/avro-ipc/1.7.7/avro-ipc-1.7.7-tests.jar:/home/kostas/.m2/repository/com/twitter/chill_2.10/0.5.0/chill_2.10-0.5.0.jar:/home/kostas/.m2/repository/com/esotericsoftware/kryo/kryo/2.21/kryo-2.21.jar:/home/kostas/.m2/repository/com/esotericsoftware/reflectasm/reflect
asm/1.07/reflectasm-1.07-shaded.jar:/home/kostas/.m2/repository/com/esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar:/home/kostas/.m2/repository/org/objenesis/objenesis/1.2/objenesis-1.2.jar:/home/kostas/.m2/repository/com/twitter/chill-java/0.5.0/chill-java-0.5.0.jar:/home/kostas/.m2/repository/org/apache/spark/spark-launcher_2.10/1.5.0/spark-launcher_2.10-1.5.0.jar:/home/kostas/.m2/repository/org/apache/spark/spark-network-common_2.10/1.5.0/spark-network-common_2.10-1.5.0.jar:/home/kostas/.m2/repository/org/apache/spark/spark-network-shuffle_2.10/1.5.0/spark-network-shuffle_2.10-1.5.0.jar:/home/kostas/.m2/repository/org/apache/spark/spark-unsafe_2.10/1.5.0/spark-unsafe_2.10-1.5.0.jar:/home/kostas/.m2/repository/net/java/dev/jets3t/jets3t/0.7.1/jets3t-0.7.1.jar:/home/kostas/.m2/repository/org/apache/curator/curator-recipes/2.4.0/curator-recipes-2.4.0.jar:/home/kostas/.m2/repository/org/apache/curator/curator-framework/2.4.0/curator-framework-2.4.0.jar:/home/kostas/.m2/repository/org/eclipse/jetty/orbit/javax.servlet/3.0.0.v201112011016/javax.servlet-3.0.0.v201112011016.jar:/home/kostas/.m2/repository/org/apache/commons/commons-lang3/3.3.2/commons-lang3-3.3.2.jar:/home/kostas/.m2/repository/org/apache/commons/commons-math3/3.4.1/commons-math3-3.4.1.jar:/home/kostas/.m2/repository/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar:/home/kostas/.m2/repository/org/slf4j/slf4j-api/1.7.10/slf4j-api-1.7.10.jar:/home/kostas/.m2/repository/com/ning/compress-lzf/1.0.3/compress-lzf-1.0.3.jar:/home/kostas/.m2/repository/org/xerial/snappy/snappy-java/1.1.1.7/snappy-java-1.1.1.7.jar:/home/kostas/.m2/repository/net/jpountz/lz4/lz4/1.3.0/lz4-1.3.0.jar:/home/kostas/.m2/repository/org/roaringbitmap/RoaringBitmap/0.4.5/RoaringBitmap-0.4.5.jar:/home/kostas/.m2/repository/commons-net/commons-net/2.2/commons-net-2.2.jar:/home/kostas/.m2/repository/com/typesafe/akka/akka-remote_2.10/2.3.11/akka-remote_2.10-2.3.11.jar:/home/kostas/.m2/repository/com/typesafe/akka/akka-actor_2.10/2.3.
11/akka-actor_2.10-2.3.11.jar:/home/kostas/.m2/repository/com/typesafe/config/1.2.1/config-1.2.1.jar:/home/kostas/.m2/repository/io/netty/netty/3.8.0.Final/netty-3.8.0.Final.jar:/home/kostas/.m2/repository/org/uncommons/maths/uncommons-maths/1.2.2a/uncommons-maths-1.2.2a.jar:/home/kostas/.m2/repository/com/typesafe/akka/akka-slf4j_2.10/2.3.11/akka-slf4j_2.10-2.3.11.jar:/home/kostas/.m2/repository/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar:/home/kostas/.m2/repository/org/json4s/json4s-jackson_2.10/3.2.10/json4s-jackson_2.10-3.2.10.jar:/home/kostas/.m2/repository/org/json4s/json4s-core_2.10/3.2.10/json4s-core_2.10-3.2.10.jar:/home/kostas/.m2/repository/org/json4s/json4s-ast_2.10/3.2.10/json4s-ast_2.10-3.2.10.jar:/home/kostas/.m2/repository/org/scala-lang/scalap/2.10.0/scalap-2.10.0.jar:/home/kostas/.m2/repository/org/scala-lang/scala-compiler/2.10.0/scala-compiler-2.10.0.jar:/home/kostas/.m2/repository/com/sun/jersey/jersey-server/1.9/jersey-server-1.9.jar:/home/kostas/.m2/repository/asm/asm/3.1/asm-3.1.jar:/home/kostas/.m2/repository/com/sun/jersey/jersey-core/1.9/jersey-core-1.9.jar:/home/kostas/.m2/repository/org/apache/mesos/mesos/0.21.1/mesos-0.21.1-shaded-protobuf.jar:/home/kostas/.m2/repository/io/netty/netty-all/4.0.29.Final/netty-all-4.0.29.Final.jar:/home/kostas/.m2/repository/com/clearspring/analytics/stream/2.7.0/stream-2.7.0.jar:/home/kostas/.m2/repository/io/dropwizard/metrics/metrics-core/3.1.2/metrics-core-3.1.2.jar:/home/kostas/.m2/repository/io/dropwizard/metrics/metrics-jvm/3.1.2/metrics-jvm-3.1.2.jar:/home/kostas/.m2/repository/io/dropwizard/metrics/metrics-json/3.1.2/metrics-json-3.1.2.jar:/home/kostas/.m2/repository/io/dropwizard/metrics/metrics-graphite/3.1.2/metrics-graphite-3.1.2.jar:/home/kostas/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.4.4/jackson-databind-2.4.4.jar:/home/kostas/.m2/repository/com/fasterxml/jackson/core/jackson-annotations/2.4.0/jackson-annotations-2.4.0.jar:/home/kostas/.m2/repository/co
m/fasterxml/jackson/core/jackson-core/2.4.4/jackson-core-2.4.4.jar:/home/kostas/.m2/repository/com/fasterxml/jackson/module/jackson-module-scala_2.10/2.4.4/jackson-module-scala_2.10-2.4.4.jar:/home/kostas/.m2/repository/org/scala-lang/scala-reflect/2.10.4/scala-reflect-2.10.4.jar:/home/kostas/.m2/repository/com/thoughtworks/paranamer/paranamer/2.6/paranamer-2.6.jar:/home/kostas/.m2/repository/org/apache/ivy/ivy/2.4.0/ivy-2.4.0.jar:/home/kostas/.m2/repository/oro/oro/2.0.8/oro-2.0.8.jar:/home/kostas/.m2/repository/org/tachyonproject/tachyon-client/0.7.1/tachyon-client-0.7.1.jar:/home/kostas/.m2/repository/org/tachyonproject/tachyon-underfs-hdfs/0.7.1/tachyon-underfs-hdfs-0.7.1.jar:/home/kostas/.m2/repository/org/tachyonproject/tachyon-underfs-local/0.7.1/tachyon-underfs-local-0.7.1.jar:/home/kostas/.m2/repository/net/razorvine/pyrolite/4.4/pyrolite-4.4.jar:/home/kostas/.m2/repository/net/sf/py4j/py4j/0.8.2.1/py4j-0.8.2.1.jar:/home/kostas/.m2/repository/org/spark-project/spark/unused/1.0.0/unused-1.0.0.jar:/home/kostas/.m2/repository/org/apache/spark/spark-sql_2.10/1.5.0/spark-sql_2.10-1.5.0.jar:/home/kostas/.m2/repository/org/apache/spark/spark-catalyst_2.10/1.5.0/spark-catalyst_2.10-1.5.0.jar:/home/kostas/.m2/repository/org/codehaus/janino/janino/2.7.8/janino-2.7.8.jar:/home/kostas/.m2/repository/org/codehaus/janino/commons-compiler/2.7.8/commons-compiler-2.7.8.jar:/home/kostas/.m2/repository/org/apache/parquet/parquet-column/1.7.0/parquet-column-1.7.0.jar:/home/kostas/.m2/repository/org/apache/parquet/parquet-common/1.7.0/parquet-common-1.7.0.jar:/home/kostas/.m2/repository/org/apache/parquet/parquet-encoding/1.7.0/parquet-encoding-1.7.0.jar:/home/kostas/.m2/repository/org/apache/parquet/parquet-generator/1.7.0/parquet-generator-1.7.0.jar:/home/kostas/.m2/repository/org/apache/parquet/parquet-hadoop/1.7.0/parquet-hadoop-1.7.0.jar:/home/kostas/.m2/repository/org/apache/parquet/parquet-format/2.3.0-incubating/parquet-format-2.3.0-incubating.jar:/home/kostas/.m2/repos
itory/org/apache/parquet/parquet-jackson/1.7.0/parquet-jackson-1.7.0.jar:/home/kostas/.m2/repository/commons-logging/commons-logging/1.2/commons-logging-1.2.jar:/home/kostas/.m2/repository/org/apache/commons/commons-exec/1.3/commons-exec-1.3.jar:/home/kostas/.m2/repository/org/apache/hadoop/hadoop-common/2.7.1/hadoop-common-2.7.1.jar:/home/kostas/.m2/repository/org/apache/hadoop/hadoop-annotations/2.7.1/hadoop-annotations-2.7.1.jar:/home/kostas/dev2/tools/jdk1.8.0_51/lib/tools.jar:/home/kostas/.m2/repository/com/google/guava/guava/11.0.2/guava-11.0.2.jar:/home/kostas/.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar:/home/kostas/.m2/repository/xmlenc/xmlenc/0.52/xmlenc-0.52.jar:/home/kostas/.m2/repository/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar:/home/kostas/.m2/repository/commons-codec/commons-codec/1.4/commons-codec-1.4.jar:/home/kostas/.m2/repository/commons-io/commons-io/2.4/commons-io-2.4.jar:/home/kostas/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar:/home/kostas/.m2/repository/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar:/home/kostas/.m2/repository/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar:/home/kostas/.m2/repository/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar:/home/kostas/.m2/repository/javax/servlet/jsp/jsp-api/2.1/jsp-api-2.1.jar:/home/kostas/.m2/repository/com/sun/jersey/jersey-json/1.9/jersey-json-1.9.jar:/home/kostas/.m2/repository/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar:/home/kostas/.m2/repository/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar:/home/kostas/.m2/repository/org/codehaus/jackson/jackson-jaxrs/1.8.3/jackson-jaxrs-1.8.3.jar:/home/kostas/.m2/repository/org/codehaus/jackson/jackson-xc/1.8.3/jackson-xc-1.8.3.jar:/home/kostas/.m2/repository/commons-lang/commons-lang/2.6/commons-lang-2.6.jar:/home/kostas/.m2/repository/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar:/home/kostas/.m2/repo
sitory/commons-digester/commons-digester/1.8/commons-digester-1.8.jar:/home/kostas/.m2/repository/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar:/home/kostas/.m2/repository/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar:/home/kostas/.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar:/home/kostas/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar:/home/kostas/.m2/repository/org/apache/avro/avro/1.7.4/avro-1.7.4.jar:/home/kostas/.m2/repository/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar:/home/kostas/.m2/repository/com/google/code/gson/gson/2.2.4/gson-2.2.4.jar:/home/kostas/.m2/repository/org/apache/hadoop/hadoop-auth/2.7.1/hadoop-auth-2.7.1.jar:/home/kostas/.m2/repository/org/apache/httpcomponents/httpclient/4.2.5/httpclient-4.2.5.jar:/home/kostas/.m2/repository/org/apache/httpcomponents/httpcore/4.2.4/httpcore-4.2.4.jar:/home/kostas/.m2/repository/org/apache/directory/server/apacheds-kerberos-codec/2.0.0-M15/apacheds-kerberos-codec-2.0.0-M15.jar:/home/kostas/.m2/repository/org/apache/directory/server/apacheds-i18n/2.0.0-M15/apacheds-i18n-2.0.0-M15.jar:/home/kostas/.m2/repository/org/apache/directory/api/api-asn1-api/1.0.0-M20/api-asn1-api-1.0.0-M20.jar:/home/kostas/.m2/repository/org/apache/directory/api/api-util/1.0.0-M20/api-util-1.0.0-M20.jar:/home/kostas/.m2/repository/com/jcraft/jsch/0.1.42/jsch-0.1.42.jar:/home/kostas/.m2/repository/org/apache/curator/curator-client/2.7.1/curator-client-2.7.1.jar:/home/kostas/.m2/repository/org/apache/htrace/htrace-core/3.1.0-incubating/htrace-core-3.1.0-incubating.jar:/home/kostas/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6.jar:/home/kostas/.m2/repository/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar:/home/kostas/.m2/repository/org/tukaani/xz/1.0/xz-1.0.jar:/home/kostas/.m2/repository/org/apache/hadoop/hadoop-hdfs/2.7.1/hadoop-hdfs-2.7.1.jar:/home/ko
stas/.m2/repository/commons-daemon/commons-daemon/1.0.13/commons-daemon-1.0.13.jar:/home/kostas/.m2/repository/xerces/xercesImpl/2.9.1/xercesImpl-2.9.1.jar:/home/kostas/.m2/repository/xml-apis/xml-apis/1.3.04/xml-apis-1.3.04.jar:/home/kostas/.m2/repository/org/fusesource/leveldbjni/leveldbjni-all/1.8/leveldbjni-all-1.8.jar:/home/kostas/.m2/repository/org/apache/hadoop/hadoop-client/2.7.1/hadoop-client-2.7.1.jar:/home/kostas/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-app/2.7.1/hadoop-mapreduce-client-app-2.7.1.jar:/home/kostas/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-common/2.7.1/hadoop-mapreduce-client-common-2.7.1.jar:/home/kostas/.m2/repository/org/apache/hadoop/hadoop-yarn-client/2.7.1/hadoop-yarn-client-2.7.1.jar:/home/kostas/.m2/repository/org/apache/hadoop/hadoop-yarn-server-common/2.7.1/hadoop-yarn-server-common-2.7.1.jar:/home/kostas/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-shuffle/2.7.1/hadoop-mapreduce-client-shuffle-2.7.1.jar:/home/kostas/.m2/repository/org/apache/hadoop/hadoop-yarn-api/2.7.1/hadoop-yarn-api-2.7.1.jar:/home/kostas/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-core/2.7.1/hadoop-mapreduce-client-core-2.7.1.jar:/home/kostas/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-jobclient/2.7.1/hadoop-mapreduce-client-jobclient-2.7.1.jar:/home/kostas/.m2/repository/org/elasticsearch/elasticsearch/1.7.1/elasticsearch-1.7.1.jar:/home/kostas/.m2/repository/org/apache/lucene/lucene-core/4.10.4/lucene-core-4.10.4.jar:/home/kostas/.m2/repository/org/apache/lucene/lucene-analyzers-common/4.10.4/lucene-analyzers-common-4.10.4.jar:/home/kostas/.m2/repository/org/apache/lucene/lucene-queries/4.10.4/lucene-queries-4.10.4.jar:/home/kostas/.m2/repository/org/apache/lucene/lucene-memory/4.10.4/lucene-memory-4.10.4.jar:/home/kostas/.m2/repository/org/apache/lucene/lucene-highlighter/4.10.4/lucene-highlighter-4.10.4.jar:/home/kostas/.m2/repository/org/apache/lucene/lucene-queryparser/4.10.4/lucene-qu
eryparser-4.10.4.jar:/home/kostas/.m2/repository/org/apache/lucene/lucene-sandbox/4.10.4/lucene-sandbox-4.10.4.jar:/home/kostas/.m2/repository/org/apache/lucene/lucene-suggest/4.10.4/lucene-suggest-4.10.4.jar:/home/kostas/.m2/repository/org/apache/lucene/lucene-misc/4.10.4/lucene-misc-4.10.4.jar:/home/kostas/.m2/repository/org/apache/lucene/lucene-join/4.10.4/lucene-join-4.10.4.jar:/home/kostas/.m2/repository/org/apache/lucene/lucene-grouping/4.10.4/lucene-grouping-4.10.4.jar:/home/kostas/.m2/repository/org/apache/lucene/lucene-spatial/4.10.4/lucene-spatial-4.10.4.jar:/home/kostas/.m2/repository/com/spatial4j/spatial4j/0.4.1/spatial4j-0.4.1.jar:/home/kostas/.m2/repository/org/antlr/antlr-runtime/3.5/antlr-runtime-3.5.jar:/home/kostas/.m2/repository/org/ow2/asm/asm/4.1/asm-4.1.jar:/home/kostas/.m2/repository/org/ow2/asm/asm-commons/4.1/asm-commons-4.1.jar:/home/kostas/.m2/repository/org/yaml/snakeyaml/1.12/snakeyaml-1.12.jar:/home/kostas/.m2/repository/org/elasticsearch/elasticsearch-hadoop/2.2.0-m1/elasticsearch-hadoop-2.2.0-m1.jar:/home/kostas/.m2/repository/cascading/cascading-local/2.6.3/cascading-local-2.6.3.jar:/home/kostas/.m2/repository/cascading/cascading-core/2.6.3/cascading-core-2.6.3.jar:/home/kostas/.m2/repository/riffle/riffle/0.1-dev/riffle-0.1-dev.jar:/home/kostas/.m2/repository/thirdparty/jgrapht-jdk1.6/0.8.1/jgrapht-jdk1.6-0.8.1.jar:/home/kostas/.m2/repository/cascading/cascading-hadoop/2.6.3/cascading-hadoop-2.6.3.jar:/home/kostas/.m2/repository/joda-time/joda-time/2.8.2/joda-time-2.8.2.jar:/home/kostas/.m2/repository/org/apache/commons/commons-dbcp2/2.1.1/commons-dbcp2-2.1.1.jar:/home/kostas/.m2/repository/org/apache/commons/commons-pool2/2.4.2/commons-pool2-2.4.2.jar:/home/kostas/.m2/repository/org/json/json/20141113/json-20141113.jar:/home/kostas/.m2/repository/javax/servlet/javax.servlet-api/3.0.1/javax.servlet-api-3.0.1.jar:/home/kostas/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar:/home/kostas/.m2
/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar:/home/kostas/.m2/repository/junit/junit/4.12/junit-4.12.jar:/home/kostas/.m2/repository/org/hamcrest/hamcrest-core/1.3/hamcrest-core-1.3.jar:/home/kostas/.m2/repository/org/springframework/spring-core/4.2.1.RELEASE/spring-core-4.2.1.RELEASE.jar:/home/kostas/.m2/repository/org/springframework/spring-context/4.2.1.RELEASE/spring-context-4.2.1.RELEASE.jar:/home/kostas/.m2/repository/org/springframework/spring-aop/4.2.1.RELEASE/spring-aop-4.2.1.RELEASE.jar:/home/kostas/.m2/repository/org/springframework/spring-beans/4.2.1.RELEASE/spring-beans-4.2.1.RELEASE.jar:/home/kostas/.m2/repository/org/springframework/spring-expression/4.2.1.RELEASE/spring-expression-4.2.1.RELEASE.jar:/home/kostas/.m2/repository/org/springframework/spring-test/4.2.1.RELEASE/spring-test-4.2.1.RELEASE.jar:/home/kostas/.m2/repository/org/springframework/security/spring-security-web/4.0.2.RELEASE/spring-security-web-4.0.2.RELEASE.jar:/home/kostas/.m2/repository/aopalliance/aopalliance/1.0/aopalliance-1.0.jar:/home/kostas/.m2/repository/org/springframework/security/spring-security-core/4.0.2.RELEASE/spring-security-core-4.0.2.RELEASE.jar:/home/kostas/.m2/repository/org/springframework/spring-web/4.1.6.RELEASE/spring-web-4.1.6.RELEASE.jar:/home/kostas/.m2/repository/org/springframework/spring-context-support/4.2.1.RELEASE/spring-context-support-4.2.1.RELEASE.jar:/home/kostas/.m2/repository/org/springframework/spring-jdbc/4.2.1.RELEASE/spring-jdbc-4.2.1.RELEASE.jar:/home/kostas/.m2/repository/org/springframework/spring-tx/4.2.1.RELEASE/spring-tx-4.2.1.RELEASE.jar:/home/kostas/.m2/repository/org/springframework/data/spring-data-hadoop/2.2.0.RELEASE/spring-data-hadoop-2.2.0.RELEASE.jar:/home/kostas/.m2/repository/org/springframework/data/spring-data-hadoop-hbase/2.2.0.RELEASE/spring-data-hadoop-hbase-2.2.0.RELEASE.jar:/home/kostas/.m2/repository/org/springframework/data/spring-data-hadoop-pig/2.2.0.RELEASE/spring-data-hadoop-pig-2.2.0.RELEASE.jar:/home/kostas
/.m2/repository/org/springframework/batch/spring-batch-core/3.0.4.RELEASE/spring-batch-core-3.0.4.RELEASE.jar:/home/kostas/.m2/repository/com/ibm/jbatch/com.ibm.jbatch-tck-spi/1.0/com.ibm.jbatch-tck-spi-1.0.jar:/home/kostas/.m2/repository/javax/batch/javax.batch-api/1.0/javax.batch-api-1.0.jar:/home/kostas/.m2/repository/com/thoughtworks/xstream/xstream/1.4.7/xstream-1.4.7.jar:/home/kostas/.m2/repository/xmlpull/xmlpull/1.1.3.1/xmlpull-1.1.3.1.jar:/home/kostas/.m2/repository/xpp3/xpp3_min/1.1.4c/xpp3_min-1.1.4c.jar:/home/kostas/.m2/repository/org/springframework/batch/spring-batch-infrastructure/3.0.4.RELEASE/spring-batch-infrastructure-3.0.4.RELEASE.jar:/home/kostas/.m2/repository/org/springframework/retry/spring-retry/1.1.0.RELEASE/spring-retry-1.1.0.RELEASE.jar:/home/kostas/.m2/repository/org/springframework/data/spring-data-hadoop-batch/2.2.0.RELEASE/spring-data-hadoop-batch-2.2.0.RELEASE.jar:/home/kostas/.m2/repository/org/springframework/data/spring-data-hadoop-hive/2.2.0.RELEASE/spring-data-hadoop-hive-2.2.0.RELEASE.jar:/home/kostas/.m2/repository/org/springframework/data/spring-data-hadoop-core/2.2.0.RELEASE/spring-data-hadoop-core-2.2.0.RELEASE.jar:/home/kostas/.m2/repository/org/springframework/data/spring-data-hadoop-store/2.2.0.RELEASE/spring-data-hadoop-store-2.2.0.RELEASE.jar:/home/kostas/.m2/repository/org/apache/hadoop/hadoop-yarn-common/2.6.0/hadoop-yarn-common-2.6.0.jar:/home/kostas/.m2/repository/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar:/home/kostas/.m2/repository/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar:/home/kostas/.m2/repository/javax/activation/activation/1.1/activation-1.1.jar:/home/kostas/.m2/repository/com/sun/jersey/jersey-client/1.9/jersey-client-1.9.jar:/home/kostas/.m2/repository/com/google/inject/extensions/guice-servlet/3.0/guice-servlet-3.0.jar:/home/kostas/.m2/repository/com/google/inject/guice/3.0/guice-3.0.jar:/home/kostas/.m2/repository/javax/inject/javax.inject/1/javax.inject-1.jar:/home/kostas/.m2/repository/c
om/sun/jersey/contribs/jersey-guice/1.9/jersey-guice-1.9.jar:/home/kostas/.m2/repository/org/apache/hadoop/hadoop-distcp/2.6.0/hadoop-distcp-2.6.0.jar:/home/kostas/.m2/repository/org/springframework/spring-messaging/4.1.6.RELEASE/spring-messaging-4.1.6.RELEASE.jar:/home/kostas/.m2/repository/org/kitesdk/kite-data-core/1.0.0/kite-data-core-1.0.0.jar:/home/kostas/.m2/repository/org/kitesdk/kite-hadoop-compatibility/1.0.0/kite-hadoop-compatibility-1.0.0.jar:/home/kostas/.m2/repository/com/twitter/parquet-avro/1.4.1/parquet-avro-1.4.1.jar:/home/kostas/.m2/repository/com/twitter/parquet-column/1.4.1/parquet-column-1.4.1.jar:/home/kostas/.m2/repository/com/twitter/parquet-common/1.4.1/parquet-common-1.4.1.jar:/home/kostas/.m2/repository/com/twitter/parquet-encoding/1.4.1/parquet-encoding-1.4.1.jar:/home/kostas/.m2/repository/com/twitter/parquet-generator/1.4.1/parquet-generator-1.4.1.jar:/home/kostas/.m2/repository/com/twitter/parquet-hadoop/1.4.1/parquet-hadoop-1.4.1.jar:/home/kostas/.m2/repository/com/twitter/parquet-jackson/1.4.1/parquet-jackson-1.4.1.jar:/home/kostas/.m2/repository/com/twitter/parquet-format/2.0.0/parquet-format-2.0.0.jar:/home/kostas/.m2/repository/net/sf/opencsv/opencsv/2.3/opencsv-2.3.jar:/home/kostas/.m2/repository/org/apache/commons/commons-jexl/2.1.1/commons-jexl-2.1.1.jar:/home/kostas/.m2/repository/org/apache/hadoop/hadoop-streaming/2.6.0/hadoop-streaming-2.6.0.jar:/home/kostas/.m2/repository/org/mybatis/mybatis-spring/1.2.3/mybatis-spring-1.2.3.jar:/home/kostas/.m2/repository/org/mybatis/mybatis/3.3.0/mybatis-3.3.0.jar:/home/kostas/.m2/repository/org/mvel/mvel2/2.2.6.Final/mvel2-2.2.6.Final.jar:/home/kostas/.m2/repository/org/aspectj/aspectjrt/1.8.7/aspectjrt-1.8.7.jar:/home/kostas/.m2/repository/org/aspectj/aspectjweaver/1.8.7/aspectjweaver-1.8.7.jar:/home/kostas/.m2/repository/com/zaxxer/HikariCP/2.4.1/HikariCP-2.4.1.jar:/home/kostas/.m2/repository/com/oracle/ojdbc6/11.2.0.3.0/ojdbc6-11.2.0.3.0.jar:/home/kostas/.m2/repository/net/sf/ehcache
/ehcache/2.10.0/ehcache-2.10.0.jar:/home/kostas/.m2/repository/org/quartz-scheduler/quartz/2.2.1/quartz-2.2.1.jar:/home/kostas/.m2/repository/c3p0/c3p0/0.9.1.1/c3p0-0.9.1.1.jar
>  com.intellij.rt.execution.junit.JUnitStarter -ideVersion5 
> {code}
>  
> The stacktrace 
> {code}
> org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in 
> stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 
> (TID 3, 10.130.35.52): java.lang.IllegalArgumentException: requirement 
> failed: Overflowed precision
>       at scala.Predef$.require(Predef.scala:233)
>       at org.apache.spark.sql.types.Decimal.set(Decimal.scala:111)
>       at org.apache.spark.sql.types.Decimal$.apply(Decimal.scala:335)
>       at 
> org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anon$1.getNext(JDBCRDD.scala:406)
>       at 
> org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anon$1.hasNext(JDBCRDD.scala:472)
>       at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
>       at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
>       at 
> org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$13$$anonfun$apply$6.apply$mcV$sp(PairRDDFunctions.scala:1108)
>       at 
> org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$13$$anonfun$apply$6.apply(PairRDDFunctions.scala:1108)
>       at 
> org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$13$$anonfun$apply$6.apply(PairRDDFunctions.scala:1108)
>       at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1206)
>       at 
> org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$13.apply(PairRDDFunctions.scala:1116)
>       at 
> org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$13.apply(PairRDDFunctions.scala:1095)
>       at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
>       at org.apache.spark.scheduler.Task.run(Task.scala:88)
>       at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>       at java.lang.Thread.run(Thread.java:745)
> Driver stacktrace:
>       at 
> org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1280)
>       at 
> org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1268)
>       at 
> org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1267)
>       at 
> scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
>       at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
>       at 
> org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1267)
>       at 
> org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
>       at 
> org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
>       at scala.Option.foreach(Option.scala:236)
>       at 
> org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:697)
>       at 
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1493)
>       at 
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1455)
>       at 
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1444)
>       at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
>       at 
> org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:567)
>       at org.apache.spark.SparkContext.runJob(SparkContext.scala:1813)
>       at org.apache.spark.SparkContext.runJob(SparkContext.scala:1826)
>       at org.apache.spark.SparkContext.runJob(SparkContext.scala:1903)
>       at 
> org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1.apply$mcV$sp(PairRDDFunctions.scala:1124)
>       at 
> org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1.apply(PairRDDFunctions.scala:1065)
>       at 
> org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1.apply(PairRDDFunctions.scala:1065)
>       at 
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
>       at 
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
>       at org.apache.spark.rdd.RDD.withScope(RDD.scala:306)
>       at 
> org.apache.spark.rdd.PairRDDFunctions.saveAsHadoopDataset(PairRDDFunctions.scala:1065)
>       at 
> org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$4.apply$mcV$sp(PairRDDFunctions.scala:989)
>       at 
> org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$4.apply(PairRDDFunctions.scala:965)
>       at 
> org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$4.apply(PairRDDFunctions.scala:965)
>       at 
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
>       at 
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
>       at org.apache.spark.rdd.RDD.withScope(RDD.scala:306)
>       at 
> org.apache.spark.rdd.PairRDDFunctions.saveAsHadoopFile(PairRDDFunctions.scala:965)
>       at 
> org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$3.apply$mcV$sp(PairRDDFunctions.scala:951)
>       at 
> org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$3.apply(PairRDDFunctions.scala:951)
>       at 
> org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$3.apply(PairRDDFunctions.scala:951)
>       at 
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
>       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
>       at org.apache.spark.rdd.RDD.withScope(RDD.scala:306)
>       at org.apache.spark.rdd.PairRDDFunctions.saveAsHadoopFile(PairRDDFunctions.scala:950)
>       at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$2.apply$mcV$sp(PairRDDFunctions.scala:909)
>       at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$2.apply(PairRDDFunctions.scala:907)
>       at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopFile$2.apply(PairRDDFunctions.scala:907)
>       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
>       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
>       at org.apache.spark.rdd.RDD.withScope(RDD.scala:306)
>       at org.apache.spark.rdd.PairRDDFunctions.saveAsHadoopFile(PairRDDFunctions.scala:907)
>       at org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$2.apply$mcV$sp(RDD.scala:1444)
>       at org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$2.apply(RDD.scala:1432)
>       at org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$2.apply(RDD.scala:1432)
>       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
>       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
>       at org.apache.spark.rdd.RDD.withScope(RDD.scala:306)
>       at org.apache.spark.rdd.RDD.saveAsTextFile(RDD.scala:1432)
>       at org.apache.spark.api.java.JavaRDDLike$class.saveAsTextFile(JavaRDDLike.scala:530)
>       at org.apache.spark.api.java.AbstractJavaRDDLike.saveAsTextFile(JavaRDDLike.scala:47)
>       at velti.tech.reporting.test.commons.spark.jdbc.TestRunSparkJdbcJob.testRunSqlDirectlyOnJDBC(TestRunSparkJdbcJob.java:56)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
>       at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>       at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
>       at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>       at org.springframework.test.context.junit4.statements.RunBeforeTestMethodCallbacks.evaluate(RunBeforeTestMethodCallbacks.java:74)
>       at org.springframework.test.context.junit4.statements.RunAfterTestMethodCallbacks.evaluate(RunAfterTestMethodCallbacks.java:85)
>       at org.springframework.test.context.junit4.statements.SpringRepeat.evaluate(SpringRepeat.java:86)
>       at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
>       at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:241)
>       at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:87)
>       at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
>       at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
>       at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
>       at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
>       at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
>       at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61)
>       at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:70)
>       at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
>       at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:180)
>       at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
>       at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:78)
>       at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:212)
>       at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:68)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
> Caused by: java.lang.IllegalArgumentException: requirement failed: Overflowed precision
>       at scala.Predef$.require(Predef.scala:233)
>       at org.apache.spark.sql.types.Decimal.set(Decimal.scala:111)
>       at org.apache.spark.sql.types.Decimal$.apply(Decimal.scala:335)
>       at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anon$1.getNext(JDBCRDD.scala:406)
>       at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anon$1.hasNext(JDBCRDD.scala:472)
>       at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
>       at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
>       at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$13$$anonfun$apply$6.apply$mcV$sp(PairRDDFunctions.scala:1108)
>       at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$13$$anonfun$apply$6.apply(PairRDDFunctions.scala:1108)
>       at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$13$$anonfun$apply$6.apply(PairRDDFunctions.scala:1108)
>       at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1206)
>       at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$13.apply(PairRDDFunctions.scala:1116)
>       at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$13.apply(PairRDDFunctions.scala:1095)
>       at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
>       at org.apache.spark.scheduler.Task.run(Task.scala:88)
>       at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
>       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>       at java.lang.Thread.run(Thread.java:745)
> {code}
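The root cause sits in the last frames of the trace: Decimal.set enforces Spark's maximum DecimalType precision of 38 digits, while an Oracle NUMBER declared without explicit precision and scale can carry more significant digits than that, so the require check fails with "Overflowed precision". A minimal, self-contained sketch of the failing check (the class and constant names here are illustrative, not Spark's actual code):

```java
import java.math.BigDecimal;

public class DecimalOverflowDemo {

    // Spark's DecimalType caps precision at 38 digits; Decimal.set rejects
    // values with more significant digits than the declared precision.
    static final int SPARK_MAX_PRECISION = 38;

    static boolean fitsSparkDecimal(BigDecimal value) {
        return value.precision() <= SPARK_MAX_PRECISION;
    }

    public static void main(String[] args) {
        // A bounded NUMBER value with 20 digits fits comfortably.
        BigDecimal bounded = new BigDecimal("12345678901234567890");

        // An unbounded Oracle NUMBER can hold up to ~40 significant digits,
        // wider than Spark's maximum DecimalType precision: 10^38 has 39 digits.
        BigDecimal unbounded = BigDecimal.TEN.pow(38);

        System.out.println(fitsSparkDecimal(bounded));   // true
        System.out.println(fitsSparkDecimal(unbounded)); // false
    }
}
```

Until a fix lands, one possible workaround is to CAST the unbounded column to an explicit NUMBER(p,s) inside the dbtable subquery, so the driver reports a precision Spark can represent.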



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
