[jira] [Commented] (SPARK-10648) Spark-SQL JDBC fails to set a default precision and scale when they are not defined in an Oracle schema.

[ https://issues.apache.org/jira/browse/SPARK-10648?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14991953#comment-14991953 ]

Apache Spark commented on SPARK-10648:
--------------------------------------

User 'yhuai' has created a pull request for this issue:
https://github.com/apache/spark/pull/9498

> Spark-SQL JDBC fails to set a default precision and scale when they are not
> defined in an Oracle schema.
>
> Key: SPARK-10648
> URL: https://issues.apache.org/jira/browse/SPARK-10648
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.5.0
> Environment: Oracle 11g, ojdbc7.jar
> Reporter: Travis Hegner
>
> Using Oracle 11g as a data source with ojdbc7.jar. When importing data into a
> Scala app, I am getting an exception "Overflowed precision". Sometimes I
> would get the exception "Unscaled value too large for precision".
> This issue likely affects older versions as well, but this was the version I
> verified it on.
> I narrowed it down to the fact that the schema detection system was trying to
> set the precision to 0, and the scale to -127.
> I have a proposed pull request to follow.

--
This message was sent by Atlassian JIRA (v6.3.4#6332)
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-10648) Spark-SQL JDBC fails to set a default precision and scale when they are not defined in an Oracle schema.

[ https://issues.apache.org/jira/browse/SPARK-10648?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14991781#comment-14991781 ]

Apache Spark commented on SPARK-10648:
--------------------------------------

User 'travishegner' has created a pull request for this issue:
https://github.com/apache/spark/pull/9495
[jira] [Commented] (SPARK-10648) Spark-SQL JDBC fails to set a default precision and scale when they are not defined in an Oracle schema.

[ https://issues.apache.org/jira/browse/SPARK-10648?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14990785#comment-14990785 ]

Yin Huai commented on SPARK-10648:
----------------------------------

https://github.com/apache/spark/pull/8780#issuecomment-145598968 and
https://github.com/apache/spark/pull/8780#issuecomment-144541760 have the workaround.
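The linked comments are not reproduced here, but workarounds of this kind typically go through Spark's public `JdbcDialects` API, which lets an application override how JDBC types are mapped to Catalyst types before the buggy default mapping runs. The sketch below is a hedged illustration, not the exact fix from those links: the dialect name `OracleNumberDialect` and the fallback precision `DecimalType(38, 10)` are assumptions chosen for the example. It targets the symptom described in this issue, where an unconstrained Oracle NUMBER column is reported over JDBC with precision 0 and scale -127.

```scala
import java.sql.Types
import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}
import org.apache.spark.sql.types.{DataType, DecimalType, MetadataBuilder}

// Hypothetical dialect (names and bounds are this example's assumptions):
// when Oracle reports a NUMBER with no declared precision, substitute an
// explicit DecimalType rather than letting schema inference use
// precision 0 / scale -127, which triggers the "Overflowed precision" errors.
object OracleNumberDialect extends JdbcDialect {

  override def canHandle(url: String): Boolean =
    url.startsWith("jdbc:oracle")

  override def getCatalystType(
      sqlType: Int,
      typeName: String,
      size: Int,
      md: MetadataBuilder): Option[DataType] = {
    // `size` carries the reported precision; an unconstrained NUMBER
    // arrives here as 0.
    if (sqlType == Types.NUMERIC && size == 0) {
      Some(DecimalType(38, 10)) // assumed fallback precision/scale
    } else {
      None // defer to Spark's default mapping for everything else
    }
  }
}

// Register the dialect before reading from the Oracle source:
JdbcDialects.registerDialect(OracleNumberDialect)
```

The fallback values must be wide enough for the actual data; if the real values exceed the chosen precision, the same overflow errors would reappear at read time.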
[jira] [Commented] (SPARK-10648) Spark-SQL JDBC fails to set a default precision and scale when they are not defined in an Oracle schema.

[ https://issues.apache.org/jira/browse/SPARK-10648?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14791081#comment-14791081 ]

Apache Spark commented on SPARK-10648:
--------------------------------------

User 'travishegner' has created a pull request for this issue:
https://github.com/apache/spark/pull/8780