Hi,

I think you got this error because your table uses `NUMERIC` (Oracle `NUMBER`) columns: for that type, `OracleDialect.getCatalystType` looks up a "scale" key in the column metadata, and that lookup is exactly where your stack trace fails (
https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/jdbc/OracleDialect.scala#L32).
So, IIUC, avoiding that type is a workaround.
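
For example, one rough way to avoid it is to push down a subquery that converts the NUMBER columns to a non-numeric type on the Oracle side (e.g. with TO_CHAR) and cast them back in Spark. This is only a sketch; FORECAST_AMT and BILL_DT below are made-up column names, not from your actual table:

# Reuses the connection settings from your snippet; the column names are hypothetical.
url = "jdbc:oracle:thin:@mpimpclu1-scan:1521/DEVAIM"
props = {"user": "bal", "password": "bal", "driver": "oracle.jdbc.OracleDriver"}

# A parenthesized subquery with an alias is accepted wherever a table name goes,
# so the TO_CHAR conversion happens inside Oracle and Spark never sees a NUMERIC column.
query = """(SELECT TO_CHAR(FORECAST_AMT) AS FORECAST_AMT, BILL_DT
            FROM HIST_FORECAST_NEXT_BILL_DGTL) t"""
df = sqlContext.read.jdbc(url=url, table=query, properties=props)

# Cast back to a numeric type on the Spark side if you need arithmetic.
df = df.withColumn("FORECAST_AMT", df["FORECAST_AMT"].cast("decimal(38,10)"))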

// maropu


On Fri, Jan 27, 2017 at 8:18 AM, ayan guha <guha.a...@gmail.com> wrote:

> Hi
>
> I am facing the exact issue with Oracle/Exadata as mentioned here
> <http://stackoverflow.com/questions/41873449/sparksql-key-not-found-scale>.
> Any idea? I could not figure it out, so I am sending it to this group hoping
> someone has seen it (and solved it).
>
> Spark Version: 1.6
> pyspark command:
>
> pyspark --driver-class-path \
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/kvclient.jar:\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/ojdbc7.jar:\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/ojdbc7-orig.jar:\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/oracle-hadoop-sql.jar:\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/ora-hadoop-common.jar:\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/ora-hadoop-common-orig.jar:\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/orahivedp.jar:\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/orahivedp-orig.jar:\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/orai18n.jar:\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/orai18n-orig.jar:\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/oraloader.jar:\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/oraloader-orig.jar \
> --conf spark.jars=\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/oracle-hadoop-sql.jar,\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/ora-hadoop-common.jar,\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/orahivedp.jar,\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/oraloader.jar,\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/ojdbc7.jar,\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/orai18n.jar/opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/kvclient.jar,\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/ojdbc7.jar,\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/ojdbc7-orig.jar,\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/oracle-hadoop-sql.jar,\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/ora-hadoop-common.jar,\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/ora-hadoop-common-orig.jar,\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/orahivedp.jar,\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/orahivedp-orig.jar,\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/orai18n.jar,\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/orai18n-orig.jar,\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/oraloader.jar,\
> /opt/oracle/bigdatasql/bdcell-12.1/jlib-bds/oraloader-orig.jar
>
>
> Here is my code:
>
> url="jdbc:oracle:thin:@mpimpclu1-scan:1521/DEVAIM"
> table = "HIST_FORECAST_NEXT_BILL_DGTL"
> user = "bal"
> password= "bal"
> driver="oracle.jdbc.OracleDriver"
> df = sqlContext.read.jdbc(url=url, table=table,
>                           properties={"user": user, "password": password, "driver": driver})
>
>
> Error:
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
>   File "/opt/cloudera/parcels/CDH-5.8.3-1.cdh5.8.3.p2001.2081/lib/
> spark/python/pyspark/sql/readwriter.py", line 289, in jdbc
>     return self._df(self._jreader.jdbc(url, table, jprop))
>   File "/opt/cloudera/parcels/CDH-5.8.3-1.cdh5.8.3.p2001.2081/lib/
> spark/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 813, in
> __call__
>   File "/opt/cloudera/parcels/CDH-5.8.3-1.cdh5.8.3.p2001.2081/lib/
> spark/python/pyspark/sql/utils.py", line 45, in deco
>     return f(*a, **kw)
>   File "/opt/cloudera/parcels/CDH-5.8.3-1.cdh5.8.3.p2001.2081/lib/
> spark/python/lib/py4j-0.9-src.zip/py4j/protocol.py", line 308, in
> get_return_value
> py4j.protocol.Py4JJavaError: An error occurred while calling o40.jdbc.
> : java.util.NoSuchElementException: key not found: scale
>         at scala.collection.MapLike$class.default(MapLike.scala:228)
>         at scala.collection.AbstractMap.default(Map.scala:58)
>         at scala.collection.MapLike$class.apply(MapLike.scala:141)
>         at scala.collection.AbstractMap.apply(Map.scala:58)
>         at org.apache.spark.sql.types.Metadata.get(Metadata.scala:108)
>         at org.apache.spark.sql.types.Metadata.getLong(Metadata.scala:51)
>         at org.apache.spark.sql.jdbc.OracleDialect$.
> getCatalystType(OracleDialect.scala:33)
>         at org.apache.spark.sql.execution.datasources.jdbc.
> JDBCRDD$.resolveTable(JDBCRDD.scala:140)
>         at org.apache.spark.sql.execution.datasources.jdbc.
> JDBCRelation.<init>(JDBCRelation.scala:91)
>         at org.apache.spark.sql.DataFrameReader.jdbc(
> DataFrameReader.scala:222)
>         at org.apache.spark.sql.DataFrameReader.jdbc(
> DataFrameReader.scala:146)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(
> NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(
> DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
>         at py4j.reflection.ReflectionEngine.invoke(
> ReflectionEngine.java:381)
>         at py4j.Gateway.invoke(Gateway.java:259)
>         at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.
> java:133)
>         at py4j.commands.CallCommand.execute(CallCommand.java:79)
>         at py4j.GatewayConnection.run(GatewayConnection.java:209)
>         at java.lang.Thread.run(Thread.java:745)
>
>
>
> --
> Best Regards,
> Ayan Guha
>



-- 
---
Takeshi Yamamuro
