[ https://issues.apache.org/jira/browse/SPARK-21686?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun resolved SPARK-21686.
-----------------------------------
    Resolution: Duplicate

> spark.sql.hive.convertMetastoreOrc is causing NullPointerException while 
> reading ORC tables
> -------------------------------------------------------------------------------------------
>
>                 Key: SPARK-21686
>                 URL: https://issues.apache.org/jira/browse/SPARK-21686
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 1.6.1
>         Environment: spark_2_4_2_0_258-1.6.1.2.4.2.0-258.el6.noarch 
> spark_2_4_2_0_258-python-1.6.1.2.4.2.0-258.el6.noarch 
> spark_2_4_2_0_258-yarn-shuffle-1.6.1.2.4.2.0-258.el6.noarch
> RHEL-7 (64-Bit)
> JDK 1.8
>            Reporter: Ernani Pereira de Mattos Junior
>
> The issue is very similar to SPARK-10304: a Spark SQL query throws a 
> NullPointerException. 
> >>> sqlContext.sql('select * from core_next.spark_categorization').show(57) 
> 17/06/19 11:26:54 ERROR Executor: Exception in task 2.0 in stage 21.0 (TID 48) 
> java.lang.NullPointerException 
> at org.apache.spark.sql.hive.HiveInspectors$class.unwrapperFor(HiveInspectors.scala:488) 
> at org.apache.spark.sql.hive.orc.OrcTableScan.unwrapperFor(OrcRelation.scala:244) 
> at org.apache.spark.sql.hive.orc.OrcTableScan$$anonfun$org$apache$spark$sql$hive$orc$OrcTableScan$$fillObject$1$$anonfun$6.apply(OrcRelation.scala:275) 
> at org.apache.spark.sql.hive.orc.OrcTableScan$$anonfun$org$apache$spark$sql$hive$orc$OrcTableScan$$fillObject$1$$anonfun$6.apply(OrcRelation.scala:275) 
> Turning off the ORC optimization resolved the issue: 
> sqlContext.setConf("spark.sql.hive.convertMetastoreOrc", "false")


