[ 
https://issues.apache.org/jira/browse/SPARK-3991?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sean Owen updated SPARK-3991:
-----------------------------
    Priority: Major  (was: Blocker)

Downgrading until it's clear what the issue is. There are several items here.

1. This sounds like the same issue raised in SPARK-4944
2. You might need to provide more info, such as the nature of the join
3. This sounds related to SPARK-3914, maybe solved by it

I suggest tracking one issue per JIRA. If one of these is still relevant and 
not a duplicate, this issue can change to track it; if more than one remains, 
track one here and open a separate JIRA for each of the others.

> Not Serializable, NullPointerException in SQL server mode
> -----------------------------------------------------------
>
>                 Key: SPARK-3991
>                 URL: https://issues.apache.org/jira/browse/SPARK-3991
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.1.0
>            Reporter: eblaas
>         Attachments: not_serializable_exception.patch
>
>
> I'm working on connecting Mondrian with Spark SQL via JDBC. Good news, it 
> works but there are some bugs to fix.
> I customized the HiveThriftServer2 class to load, transform and register 
> tables (ETL) with the HiveContext. Data tables are generated from Cassandra 
> and from a relational database.
> * 1st problem:
> hiveContext.registerRDDAsTable(treeSchema, "tree") does not register the 
> table in the Hive metastore ("show tables;" via JDBC does not list the 
> table, but I can still query it, e.g. "select * from tree"). Dirty 
> workaround: create a table with the same name and schema; this was 
> necessary because Mondrian validates table existence:
> hiveContext.sql("CREATE TABLE tree (dp_id BIGINT, h1 STRING, h2 STRING, h3 
> STRING)")
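> A minimal sketch of the workaround, assuming the Spark 1.1 HiveContext API 
> (the SchemaRDD and table name are from my setup):
>
>     // registers the SchemaRDD for SQL queries, but the metastore
>     // never learns about it, so "show tables;" stays empty
>     hiveContext.registerRDDAsTable(treeSchema, "tree")
>     // workaround: create an empty Hive table with the same name and
>     // schema so Mondrian's table-existence check passes over JDBC
>     hiveContext.sql(
>       "CREATE TABLE tree (dp_id BIGINT, h1 STRING, h2 STRING, h3 STRING)")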
> * 2nd problem:
> Mondrian creates complex joins, which result in serialization exceptions.
> Two classes in hiveUdfs.scala have to be made serializable:
> - DeferredObjectAdapter and HiveGenericUdaf
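> The fix is the usual one for anything Spark ships to executors as part of 
> a task closure, a sketch (class names are from hiveUdfs.scala; I've elided 
> their actual base types):
>
>     // before: the join plan references these wrappers, so Spark fails
>     // with java.io.NotSerializableException when shipping tasks
>     class DeferredObjectAdapter extends /* existing base types */
>     // after: mixing in Serializable lets Java serialization ship them
>     class DeferredObjectAdapter extends /* existing base types */
>       with Serializable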
> * 3rd problem:
> NullPointerException in InMemoryRelation, line 42:
> override lazy val statistics = Statistics(sizeInBytes = 
> child.sqlContext.defaultSizeInBytes)
> The sqlContext in child was null; quick fix: use a default value instead 
> of reading it from the context:
> override lazy val statistics = Statistics(sizeInBytes = 10000)
> I'm not sure how to fix these bugs properly, but with the patch file it 
> works, at least.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
