Hello,

I am encountering a new problem with Spark 2.0.1 that didn't happen with
Spark 1.6.x.

These SQL statements ran successfully through the Spark Thrift server in 1.6.x:


> CREATE TABLE test2 USING solr OPTIONS (zkhost "localhost:9987", collection
> "test", fields "id" );
>
> CREATE TABLE test_stored STORED AS PARQUET LOCATION
>  '/Users/kiran/spark/test.parquet' AS SELECT * FROM test;


but with Spark 2.0.x, the second statement throws the error below:


> CREATE TABLE test_stored1 STORED AS PARQUET LOCATION
> '/Users/kiran/spark/test.parquet' AS SELECT * FROM test2;

> Error: java.lang.AssertionError: assertion failed: No plan for
> InsertIntoTable Relation[id#3] parquet, true, false
> +- Relation[id#2] com.lucidworks.spark.SolrRelation@57d735e9
> (state=,code=0)


The full stack trace is at
https://gist.github.com/kiranchitturi/8b3637723e0887f31917f405ef1425a1

The SolrRelation class is at
https://github.com/lucidworks/spark-solr/blob/master/src/main/scala/com/lucidworks/spark/SolrRelation.scala

This error message doesn't seem very meaningful to me, and I am not sure
how to track it down or fix it. Is there something I need to implement
in the SolrRelation class to be able to create Parquet tables from Solr
tables?
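For what it's worth, the same operation expressed through the DataFrame API (which I assume takes a different planning path than the SQL CTAS statement; the option names below just mirror the OPTIONS clause from the CREATE TABLE above) would look roughly like this:

```scala
// Sketch only: assumes a running SparkSession named `spark` with the
// spark-solr package on the classpath, and the same Solr collection
// as in the CREATE TABLE statement above.
val df = spark.read
  .format("solr")
  .option("zkhost", "localhost:9987")
  .option("collection", "test")
  .option("fields", "id")
  .load()

// Write straight to Parquet, bypassing the SQL CTAS code path.
df.write.parquet("/Users/kiran/spark/test.parquet")
```

I can try this as a workaround, but I'd still like to understand why the SQL statement itself stopped working in 2.0.x.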

Looking forward to your suggestions.

Thanks,
-- 
Kiran Chitturi
