I get the same error with the JDBC Datasource as well

0: jdbc:hive2://localhost:10000> CREATE TABLE jtest USING jdbc OPTIONS
("url" "jdbc:mysql://localhost/test", "driver" "com.mysql.jdbc.Driver",
"dbtable" "stats");
+---------+--+
| Result  |
+---------+--+
+---------+--+
No rows selected (0.156 seconds)


0: jdbc:hive2://localhost:10000> CREATE TABLE test_stored STORED AS PARQUET
LOCATION '/Users/kiran/spark/test5.parquet' AS SELECT * FROM jtest;
Error: java.lang.AssertionError: assertion failed: No plan for
InsertIntoTable
Relation[id#14,stat_repository_type#15,stat_repository_id#16,stat_holder_type#17,stat_holder_id#18,stat_coverage_type#19,stat_coverage_id#20,stat_membership_type#21,stat_membership_id#22,context#23]
parquet, true, false
+- Relation[id#4,stat_repository_type#5,stat_repository_id#6,stat_holder_type#7,stat_holder_id#8,stat_coverage_type#9,stat_coverage_id#10,stat_membership_type#11,stat_membership_id#12,context#13]
JDBCRelation(stats) (state=,code=0)

JDBCRelation also extends BaseRelation. Is there any workaround for
datasources that extend BaseRelation?
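One workaround I am considering (untested, and assuming the failure is specific to the Hive `STORED AS PARQUET` CTAS path): use the datasource-native `USING parquet` syntax instead, which should plan the write through the data source API rather than Hive's InsertIntoTable. The table and path names below are just placeholders:

```
-- Hedged sketch: datasource CTAS instead of the Hive STORED AS path
CREATE TABLE jtest_parquet USING parquet
OPTIONS (path '/Users/kiran/spark/test5.parquet')
AS SELECT * FROM jtest;
```

If this works, it would suggest the assertion only fires when the Hive write path is asked to plan an insert over a non-Hive relation.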



On Sun, Nov 6, 2016 at 8:08 PM, Kiran Chitturi <
kiran.chitt...@lucidworks.com> wrote:

> Hello,
>
> I am encountering a new problem with Spark 2.0.1 that didn't happen with
> Spark 1.6.x.
>
> These SQL statements ran successfully on the spark-thrift-server in 1.6.x:
>
>
>> CREATE TABLE test2 USING solr OPTIONS (zkhost "localhost:9987",
>> collection "test", fields "id" );
>>
>> CREATE TABLE test_stored STORED AS PARQUET LOCATION
>>  '/Users/kiran/spark/test.parquet' AS SELECT * FROM test;
>
>
> but with Spark 2.0.x, the last statement throws the error below:
>
>
>> CREATE TABLE test_stored1 STORED AS PARQUET LOCATION
>> '/Users/kiran/spark/test.parquet' AS SELECT * FROM test2;
>
> Error: java.lang.AssertionError: assertion failed: No plan for
>> InsertIntoTable Relation[id#3] parquet, true, false
>> +- Relation[id#2] com.lucidworks.spark.SolrRelation@57d735e9
>> (state=,code=0)
>
>
> The full stack trace is at https://gist.github.com/kiranchitturi/
> 8b3637723e0887f31917f405ef1425a1
>
> SolrRelation class (https://github.com/lucidworks/spark-solr/blob/
> master/src/main/scala/com/lucidworks/spark/SolrRelation.scala)
>
> This error message doesn't seem very meaningful to me, and I am not quite
> sure how to track it down or fix it. Is there something I need to implement
> in the SolrRelation class to be able to create Parquet tables from Solr
> tables?
>
> Looking forward to your suggestions.
>
> Thanks,
> --
> Kiran Chitturi
>
>


-- 
Kiran Chitturi
