As far as I can tell, Spark does not support updates to ORC tables.
This is because Spark would need to send heartbeats to the Hive metastore
and maintain them throughout a DML transaction (deletes and updates here),
and that is not implemented.
By the same token, if you have performed DML on an ORC table in Hive
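For reference, Hive-side DML on ORC works only against an ACID table. A minimal example of such a DDL, with illustrative table and column names (the Hive session also needs hive.txn.manager set to DbTxnManager and hive.support.concurrency=true):

```sql
-- Illustrative Hive ACID table: UPDATE/DELETE work only on tables like this.
-- ORC storage, bucketing, and transactional=true are all required (Hive 1.x).
CREATE TABLE agg_5sek (id BIGINT, value DOUBLE)
CLUSTERED BY (id) INTO 4 BUCKETS
STORED AS ORC
TBLPROPERTIES ('transactional'='true');
```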
Thank you for your answer.
I'm using an ORC transactional table right now, but I'm not tied to that.
When I send an SQL statement like the following, where old_5sek_agg and
new_5sek_agg are registered temp tables, I get an exception in Spark. The
same happens without the subselect.
sqlContext.sql("DELETE F
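Since Spark 1.x cannot execute DELETE through sqlContext.sql(), the same effect can usually be had by rewriting the delete as an anti-join and re-saving the result. A rough sketch, assuming a HiveContext named sqlContext, the two temp tables from the question, and a join key column named id (the key name and the result handling are assumptions):

```scala
// Rewrite the intent of
//   DELETE FROM old_5sek_agg WHERE id IN (SELECT id FROM new_5sek_agg)
// as a left outer join that keeps only the unmatched rows, i.e. exactly the
// rows the DELETE would have left behind. (The "left_anti" join type only
// exists from Spark 2.0 on, so we filter on a null match column instead.)
val oldAgg = sqlContext.table("old_5sek_agg")
val newAgg = sqlContext.table("new_5sek_agg")

val survivors = oldAgg
  .join(newAgg, oldAgg("id") === newAgg("id"), "left_outer")
  .where(newAgg("id").isNull)
  .select(oldAgg.columns.map(oldAgg(_)): _*)

// Materialize before reusing the name, otherwise the lazy read and any
// overwrite of the source would race each other.
survivors.cache().count()
survivors.registerTempTable("old_5sek_agg_after_delete")
```

From here the surviving rows can be written wherever the "deleted" table is supposed to live, e.g. with saveAsTable into a fresh table.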
> ibed before and maybe also
> want to send specific CREATE TABLE syntax for columnar store and time
> table.
>
> Thank you very much in advance. I'm a little stuck on this one.
>
> Regards
> Sascha
>
>
>
> --
> View this message in context: http://apache-sp
0.n3.nabble.com/Incremental-Updates-and-custom-SQL-via-JDBC-tp27598.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.