Here is the stack trace. The first part shows the log from when the session is
started in Tableau. It uses the initial SQL option on the data
connection to create the TEMPORARY table myNodeTable.
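For reference, an initial SQL statement like the one described would, in Spark's data sources syntax, look roughly like the following sketch. The parquet path is an assumption; only the table name myNodeTable comes from the thread.

```scala
// Hypothetical equivalent of the Tableau initial SQL, issued through a
// SQLContext. Registers a session-scoped temporary table backed by
// parquet files; the path "/data/nodes.parquet" is an assumption.
sqlContext.sql("""
  CREATE TEMPORARY TABLE myNodeTable
  USING org.apache.spark.sql.parquet
  OPTIONS (path "/data/nodes.parquet")
""")
```

Note that a table created this way is visible only within the session that ran the statement, which matters when a separate process (e.g. the spark-sql CLI) tries to query it.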
Ah, I see. Thanks for providing the error. The problem here is that
temporary tables do not exist in
In 1.2.1 I was persisting a set of parquet files as a table for use by the
spark-sql CLI later on. There was a post here
http://apache-spark-user-list.1001560.n3.nabble.com/persist-table-schema-in-spark-sql-tt16297.html#a16311
by
Michael Armbrust that provided a nice little helper method for dealing
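I don't have the exact helper from that thread in front of me, but in Spark 1.2.x the usual way to persist a set of parquet files as a metastore-backed table looked roughly like this sketch. The path and variable names are assumptions, not from the original post.

```scala
// Hedged sketch, Spark 1.2.x style: load parquet files and persist
// them as a Hive metastore table so the spark-sql CLI can see them
// later. Assumes an existing SparkContext `sc`; the path is hypothetical.
import org.apache.spark.sql.hive.HiveContext

val hc = new HiveContext(sc)
val nodes = hc.parquetFile("/data/nodes.parquet") // load as SchemaRDD
nodes.saveAsTable("myNodeTable")                  // persist schema + data in the metastore
```

Unlike a TEMPORARY table, a table saved this way survives the session and is visible to other clients of the same metastore.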
Hey Todd,
In migrating to 1.3.x I see that the spark.sql.hive.convertMetastoreParquet
option is no longer public, so the above no longer works.
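For context, setting that option does not require any public constant; it can be set by key. A minimal sketch, assuming a HiveContext `hc`:

```scala
// Two equivalent ways to set the option by its string key; hedged sketch.
hc.sql("SET spark.sql.hive.convertMetastoreParquet=false")
// or programmatically:
hc.setConf("spark.sql.hive.convertMetastoreParquet", "false")
```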
This was probably just a typo, but to be clear,
spark.sql.hive.convertMetastoreParquet is still a supported option and
should work. You are correct that