...@spark.apache.org
Cc: user@spark.apache.org
Subject: Re: HiveContext: cache table not supported for partitioned table?
Hi,
In Spark 1.1 HiveContext, I ran a CREATE TABLE command for a partitioned table followed by
a CACHE TABLE command and got a java.sql.SQLSyntaxErrorException: Table/View
'PARTITIONS' does not exist. CACHE TABLE worked fine when the table was not
partitioned.
Can anybody confirm whether caching a partitioned table is supported?
Cache table works with partitioned table.
I guess you're experimenting with a default local metastore and the
metastore_db directory didn't exist in the first place. In that case,
none of the metastore tables/views exist yet, and the error message you
saw is thrown when the PARTITIONS metastore table is accessed before the
local Derby database has created it.
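
To illustrate, here is a minimal sketch of the sequence being discussed, assuming a Spark 1.1 shell where `sc` is the usual SparkContext (table name `logs` and its columns are hypothetical):

```scala
import org.apache.spark.sql.hive.HiveContext

val hc = new HiveContext(sc)

// Create a partitioned Hive table
hc.sql("CREATE TABLE logs (msg STRING) PARTITIONED BY (dt STRING)")

// Touching the metastore first (e.g. via SHOW TABLES) gives the default
// local Derby metastore a chance to create its backing tables, including
// PARTITIONS, before CACHE TABLE needs to read partition metadata.
hc.sql("SHOW TABLES").collect()

// Caching the partitioned table should now succeed
hc.sql("CACHE TABLE logs")
```

This is only a sketch of the reported scenario, not a guaranteed workaround; the underlying issue is whether the metastore_db directory has been initialized before the first query that reads partition metadata.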