Hey Jerrick,
What do you mean by "schema evolution with Hive metastore tables"? Hive
doesn't take schema evolution into account. Could you please give a
concrete use case? Are you trying to write Parquet data with extra
columns into an existing metastore Parquet table?
Cheng
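[Editor's note: the scenario Cheng describes — appending Parquet data that carries extra columns to an existing metastore Parquet table — would look roughly like this in the DataFrame API. This is a sketch only; the table name, path, and column names are hypothetical, and how the extra column is reconciled with the metastore schema depends on the Spark version.]

```scala
import org.apache.spark.sql.hive.HiveContext

// `sc` is an existing SparkContext, e.g. the one provided by spark-shell.
val sqlContext = new HiveContext(sc)

// Suppose the metastore table `events` was created with columns (id, name),
// and the new Parquet data carries an additional `score` column.
val newData = sqlContext.read.parquet("/data/events_v2")  // hypothetical path

// Appending writes the new files into the table's location; whether the
// extra column becomes visible through the metastore schema is exactly
// the schema-evolution question being discussed in this thread.
newData.write.mode("append").saveAsTable("events")
```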
On 7/21/15 1:04 AM, Jerrick Hoang wrote:
I'm new to Spark; any ideas would be much appreciated! Thanks
On Sat, Jul 18, 2015 at 11:11 AM, Jerrick Hoang
<jerrickho...@gmail.com> wrote:
Hi all,
I'm aware of the support for schema evolution via the DataFrame API.
Just wondering what the best way would be to deal with schema
evolution for Hive metastore tables. So, say I create a table via the
Spark SQL CLI — how would I deal with Parquet schema evolution?
Thanks,
J
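[Editor's note: for readers following along, the Parquet schema evolution Jerrick refers to is handled on the pure DataFrame side by schema merging, sketched below. Paths are illustrative; in the Spark versions current at the time of this thread (1.3/1.4), merging happens automatically when reading a directory of Parquet files with differing schemas.]

```scala
import sqlContext.implicits._

// Write two Parquet files with overlapping but different schemas
// into two partitions of the same directory.
sc.parallelize(1 to 5).map(i => (i, i * 2)).toDF("single", "double")
  .write.parquet("/tmp/test_table/key=1")
sc.parallelize(6 to 10).map(i => (i, i * 3)).toDF("single", "triple")
  .write.parquet("/tmp/test_table/key=2")

// Reading the parent directory merges the two file schemas: the result
// has columns single, double, triple, plus the partition column key.
val merged = sqlContext.read.parquet("/tmp/test_table")
merged.printSchema()
```

Note that this merging applies to Parquet files read directly; whether and how the merged schema propagates to a table registered in the Hive metastore is the open question of this thread.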