Currently, no, not if you don't want to use Spark SQL's HiveContext. But
we're working on adding partitioning support to the external data sources
API, which will let you create, for example, partitioned Parquet tables
without using Hive.
Cheng
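To illustrate what "partitioned tables without Hive" means in practice: the partitioned layout in question is just a directory convention, where each partition column value becomes a `col=value` directory, and partition values can be recovered from the directory names alone, with no metastore involved. Below is a minimal sketch of that convention in plain Python (file names, column names, and helper functions are illustrative, not Spark's API):

```python
import os
import tempfile

def write_partitioned(base, rows, partition_col):
    """Write rows (dicts) under Hive-style col=value directories."""
    for row in rows:
        part_dir = os.path.join(base, f"{partition_col}={row[partition_col]}")
        os.makedirs(part_dir, exist_ok=True)
        # The partition column is encoded in the path, not in the data file.
        with open(os.path.join(part_dir, "part-0.csv"), "a") as f:
            data = {k: v for k, v in row.items() if k != partition_col}
            f.write(",".join(str(v) for v in data.values()) + "\n")

def discover_partitions(base, partition_col):
    """Recover partition values purely from the directory names."""
    prefix = partition_col + "="
    return sorted(
        d[len(prefix):] for d in os.listdir(base) if d.startswith(prefix)
    )

base = tempfile.mkdtemp()
rows = [
    {"date": "2015-01-26", "event": "click"},
    {"date": "2015-01-26", "event": "view"},
    {"date": "2015-01-27", "event": "click"},
]
write_partitioned(base, rows, "date")
print(discover_partitions(base, "date"))  # ['2015-01-26', '2015-01-27']
```

Because the partition values live in the paths themselves, a reader can prune partitions (e.g. only scan `date=2015-01-27`) without consulting Hive at all, which is the point of adding this support directly to the data sources API.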
On 1/26/15 8:47 AM, Danny Yates wrote:
Thanks Michael.
I'm not actually using Hive at the moment - in fact, I'm trying to
avoid it if I can. I'm just wondering whether Spark has anything
similar I can leverage?
Thanks