This is for Spark 2.0:

If I wanted Hive support on a new SparkSession, I would build it with:

from pyspark.sql import SparkSession

spark = SparkSession \
    .builder \
    .enableHiveSupport() \
    .getOrCreate()

However, the PySpark shell already creates a SparkSession for me, and it
appears to lack Hive support. How can I either:

(a) Add Hive support to an existing SparkSession (a rough sketch of what
I have in mind for this follows below),

or

(b) Configure PySpark so that the SparkSession it creates at startup has
Hive support enabled?
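
For context, here is a minimal sketch of the workaround I am considering
for (a), assuming the shell's auto-created session can simply be stopped
and rebuilt. I have not confirmed this is the intended approach:

# Run inside the pyspark shell, where `spark` already exists.
# Assumption: stopping the auto-created session lets getOrCreate()
# build a fresh one that honors enableHiveSupport().
spark.stop()  # stop the SparkSession (and its SparkContext) the shell created

spark = SparkSession \
    .builder \
    .enableHiveSupport() \
    .getOrCreate()

# Sanity check: as far as I can tell, enableHiveSupport() sets
# spark.sql.catalogImplementation, so this should print "hive"
print(spark.conf.get("spark.sql.catalogImplementation"))

For (b), I wondered whether putting spark.sql.catalogImplementation=hive
in conf/spark-defaults.conf would do it, but I could not find that setting
documented, so I may be off track.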

Thanks!

Apu
