>
> 1. "val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)"
> doesn't work because "HiveContext not a member of
> org.apache.spark.sql.hive". I checked the documentation, and it looks
> like it should still work for spark-2.0.0-preview-bin-hadoop2.7.tgz.
>

HiveContext has been deprecated and moved to a 1.x compatibility package,
which you'll need to include explicitly.  The docs have not been updated
yet.
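
If you mainly need Hive table access, the 2.0 replacement is a
SparkSession built with Hive support.  A minimal sketch for a standalone
app (in the preview's spark-shell a "spark" session is already created
for you), assuming the spark-hive module is on the classpath:

    // Spark 2.0 replacement for HiveContext: a SparkSession with
    // Hive support enabled.
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("hive-example")   // hypothetical app name
      .enableHiveSupport()       // wires in the Hive metastore
      .getOrCreate()

    // Old-style entry point, if existing code still expects one:
    val sqlContext = spark.sqlContext

    spark.sql("SHOW TABLES").show()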


> 2. I also tried the new Spark session, 'spark.table("db.table")'; it
> fails with an HDFS "permission denied": can't write to
> "/user/hive/warehouse".
>

Where are the HDFS configurations located?  We might not be propagating
them correctly any more.
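
In the meantime, one workaround sketch, assuming the preview honors the
spark.sql.warehouse.dir setting (the path below is hypothetical; point it
at a directory your user can write to) and that your hive-site.xml with
the metastore/HDFS settings sits in $SPARK_HOME/conf:

    // Override the warehouse location when building the session so
    // Spark stops trying to write to /user/hive/warehouse.
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .config("spark.sql.warehouse.dir",
              "hdfs:///user/yourname/warehouse")  // hypothetical path
      .enableHiveSupport()
      .getOrCreate()

    spark.table("db.table").show()

The same setting can be passed on the command line, e.g.
./bin/spark-shell --conf spark.sql.warehouse.dir=hdfs:///user/yourname/warehouse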
