I'm running the latest version of Spark with Hadoop 1.x, Scala 2.9.3, and Hive 0.9.0.

When I run the following under Python 2.7:

    from pyspark.sql import HiveContext
    sqlContext = HiveContext(sc)

I get a NameError saying 'sc' is not defined.

On the other hand, 'sc' is available when I start the pyspark shell.

Is there a way to fix this?
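
My guess is that the pyspark shell creates 'sc' for you, while a standalone script has to construct its own SparkContext before building a HiveContext. A minimal sketch of what I mean is below; the "local" master and the app name are placeholders I picked, not values from my actual setup:

    from pyspark import SparkConf, SparkContext
    from pyspark.sql import HiveContext

    # The pyspark shell pre-builds 'sc'; a standalone script must create it.
    # "local" and the app name are placeholders -- adjust for your cluster.
    conf = SparkConf().setMaster("local").setAppName("HiveContextTest")
    sc = SparkContext(conf=conf)

    sqlContext = HiveContext(sc)

Is this the right approach, or is there a cleaner way?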
