If you are talking about a standalone program, have a look at this doc:
https://spark.apache.org/docs/0.9.1/python-programming-guide.html#standalone-programs
Unlike the pyspark shell, a standalone program has to create its own SparkContext before it can build a HiveContext, for example:
from pyspark import SparkConf, SparkContext
from pyspark.sql import HiveContext
conf = (SparkConf()
        .setMaster("local")      # master URL must be a string; replace with your cluster URL
        .setAppName("my_app"))   # SparkContext requires an application name
sc = SparkContext(conf=conf)
sqlContext = HiveContext(sc)
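Once sqlContext exists, you can run HiveQL through it. A minimal sketch, assuming Spark 1.0+ with Hive support and a hypothetical table named src in the metastore:

# Query a Hive table through the HiveContext (src and its columns are placeholders)
rows = sqlContext.sql("SELECT key, value FROM src LIMIT 10").collect()
for row in rows:
    print row   # Python 2 print statement, matching the Python 2.7 setup in the question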
I'm running the latest version of Spark with Hadoop 1.x, Scala 2.9.3, and Hive 0.9.0.
When I use Python 2.7 and run
from pyspark.sql import HiveContext
sqlContext = HiveContext(sc)
I'm getting an error that 'sc' is not defined.
On the other hand, 'sc' is available from the pyspark CLI.
Is there a way to fix it?