Hi all,

I am setting up a system where Spark contexts are created by a web
server, which handles the computation and returns the results. I have the
following code (in Python):

import os
from pyspark import SparkContext

# point Spark at the local installation before creating the context
os.environ['SPARK_HOME'] = "/home/spark/spark-1.0.0-bin-hadoop2/"
sc = SparkContext(master="spark://ip-xx-xx-xx-xx:7077", appName="Simple App")
l = sc.parallelize([1, 2, 3, 4])
c = l.count()

but the last line throws a seemingly unrelated error: 'TypeError: an
integer is required'.

I assume I did not set up the environment properly. I have added SPARK_HOME
and the Py4J sources to the classpath; I'm not sure what is missing.
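In case it matters, this is roughly how I am wiring them up (a sketch of my
setup; the exact Py4J zip name depends on what ships with the Spark build):

import os
import sys

SPARK_HOME = "/home/spark/spark-1.0.0-bin-hadoop2/"
os.environ['SPARK_HOME'] = SPARK_HOME

# make pyspark and the bundled Py4J importable before creating a SparkContext
sys.path.insert(0, os.path.join(SPARK_HOME, "python"))
# zip file name may differ by Spark version
sys.path.insert(0, os.path.join(SPARK_HOME, "python", "lib", "py4j-0.8.1-src.zip"))

from pyspark import SparkContext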

Thanks,




