It says permission denied, so make sure the user ashish has permission on the directory /home/ashish/Downloads/spark-1.1.0-bin-hadoop2.4 and everything under it.
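A quick way to confirm (assuming the standard layout of the 1.1.0 binary distribution, where the Python gateway Popens the launcher under bin/) is:

ls -ld /home/ashish/Downloads/spark-1.1.0-bin-hadoop2.4
ls -l  /home/ashish/Downloads/spark-1.1.0-bin-hadoop2.4/bin/spark-submit   # needs to be readable and executable by ashish

If the tree is owned by another user (say it was extracted as root) or the execute bits are missing, Popen fails with exactly this Errno 13.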
chown -R ashish:ashish /home/ashish/Downloads/spark-1.1.* would do.

Thanks
Best Regards

On Mon, Feb 9, 2015 at 12:47 PM, Ashish Kumar <ashish.ku...@innovaccer.com> wrote:

> Traceback (most recent call last):
>   File "pi.py", line 29, in <module>
>     sc = SparkContext(appName="PythonPi")
>   File "/home/ashish/Downloads/spark-1.1.0-bin-hadoop2.4/python/pyspark/context.py", line 104, in __init__
>     SparkContext._ensure_initialized(self, gateway=gateway)
>   File "/home/ashish/Downloads/spark-1.1.0-bin-hadoop2.4/python/pyspark/context.py", line 211, in _ensure_initialized
>     SparkContext._gateway = gateway or launch_gateway()
>   File "/home/ashish/Downloads/spark-1.1.0-bin-hadoop2.4/python/pyspark/java_gateway.py", line 48, in launch_gateway
>     proc = Popen(command, stdout=PIPE, stdin=PIPE, preexec_fn=preexec_func)
>   File "/usr/lib/python2.7/subprocess.py", line 710, in __init__
>     errread, errwrite)
>   File "/usr/lib/python2.7/subprocess.py", line 1327, in _execute_child
>     raise child_exception
> OSError: [Errno 13] Permission denied