Github user jmdvinodjmd commented on the pull request:

    https://github.com/apache/spark/pull/5447#issuecomment-135462162
  
    I am facing the following problem with C:\spark-1.4.1-bin-hadoop2.6.
    Some people mentioned that this was fixed in 1.4, but I am still getting
    this error on Windows 7. Jarl Haggerty reported this problem in
    https://issues.apache.org/jira/browse/SPARK-6568, and several other
    people have discussed the same exception in PR 5447.
    
    Exception-
    --------------------------
    Exception                                 Traceback (most recent call last)
    <ipython-input-1-3d712d40f587> in <module>()
         10 
         11 # Initialize PySpark to predefine the SparkContext variable 'sc'
    ---> 12 execfile(os.path.join(spark_home, 'python/pyspark/shell.py'))
    
    C:\spark-1.4.1-bin-hadoop2.6\python/pyspark/shell.py in <module>()
         41     SparkContext.setSystemProperty("spark.executor.uri", os.environ["SPARK_EXECUTOR_URI"])
         42 
    ---> 43 sc = SparkContext(appName="PySparkShell", pyFiles=add_files)
         44 atexit.register(lambda: sc.stop())
         45 
    
    C:\spark-1.4.1-bin-hadoop2.6/python\pyspark\context.pyc in __init__(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, gateway, jsc, profiler_cls)
        108         """
        109         self._callsite = first_spark_call() or CallSite(None, None, None)
    --> 110         SparkContext._ensure_initialized(self, gateway=gateway)
        111         try:
        112             self._do_init(master, appName, sparkHome, pyFiles, environment, batchSize, serializer,
    
    C:\spark-1.4.1-bin-hadoop2.6/python\pyspark\context.pyc in _ensure_initialized(cls, instance, gateway)
        227         with SparkContext._lock:
        228             if not SparkContext._gateway:
    --> 229                 SparkContext._gateway = gateway or launch_gateway()
        230                 SparkContext._jvm = SparkContext._gateway.jvm
        231 
    
    C:\spark-1.4.1-bin-hadoop2.6/python\pyspark\java_gateway.pyc in launch_gateway()
         87                 callback_socket.close()
         88         if gateway_port is None:
    ---> 89             raise Exception("Java gateway process exited before sending the driver its port number")
         90 
         91         # In Windows, ensure the Java child processes do not linger after Python has exited.
    
    Exception: Java gateway process exited before sending the driver its port number
    ---------------------------------------------------
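    
    From the traceback, the failure is inside launch_gateway in
    pyspark/java_gateway.py: PySpark spawns the JVM (via spark-submit) as a
    child process and then waits for it to connect back on a callback socket
    and report the port its Py4J gateway is listening on. If the Java process
    exits, or never starts, before reporting, gateway_port stays None and the
    exception above is raised. The pattern is roughly the following; this is
    a minimal sketch, not Spark's actual code, and launch_gateway_sketch is a
    made-up name:
    
    import socket
    import struct
    import subprocess
    
    def launch_gateway_sketch(spark_submit_cmd):
        # Listen on an ephemeral local port; the JVM child is expected to
        # connect back and report the port its Py4J gateway listens on.
        # (The real code passes this callback port to spark-submit through
        # the environment.)
        callback_socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        callback_socket.bind(("127.0.0.1", 0))
        callback_socket.listen(1)
        callback_socket.settimeout(10)  # illustrative timeout, not Spark's
    
        subprocess.Popen(spark_submit_cmd)  # e.g. ["spark-submit", ...]
    
        gateway_port = None
        try:
            conn, _ = callback_socket.accept()
            data = conn.recv(4)              # child writes its port as 4 bytes
            if len(data) == 4:
                gateway_port = struct.unpack("!i", data)[0]
            conn.close()
        except socket.timeout:
            pass                             # child died or never connected
        finally:
            callback_socket.close()
    
        if gateway_port is None:
            raise Exception("Java gateway process exited before sending "
                            "the driver its port number")
        return gateway_port
    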
    I get this problem while running the following program in an IPython notebook.
    Program-
    -------
    import os
    import sys
    
    # SPARK_HOME must point at the Spark install, e.g. C:\spark-1.4.1-bin-hadoop2.6
    spark_home = os.environ.get('SPARK_HOME', None)
    if not spark_home:
        raise ValueError('SPARK_HOME environment variable is not set')
    sys.path.insert(0, os.path.join(spark_home, 'python'))
    
    # Add py4j to the path.
    # You may need to change the version number to match your install.
    sys.path.insert(0, os.path.join(spark_home, 'python/lib/py4j-0.8.2.1-src.zip'))
    
    # Initialize PySpark to predefine the SparkContext variable 'sc'
    execfile(os.path.join(spark_home, 'python/pyspark/shell.py'))
    ----------------------------------
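    
    As an aside, a commonly suggested alternative to the manual sys.path
    setup above is the third-party findspark package (pip install findspark),
    though I have not confirmed whether it avoids this particular error:
    
    import findspark
    findspark.init()  # or findspark.init("C:\\spark-1.4.1-bin-hadoop2.6")
    
    import pyspark
    sc = pyspark.SparkContext(appName="PySparkShell")
    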
    
    Can somebody suggest a solution to this problem?
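    
    In the meantime, one quick sanity check (my assumption about the likely
    cause, not a confirmed fix) is to verify that the JVM can be started at
    all from the same notebook environment, since this exception means the
    spark-submit child process died before reporting its port:
    
    import os
    import subprocess
    
    print(os.environ.get("JAVA_HOME"))     # should point at a valid Java install
    print(os.environ.get("SPARK_HOME"))    # should be C:\spark-1.4.1-bin-hadoop2.6
    subprocess.call(["java", "-version"])  # should print a version, not fail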

