pranludi created SPARK-18876:
--------------------------------

             Summary: An error occurred while trying to connect to the Java server
                 Key: SPARK-18876
                 URL: https://issues.apache.org/jira/browse/SPARK-18876
             Project: Spark
          Issue Type: Bug
          Components: PySpark
    Affects Versions: 2.0.2, 2.0.1
         Environment: Python 2.7.12
            Reporter: pranludi
I am running the following code in PySpark, and it fails with this error:

ERROR:py4j.java_gateway:An error occurred while trying to connect to the Java server (127.0.0.1:35918)
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/py4j/java_gateway.py", line 963, in start
    self.socket.connect((self.address, self.port))
  File "/usr/local/lib/python2.7/socket.py", line 228, in meth
    return getattr(self._sock,name)(*args)
error: [Errno 111] Connection refused
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/gamedev/spark-2.0.1-bin-hadoop2.7/python/lib/pyspark.zip/pyspark/sql/dataframe.py", line 419, in coalesce
  File "/usr/local/lib/python2.7/site-packages/py4j/java_gateway.py", line 1131, in __call__
    answer = self.gateway_client.send_command(command)
  File "/usr/local/lib/python2.7/site-packages/py4j/java_gateway.py", line 881, in send_command
    connection = self._get_connection()
  File "/usr/local/lib/python2.7/site-packages/py4j/java_gateway.py", line 829, in _get_connection
    connection = self._create_connection()
  File "/usr/local/lib/python2.7/site-packages/py4j/java_gateway.py", line 835, in _create_connection
    connection.start()
  File "/usr/local/lib/python2.7/site-packages/py4j/java_gateway.py", line 970, in start
    raise Py4JNetworkError(msg, e)
py4j.protocol.Py4JNetworkError: An error occurred while trying to connect to the Java server (127.0.0.1:35918)

---------------------------------------------

I tried Spark versions 2.0.0, 2.0.1, and 2.0.2: there is no problem on 2.0.0, but the error occurs on 2.0.1 and 2.0.2.

Python code:
------
.....
df = spark.read.json('hdfs://big_big_4000000.json')
json_log = []
for log in df.collect():
    jj = {}
    try:
        for f in log.__fields__:
            if f == 'I_LogDes':
                if log[f] is not None:
                    log_des_json = json.loads(log[f])
                    for jf in log_des_json:
                        json_key = add_2(jf)
                        if json_key in jj:
                            json_key = '%s_2' % json_key
                        jj[json_key] = typeIntStr(log_des_json[jf])
            else:
                jj[remove_i(f)] = typeIntStr(log[f])
        json_log.append(jj)
    except:
        print log  # !!! here error occur
df = spark.read.json(spark.sparkContext.parallelize(json_log))

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
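A py4j "Connection refused" from the gateway often means the driver-side JVM has died, and collecting 4,000,000 rows to the driver is a common way to run it out of memory. As a hedged sketch (not a confirmed fix for this issue), the per-record flattening can be written as a pure function and applied with `rdd.map` so the work stays on the executors; the helpers `add_2`, `typeIntStr`, and `remove_i` below are stand-ins, since their real implementations are not shown in the report:

```python
import json

# Stand-in helpers mirroring the names in the report; the real
# implementations are not shown in the issue.
def add_2(key):
    return key.lower()

def type_int_str(value):
    # Coerce values that look like integers to int; keep others as-is.
    try:
        return int(value)
    except (TypeError, ValueError):
        return value

def remove_i(field):
    # Strip the 'I_' prefix used by the report's column names.
    return field[2:] if field.startswith('I_') else field

def flatten_row(row):
    """Flatten one row dict, expanding the embedded I_LogDes JSON string."""
    flat = {}
    for field, value in row.items():
        if field == 'I_LogDes':
            if value is None:
                continue
            for k, v in json.loads(value).items():
                key = add_2(k)
                if key in flat:            # de-duplicate colliding keys
                    key = '%s_2' % key
                flat[key] = type_int_str(v)
        else:
            flat[remove_i(field)] = type_int_str(value)
    return flat

# Applied distributively instead of via collect(), e.g.:
#   flat_rdd = df.rdd.map(lambda r: flatten_row(r.asDict()))
#   df2 = spark.read.json(flat_rdd.map(json.dumps))
```

This keeps the 4M-row loop off the driver entirely; only the final DataFrame construction touches the gateway.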