[ 
https://issues.apache.org/jira/browse/SPARK-13767?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Poonam Agrawal updated SPARK-13767:
-----------------------------------
    Description: 
I am trying to create a SparkContext object with the following commands in pyspark:

from pyspark import SparkContext, SparkConf

conf = (SparkConf()
        .setAppName('App_name')
        .setMaster('spark://local-or-remote-ip:7077')
        .set('spark.cassandra.connection.host', 'cassandra-machine-ip')
        .set('spark.storage.memoryFraction', '0.2')
        .set('spark.rdd.compress', 'true')
        .set('spark.streaming.blockInterval', 500)
        .set('spark.serializer', 'org.apache.spark.serializer.KryoSerializer')
        .set('spark.scheduler.mode', 'FAIR')
        .set('spark.mesos.coarse', 'true'))
sc = SparkContext(conf=conf)


but I am getting the following error:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/spark-1.4.1/python/pyspark/conf.py", line 106, in __init__
    self._jconf = _jvm.SparkConf(loadDefaults)
  File "/usr/local/lib/spark-1.4.1/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 766, in __getattr__
  File "/usr/local/lib/spark-1.4.1/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 362, in send_command
  File "/usr/local/lib/spark-1.4.1/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 318, in _get_connection
  File "/usr/local/lib/spark-1.4.1/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 325, in _create_connection
  File "/usr/local/lib/spark-1.4.1/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 432, in start
py4j.protocol.Py4JNetworkError: An error occurred while trying to connect to the Java server

I am getting the same error when executing just:

conf = SparkConf().setAppName('App_name').setMaster('spark://127.0.0.1:7077')
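Note that the traceback fails inside SparkConf.__init__, i.e. before any master URL is contacted: the pyspark driver cannot open a socket to the local py4j Java gateway it launches, so the failure is local to the driver machine rather than a problem with the spark:// master. A minimal, Spark-free sketch of that kind of connectivity check (host and port here are illustrative assumptions, not values from this report):

```python
import socket

def can_connect(host, port, timeout=1.0):
    """Return True if a TCP connection to (host, port) succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Demonstrate both outcomes on a port we control.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))      # let the OS pick a free port
listener.listen(1)
port = listener.getsockname()[1]

print(can_connect("127.0.0.1", port))  # True: a listener is present

listener.close()
print(can_connect("127.0.0.1", port))  # False: connection refused, the same
                                       # class of failure py4j reports above
```

If a probe like this against the gateway's host/port fails, the JVM side never started or died early, which is consistent with the Py4JNetworkError regardless of which configuration values are set.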





> py4j.protocol.Py4JNetworkError: An error occurred while trying to connect to 
> the Java server
> --------------------------------------------------------------------------------------------
>
>                 Key: SPARK-13767
>                 URL: https://issues.apache.org/jira/browse/SPARK-13767
>             Project: Spark
>          Issue Type: Bug
>            Reporter: Poonam Agrawal
>



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
