[ https://issues.apache.org/jira/browse/SPARK-15909?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Liam Fisk updated SPARK-15909:
------------------------------
    Description: 
PySpark behaves differently when the SparkContext is created within the REPL rather than initialised by the shell.

My conf/spark-env.sh file contains:
{code}
#!/bin/bash
export SPARK_LOCAL_IP=172.20.30.158
export LIBPROCESS_IP=172.20.30.158
export MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.so
{code}
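
A quick way to confirm those exports actually reach the driver process (a stdlib-only check run from the pyspark prompt; the expected values simply mirror the file above):

{code}
import os

# Confirm conf/spark-env.sh was sourced for this Python process.
print(os.environ.get("SPARK_LOCAL_IP"))   # expected: 172.20.30.158
print(os.environ.get("LIBPROCESS_IP"))    # expected: 172.20.30.158
{code}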

Running pyspark correctly initialises my SparkContext. However, when I then run:
{code}
from pyspark import SparkContext, SparkConf

# 'sc' here is the context created by the pyspark shell
sc.stop()

conf = (
    SparkConf()
        .setMaster("mesos://zk://foo:2181/mesos")
        .setAppName("Jupyter PySpark")
)

sc = SparkContext(conf=conf)
{code}
the _spark.driver.uri_ and classpath URL point to localhost, preventing my Mesos cluster from fetching the appropriate files.
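
For what it's worth, explicitly pinning the driver address in the SparkConf looks like a plausible workaround (an untested sketch: spark.driver.host is a standard Spark property, the fallback IP mirrors conf/spark-env.sh, and sc._conf is PySpark-internal):

{code}
import os
from pyspark import SparkContext, SparkConf

sc.stop()

# Pin the driver address explicitly rather than relying on the REPL
# re-reading SPARK_LOCAL_IP; the fallback mirrors conf/spark-env.sh.
driver_ip = os.environ.get("SPARK_LOCAL_IP", "172.20.30.158")

conf = (
    SparkConf()
        .setMaster("mesos://zk://foo:2181/mesos")
        .setAppName("Jupyter PySpark")
        .set("spark.driver.host", driver_ip)
)

sc = SparkContext(conf=conf)

# Inspect what the new context actually resolved (internal API).
print(sc._conf.get("spark.driver.host"))
{code}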


> PySpark classpath uri incorrectly set
> -------------------------------------
>
>                 Key: SPARK-15909
>                 URL: https://issues.apache.org/jira/browse/SPARK-15909
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 1.6.1
>            Reporter: Liam Fisk
>


