[ 
https://issues.apache.org/jira/browse/SPARK-8675?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Min Zhou updated SPARK-8675:
----------------------------
    Description: 
AFAIK, some Spark applications always use LocalBackend for local 
execution; Spark SQL is an example.  Starting a LocalEndpoint won't add 
the user classpath to the executor. 

{noformat}
  override def start() {
    localEndpoint = SparkEnv.get.rpcEnv.setupEndpoint(
      "LocalBackendEndpoint",
      new LocalEndpoint(SparkEnv.get.rpcEnv, scheduler, this, totalCores))
  }
{noformat}

This causes the local executor to fail in scenarios such as loading 
Hadoop's built-in native libraries, loading other user-defined native 
libraries, loading user jars, or reading S3 configuration from a site.xml 
file.
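A minimal sketch of what "adding the user classpath" would involve: turning the entries of a classpath setting such as spark.executor.extraClassPath into a URLClassLoader that a locally created executor could use for user jars. The object and method names below are illustrative, not Spark's actual internals.

```scala
import java.io.File
import java.net.{URL, URLClassLoader}

// Hypothetical sketch: build a class loader from user classpath entries,
// roughly what a locally started executor would need for user jars.
object LocalClasspathSketch {
  // Split a classpath string (e.g. the value of spark.executor.extraClassPath)
  // into per-entry file URLs, skipping empty segments.
  def userClasspathUrls(extraClassPath: String): Seq[URL] =
    extraClassPath
      .split(File.pathSeparator)
      .filter(_.nonEmpty)
      .map(p => new File(p).toURI.toURL)
      .toSeq

  // Put the user entries in a child loader in front of the given parent,
  // so classes from user jars resolve before the executor's own classpath.
  def userClassLoader(extraClassPath: String, parent: ClassLoader): URLClassLoader =
    new URLClassLoader(userClasspathUrls(extraClassPath).toArray, parent)
}
```

Other executor backends apply settings like this when they launch executor JVMs; the point of this issue is that the LocalBackend path skips that step.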
 

  was:
AFAIK, some Spark applications always use LocalBackend for local 
execution; Spark SQL is an example.  Starting a LocalEndpoint won't add 
the user classpath to the executor. 

{noformat}
  override def start() {
    localEndpoint = SparkEnv.get.rpcEnv.setupEndpoint(
      "LocalBackendEndpoint",
      new LocalEndpoint(SparkEnv.get.rpcEnv, scheduler, this, totalCores))
  }
{noformat}

This causes the local executor to fail in scenarios such as loading 
Hadoop's built-in native libraries, loading other user-defined native 
libraries, loading user jars, or reading S3 configuration from a site.xml 
file.
 


> Executors created by LocalBackend won't get the same classpath as other 
> executor backends 
> ------------------------------------------------------------------------------------------
>
>                 Key: SPARK-8675
>                 URL: https://issues.apache.org/jira/browse/SPARK-8675
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.4.0
>            Reporter: Min Zhou
>
> AFAIK, some Spark applications always use LocalBackend for local 
> execution; Spark SQL is an example.  Starting a LocalEndpoint won't add 
> the user classpath to the executor. 
> {noformat}
>   override def start() {
>     localEndpoint = SparkEnv.get.rpcEnv.setupEndpoint(
>       "LocalBackendEndpoint",
>       new LocalEndpoint(SparkEnv.get.rpcEnv, scheduler, this, totalCores))
>   }
> {noformat}
> This causes the local executor to fail in scenarios such as loading 
> Hadoop's built-in native libraries, loading other user-defined native 
> libraries, loading user jars, or reading S3 configuration from a 
> site.xml file.
>  



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
