Hi,
    I am trying to create a new SparkContext via JavaSparkContext() and have 
attempted to pass the requisite jars, but it looks like they aren't getting 
added to the distributed cache automatically. Looking into 
YarnClientSchedulerBackend::start() and ClientArguments, it seems they only 
add SPARK_JAR and APP_JAR. I am wondering what the best way is to add 
additional files to the distributed cache and also have them appear on the 
classpath for ExecutorLauncher.
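
For reference, this is roughly what I am doing (a minimal sketch; the jar 
paths and app name are placeholders, and I am assuming the four-argument 
JavaSparkContext constructor from the Java API):

```java
import org.apache.spark.api.java.JavaSparkContext;

public class LaunchOnYarn {
    public static void main(String[] args) {
        // Placeholder paths: extra dependency jars that I expected to end up
        // in the distributed cache and on the executors' classpath.
        String[] extraJars = {
            "/path/to/dep-a.jar",
            "/path/to/dep-b.jar"
        };

        // Jars are passed to the constructor, but only SPARK_JAR and APP_JAR
        // appear to reach the distributed cache.
        JavaSparkContext sc = new JavaSparkContext(
            "yarn-client",                // master
            "my-app",                     // app name (placeholder)
            System.getenv("SPARK_HOME"),  // Spark home
            extraJars);                   // requisite jars

        // ... job code ...

        sc.stop();
    }
}
```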

Thanks
Srikanth Sundarrajan