[GitHub] spark pull request: [SPARK-1177] Allow SPARK_JAR to be set program...

2014-07-11 Thread dbtsai
Github user dbtsai commented on the pull request: https://github.com/apache/spark/pull/987#issuecomment-48762832 #560 is merged. Close this PR. --- If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well. If your project does not

2014-07-11 Thread dbtsai
Github user dbtsai closed the pull request at: https://github.com/apache/spark/pull/987

2014-06-06 Thread vanzin
Github user vanzin commented on the pull request: https://github.com/apache/spark/pull/987#issuecomment-45357297 I mean you can set system properties the same way. SparkConf initializes its configuration from system properties, so my patch covers not only your case, but also others
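A minimal sketch of the behavior Marcelo describes, assuming a Spark 1.x dependency on the classpath (the jar path is illustrative): SparkConf's default constructor loads every JVM system property prefixed with "spark.", so a property set before construction behaves like a `conf.set()` call.

```scala
// Sketch only: SparkConf(loadDefaults = true), the default, copies any
// "spark."-prefixed JVM system property into the configuration.
// The HDFS path below is illustrative, not from the original thread.
System.setProperty("spark.yarn.jar", "hdfs:///jars/spark-assembly.jar")

val conf = new org.apache.spark.SparkConf() // reads system properties
assert(conf.get("spark.yarn.jar") == "hdfs:///jars/spark-assembly.jar")
```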

2014-06-06 Thread dbtsai
Github user dbtsai commented on the pull request: https://github.com/apache/spark/pull/987#issuecomment-45363846 Got you. Looking forward to having your patch merged. Thanks.

2014-06-05 Thread dbtsai
GitHub user dbtsai opened a pull request: https://github.com/apache/spark/pull/987 [SPARK-1177] Allow SPARK_JAR to be set programmatically in system properties You can merge this pull request into a Git repository by running: $ git pull https://github.com/dbtsai/spark

2014-06-05 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/987#issuecomment-45284869 Build triggered.

2014-06-05 Thread vanzin
Github user vanzin commented on the pull request: https://github.com/apache/spark/pull/987#issuecomment-45284873 https://github.com/apache/spark/pull/560 has what I believe is a better way of handling this.

2014-06-05 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/987#issuecomment-45285273 Build triggered.

2014-06-05 Thread dbtsai
Github user dbtsai commented on the pull request: https://github.com/apache/spark/pull/987#issuecomment-45286460 @chesterxgchen #560 Agree, it's a more thorough way to handle this issue. In the code you have, it seems that the spark jar setting is moved to conf: SparkConf

2014-06-05 Thread vanzin
Github user vanzin commented on the pull request: https://github.com/apache/spark/pull/987#issuecomment-45290220 There's no need to change Client.scala with my change; all you need to do is set spark.yarn.jar somewhere - JVM system property, spark-defaults.conf, or in the app's code
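Concretely, the three options listed here might look like the following (the jar path and file locations are illustrative, not from the original thread):

```
# 1) JVM system property on the launcher process:
#      java -Dspark.yarn.jar=hdfs:///jars/spark-assembly.jar ...
#
# 2) An entry in spark-defaults.conf:
spark.yarn.jar    hdfs:///jars/spark-assembly.jar
#
# 3) In the application's code (client mode), before the SparkContext exists:
#      sparkConf.set("spark.yarn.jar", "hdfs:///jars/spark-assembly.jar")
```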

2014-06-05 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/987#issuecomment-45292661 Build started.

2014-06-05 Thread dbtsai
Github user dbtsai commented on the pull request: https://github.com/apache/spark/pull/987#issuecomment-45292804 The app's code will only run in the application master in yarn-cluster mode, so how can the YARN client know which jar will be submitted to the distributed cache if we set it in the

2014-06-05 Thread vanzin
Github user vanzin commented on the pull request: https://github.com/apache/spark/pull/987#issuecomment-45292958 Ok, in cluster mode you can't use SparkConf.set(), but the other two options work fine. You can't do System.setProperty() in cluster mode to achieve that either, so even

2014-06-05 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/987#issuecomment-45293874 Build finished.

2014-06-05 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/987#issuecomment-45293876 Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/15489/

2014-06-05 Thread dbtsai
Github user dbtsai commented on the pull request: https://github.com/apache/spark/pull/987#issuecomment-45296471 We launch Spark jobs inside our Tomcat, and we directly use the Client.scala API. With my patch, I can set up the Spark jar using System.setProperty() before val
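The embedded-launch pattern described here could be sketched roughly as follows. This is a hypothetical sketch, not dbtsai's actual code: the Client/ClientArguments names are approximate Spark 1.x YARN-client API shapes, and `appArgs` and all paths are made up for illustration.

```scala
// Hypothetical sketch of launching a YARN job from inside a host app
// (e.g. a Tomcat servlet) by driving Client.scala directly.
// Set the assembly-jar property before the YARN client reads it.
System.setProperty("spark.jar", "hdfs:///jars/spark-assembly.jar")

val sparkConf = new org.apache.spark.SparkConf()
// Illustrative arguments; real values depend on the application.
val appArgs = Array("--jar", "app.jar", "--class", "com.example.Main")
val args = new org.apache.spark.deploy.yarn.ClientArguments(appArgs, sparkConf)
new org.apache.spark.deploy.yarn.Client(args, sparkConf).run()
```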