[GitHub] spark pull request: SPARK-1588. Restore SPARK_YARN_USER_ENV and SP...
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/586#issuecomment-41645419 Merged build triggered. --- If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well. If your project does not have this feature enabled and wishes so, or if the feature is enabled but not working, please contact infrastructure at infrastruct...@apache.org or file a JIRA ticket with INFRA. ---
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/586#issuecomment-41645712 Merged build started.
Github user sryza commented on the pull request: https://github.com/apache/spark/pull/586#issuecomment-41645734 Uploaded a patch that follows your comments. I tested it against YARN and verified that the opts and environment variables both show up. Right now when I try running spark-shell with your Java opt string, I get a java.lang.NoClassDefFoundError: App. It seems like spark-class isn't able to handle it, which is at least reassuring that we're not breaking previous behavior. I haven't tried yet to think more generally about whether there are strings that would work for spark-class but not work with the Spark YARN code.
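The failure mode described above is what happens when an opts string containing a quoted, space-containing value is split naively on whitespace: the stray token after the torn quote gets treated as a main-class name. A minimal sketch (this is not Spark's actual parser, just an illustration of the splitting problem):

```scala
// Minimal sketch of why quoted values in a SPARK_JAVA_OPTS-style string
// break under naive whitespace splitting. Not Spark's real code.
object OptsSplitDemo {
  // Naive approach: split on runs of whitespace, ignoring quotes entirely.
  def naiveSplit(opts: String): Seq[String] =
    opts.trim.split("\\s+").toSeq

  def main(args: Array[String]): Unit = {
    val opts = """-Xmx512m -Dspark.app.name="My App""""
    // The quoted value is torn into two tokens. If the trailing token
    // (App") reaches the java launcher as a bare argument, it is taken as
    // the main class, producing errors like NoClassDefFoundError: App.
    val tokens = naiveSplit(opts)
    println(tokens.mkString(" | "))
    // -Xmx512m | -Dspark.app.name="My | App"
  }
}
```

A quote-aware tokenizer (or passing options as a list rather than a single string) avoids the problem, which is roughly the direction the later discussion about env vars vs. JAVA_OPTS points at.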
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/586#issuecomment-41646168 Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/14558/
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/586#issuecomment-41646166 Merged build finished.
Github user pwendell commented on the pull request: https://github.com/apache/spark/pull/586#issuecomment-41646542 @sryza so are you suggesting that `spark-class` previously didn't work if SPARK_JAVA_OPTS had a quoted option? Are you just testing against master or against before we changed the yarn configuration stuff? I'm just worried this has regressed since 0.9.
Github user sryza commented on the pull request: https://github.com/apache/spark/pull/586#issuecomment-41649723 This is indeed what I am suggesting. I had been testing against master, but just tried against 0.9.0 and hit the same error.
Github user tgravescs commented on the pull request: https://github.com/apache/spark/pull/586#issuecomment-41693222 The fix here works for me, with the exception of the spark.authenticate config option, which we are going to have to handle differently since it needs to be set properly before the executor can register with the driver.
Github user tgravescs commented on the pull request: https://github.com/apache/spark/pull/586#issuecomment-41693617 @pwendell There isn't any config that was added to replace SPARK_YARN_USER_ENV was there? I didn't see one but wanted to make sure I didn't miss it. We should file a JIRA to add one if not.
Github user pwendell commented on the pull request: https://github.com/apache/spark/pull/586#issuecomment-41699497 @sryza @tgravescs Tom - if `spark.authenticate` is true, then is it not getting set directly when the executor is launched: https://github.com/apache/spark/blob/master/yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ExecutorRunnableUtil.scala#L55 ? Is there an ordering issue here, or maybe just a bug in the way it's propagated? Right now there isn't any SparkConf equivalent to SPARK_YARN_USER_ENV, so I think we can just leave it for now. The other deployment modes don't currently allow setting arbitrary env vars. Maybe in the future we can generalize it to a conf option (and of course keep backwards support), but I'm fine to leave it as-is for now. Re: quotes, ah, I guess the quoted stuff didn't work in 0.9.0 either. We should try to fix it for 1.1, but it seems fine to leave it out in that case. We could consider using internal env vars for this when we transfer them, rather than setting system properties in JAVA_OPTS. Not totally sure what's best.
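For context on what SPARK_YARN_USER_ENV carries: it is a comma-separated list of KEY=VALUE pairs that gets merged into the environment of the YARN containers. A rough sketch of that parsing (the function name and exact behavior here are illustrative, not Spark's actual implementation):

```scala
// Illustrative parser for a SPARK_YARN_USER_ENV-style string of the form
// "K1=V1,K2=V2", producing the env map handed to YARN containers.
// Not the actual Spark implementation.
object UserEnvDemo {
  def parseUserEnv(input: String): Map[String, String] =
    input.split(",").iterator
      .map(_.trim)
      .filter(_.nonEmpty)
      .map { kv =>
        // Split on the first '=' only, so values may themselves contain '='.
        val Array(k, v) = kv.split("=", 2)
        k -> v
      }
      .toMap

  def main(args: Array[String]): Unit = {
    val env = parseUserEnv("SPARK_WORKER_DIR=/tmp/work, PYTHONPATH=/opt/py")
    println(env)
  }
}
```

A comma-separated format like this is simple but cannot express values that contain commas, which is one reason a structured SparkConf option would eventually be a cleaner replacement.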
Github user tgravescs commented on the pull request: https://github.com/apache/spark/pull/586#issuecomment-41703827 @pwendell perhaps I'm missing something, but the line you point to is for Java options, not for Spark configs. SparkConf errors out if you try to put a -Dspark.authenticate in the java options, right?
Github user pwendell commented on the pull request: https://github.com/apache/spark/pull/586#issuecomment-41714679 @tgravescs ah I see, you're right. I think I assumed incorrectly that the executor launcher would bundle up the options and send them over, but I don't actually see that happening anywhere. So this part of the code is actually not used: https://github.com/apache/spark/blob/df6d81425bf3b8830988288069f6863de873aee2/yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ClientBase.scala#L328 What happens is the executor is just getting its configuration from the driver when the executor launches. And that works in _most_ cases except for security, which it needs to know about before connecting. Is that right?
Github user pwendell commented on the pull request: https://github.com/apache/spark/pull/586#issuecomment-41715831 @tgravescs how did this work in YARN before? Was the assumption that `-Dspark.authenticate=true` was in SPARK_JAVA_OPTS and then it just got threaded around everywhere properly?
Github user pwendell commented on the pull request: https://github.com/apache/spark/pull/586#issuecomment-41723829 Okay, let's take the discussion about security stuff to: https://issues.apache.org/jira/browse/SPARK-1569 In the meantime, this looks good to me, so we can merge it.
Github user asfgit closed the pull request at: https://github.com/apache/spark/pull/586
GitHub user sryza opened a pull request: https://github.com/apache/spark/pull/586

SPARK-1588. Restore SPARK_YARN_USER_ENV and SPARK_JAVA_OPTS for YARN.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/sryza/spark sandy-spark-1588

Alternatively you can review and apply these changes as the patch at: https://github.com/apache/spark/pull/586.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #586

commit b3616847bded125bbec7aebe3e67e0a718dadcc2
Author: Sandy Ryza <sa...@cloudera.com>
Date: 2014-04-29T00:54:06Z

    SPARK-1588. Restore SPARK_YARN_USER_ENV and SPARK_JAVA_OPTS for YARN.