[GitHub] spark pull request #19102: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

2017-09-05 Thread lgrcyanny
Github user lgrcyanny closed the pull request at:

https://github.com/apache/spark/pull/19102





[GitHub] spark pull request #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

2017-09-03 Thread lgrcyanny
Github user lgrcyanny closed the pull request at:

https://github.com/apache/spark/pull/19079





[GitHub] spark pull request #19102: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

2017-09-03 Thread lgrcyanny
Github user lgrcyanny commented on a diff in the pull request:

https://github.com/apache/spark/pull/19102#discussion_r136738525
  
--- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
@@ -496,7 +496,7 @@ object SparkSubmit extends CommandLineUtils with Logging {
         sysProp = "spark.executor.memory"),
       OptionAssigner(args.totalExecutorCores, STANDALONE | MESOS, ALL_DEPLOY_MODES,
         sysProp = "spark.cores.max"),
-      OptionAssigner(args.files, LOCAL | STANDALONE | MESOS, ALL_DEPLOY_MODES,
+      OptionAssigner(args.files, ALL_CLUSTER_MGRS, ALL_DEPLOY_MODES,
--- End diff --

For yarn-client mode, --files are already added to "spark.yarn.dist.files". I agree with you that it is enough to just addFile in SparkContext for the "spark.yarn.dist.files" entries in yarn-client mode (a rough sketch follows below). BTW, I will fix the spark-submit doc as well.
Thanks @vanzin
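
A minimal sketch of that idea, assuming an existing SparkContext `sc` and the "spark.yarn.dist.files" setting from the discussion above; the helper name is hypothetical and this is not the actual SparkContext code:

```scala
import org.apache.spark.SparkContext

// Sketch only: in yarn-client mode, register each file listed in spark.yarn.dist.files
// with the driver's file server so SparkFiles.get can resolve it on the driver too.
def addYarnDistFilesOnDriver(sc: SparkContext): Unit = {
  val deployMode = sc.getConf.getOption("spark.submit.deployMode").getOrElse("client")
  if (sc.master.startsWith("yarn") && deployMode == "client") {
    sc.getConf.getOption("spark.yarn.dist.files").toSeq
      .flatMap(_.split(","))
      .map(_.trim)
      .filter(_.nonEmpty)
      .foreach(sc.addFile)
  }
}
```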





[GitHub] spark issue #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on driver ...

2017-08-31 Thread lgrcyanny
Github user lgrcyanny commented on the issue:

https://github.com/apache/spark/pull/19079
  
Hi @vanzin, I have submitted a PR based on the master branch; please review it, thank you.
https://github.com/apache/spark/pull/19102





[GitHub] spark pull request #19102: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

2017-08-31 Thread lgrcyanny
GitHub user lgrcyanny opened a pull request:

https://github.com/apache/spark/pull/19102

[SPARK-21859][CORE] Fix SparkFiles.get failed on driver in yarn-cluster and 
yarn-client mode

## What changes were proposed in this pull request?
When using SparkFiles.get to get a file on the driver in yarn-client or yarn-cluster mode, it reports a file-not-found exception.
The exception only happens on the driver; SparkFiles.get on executors works fine.
We can reproduce the bug as follows:
```scala
import java.io.File
import scala.io.Source
import org.apache.spark.{SparkConf, SparkContext, SparkFiles}

val conf = new SparkConf().setAppName("SparkFilesTest")
val sc = new SparkContext(conf)
def testOnDriver(fileName: String): Unit = {
  val file = new File(SparkFiles.get(fileName))
  if (!file.exists()) {
    println(s"$file not exist")
  } else {
    // print the file content on the driver
    val content = Source.fromFile(file).getLines().mkString("\n")
    println(s"File content: ${content}")
  }
}
// the output will be "file not exist"
```

```python
import os
from pyspark import SparkConf, SparkContext, SparkFiles

conf = SparkConf().setAppName("test files")
sc = SparkContext(appName="spark files test")

def test_on_driver(filename):
    file = SparkFiles.get(filename)
    print("file path: {}".format(file))
    if os.path.exists(file):
        with open(file) as f:
            lines = f.readlines()
            print(lines)
    else:
        print("file doesn't exist")
        run_command("ls .")  # run_command: user-defined helper (not shown) that runs a shell command
```
The output will be "file not exist".

## How was this patch tested?
Tested with integration tests and manual tests: submit the demo case in yarn-cluster and yarn-client mode, and verify the test result.

```
./bin/spark-submit --master yarn-cluster --files README.md --class "testing.SparkFilesTest" testing.jar
./bin/spark-submit --master yarn-client --files README.md --class "testing.SparkFilesTest" testing.jar
./bin/spark-submit --master yarn-cluster --files README.md test_get_files.py
./bin/spark-submit --master yarn-client --files README.md test_get_files.py
```


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/lgrcyanny/spark fix-spark-yarn-files-master

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/19102.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #19102


commit f2c06ffe4649589d2d175fc3dc61e00170c20a94
Author: lgrcyanny 
Date:   2017-05-07T12:51:55Z

[SPARK-21859][CORE] Fix SparkFiles.get failed on driver in yarn-cluster and 
yarn-client mode

When using SparkFiles.get to get a file on the driver in yarn-client or yarn-cluster mode, it reports a file-not-found exception.
The exception only happens on the driver; SparkFiles.get on executors works fine.
We can reproduce the bug as follows:
```scala
import java.io.File
import scala.io.Source
import org.apache.spark.{SparkConf, SparkContext, SparkFiles}

val conf = new SparkConf().setAppName("SparkFilesTest")
val sc = new SparkContext(conf)
def testOnDriver(fileName: String): Unit = {
  val file = new File(SparkFiles.get(fileName))
  if (!file.exists()) {
    println(s"$file not exist")
  } else {
    // print the file content on the driver
    val content = Source.fromFile(file).getLines().mkString("\n")
    println(s"File content: ${content}")
  }
}
// the output will be "file not exist"
```

```python
import os
from pyspark import SparkConf, SparkContext, SparkFiles

conf = SparkConf().setAppName("test files")
sc = SparkContext(appName="spark files test")

def test_on_driver(filename):
    file = SparkFiles.get(filename)
    print("file path: {}".format(file))
    if os.path.exists(file):
        with open(file) as f:
            lines = f.readlines()
            print(lines)
    else:
        print("file doesn't exist")
        run_command("ls .")  # run_command: user-defined helper (not shown) that runs a shell command
```
The output will be "file not exist".

Tested with integration tests and manual tests: submit the demo case in yarn-cluster and yarn-client mode, and verify the test result.

```
./bin/spark-submit --master yarn-cluster --files README.md --class "testing.SparkFilesTest" testing.jar
./bin/spark-submit --master yarn-client --files README.md --class "testing.SparkFilesTest" testing.jar
./bin/spark-submit --master yarn-cluster --files README.md test_get_files.py
./bin/spark-submit --master yarn-client --files README.md test_get_files.py
```





[GitHub] spark pull request #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

2017-08-30 Thread lgrcyanny
Github user lgrcyanny commented on a diff in the pull request:

https://github.com/apache/spark/pull/19079#discussion_r136056972
  
--- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
@@ -481,7 +481,7 @@ object SparkSubmit extends CommandLineUtils {
         sysProp = "spark.executor.memory"),
       OptionAssigner(args.totalExecutorCores, STANDALONE | MESOS, ALL_DEPLOY_MODES,
         sysProp = "spark.cores.max"),
-      OptionAssigner(args.files, LOCAL | STANDALONE | MESOS, ALL_DEPLOY_MODES,
+      OptionAssigner(args.files, ALL_CLUSTER_MGRS, ALL_DEPLOY_MODES,
--- End diff --

May I ask why the OptionAssigner for "spark.files" works for local, standalone and mesos, but not for yarn? Is there any doc that explains the design purpose, or is this really an issue?
```scala
OptionAssigner(args.files, LOCAL | STANDALONE | MESOS, ALL_DEPLOY_MODES, "spark.files")
```
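
For context, a minimal sketch of how SparkSubmit encodes cluster managers as bit flags; the constant names come from the diff above, while the numeric values and the wrapper object name are assumptions based on the Spark 2.x source:

```scala
object SparkSubmitFlagsSketch {
  // Cluster-manager flags combined with bitwise OR in OptionAssigner rules
  // (names from the diff; numeric values assumed from the Spark 2.x source).
  val YARN = 1
  val STANDALONE = 2
  val MESOS = 4
  val LOCAL = 8
  val ALL_CLUSTER_MGRS = YARN | STANDALONE | MESOS | LOCAL
}
// Replacing `LOCAL | STANDALONE | MESOS` with ALL_CLUSTER_MGRS simply adds YARN to the
// set of managers for which --files is mapped to "spark.files".
```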





[GitHub] spark pull request #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

2017-08-30 Thread lgrcyanny
Github user lgrcyanny commented on a diff in the pull request:

https://github.com/apache/spark/pull/19079#discussion_r136055854
  
--- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
@@ -481,7 +481,7 @@ object SparkSubmit extends CommandLineUtils {
         sysProp = "spark.executor.memory"),
       OptionAssigner(args.totalExecutorCores, STANDALONE | MESOS, ALL_DEPLOY_MODES,
         sysProp = "spark.cores.max"),
-      OptionAssigner(args.files, LOCAL | STANDALONE | MESOS, ALL_DEPLOY_MODES,
+      OptionAssigner(args.files, ALL_CLUSTER_MGRS, ALL_DEPLOY_MODES,
--- End diff --

Hi @jerryshao, about the remote-files approach to the yarn-client files problem, is there a JIRA that explains the design? We can wait for a version that resolves the problem.

I think my fix just solves the problem in a simple way; do you have any other idea to solve it more elegantly?





[GitHub] spark pull request #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

2017-08-29 Thread lgrcyanny
Github user lgrcyanny commented on a diff in the pull request:

https://github.com/apache/spark/pull/19079#discussion_r135968835
  
--- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
@@ -481,7 +481,7 @@ object SparkSubmit extends CommandLineUtils {
         sysProp = "spark.executor.memory"),
       OptionAssigner(args.totalExecutorCores, STANDALONE | MESOS, ALL_DEPLOY_MODES,
         sysProp = "spark.cores.max"),
-      OptionAssigner(args.files, LOCAL | STANDALONE | MESOS, ALL_DEPLOY_MODES,
+      OptionAssigner(args.files, ALL_CLUSTER_MGRS, ALL_DEPLOY_MODES,
--- End diff --

I have met some users who complained about the weird behavior of SparkFiles.get in yarn-client and yarn-cluster mode. SparkFiles.get is a very easy way for users to get a file path, so why not keep the same behavior in yarn-cluster and yarn-client mode?
Meanwhile, spark.yarn.dist.files is not very easy for users to work with, since the files must be uploaded to HDFS in advance. To make Spark on YARN more usable, using SparkFiles.get is better, as illustrated below.
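
To illustrate the comparison being made here (the file names, paths, and submit commands are only examples, not taken from the PR):

```scala
import org.apache.spark.SparkFiles

// With --files, the user passes a local file at submit time, e.g.
//   spark-submit --master yarn --deploy-mode client --files data.txt app.jar
// and looks it up by name; the PR argues this should also work on the driver.
val path = SparkFiles.get("data.txt")

// The alternative discussed above is configuring spark.yarn.dist.files directly,
// e.g. with a location the cluster can already read:
//   spark-submit --conf spark.yarn.dist.files=hdfs:///user/someone/data.txt ...
```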





[GitHub] spark pull request #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

2017-08-29 Thread lgrcyanny
Github user lgrcyanny commented on a diff in the pull request:

https://github.com/apache/spark/pull/19079#discussion_r135943288
  
--- Diff: core/src/main/scala/org/apache/spark/SparkEnv.scala ---
@@ -393,7 +393,7 @@ object SparkEnv extends Logging {
     // Add a reference to tmp dir created by driver, we will delete this tmp dir when stop() is
     // called, and we only need to do it for driver. Because driver may run as a service, and if we
     // don't delete this tmp dir when sc is stopped, then will create too many tmp dirs.
-    if (isDriver) {
+    if (isDriver && conf.getOption("spark.submit.deployMode").getOrElse("client") == "client") {
--- End diff --

Originally, my version was
```scala
conf.get("spark.submit.deployMode", "client") == "client"
```
Then I referred to the SparkContext#deployMode function, which uses
```scala
conf.getOption("spark.submit.deployMode").getOrElse("client")
```
I just want to keep the same style as SparkContext. Which one is better?
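
For reference, a minimal sketch of how the two forms behave in the check discussed here, assuming the `isDriver` flag and the `conf` (a SparkConf) from the surrounding SparkEnv code:

```scala
// Both reads default to "client" when spark.submit.deployMode is not set.
val deployMode = conf.getOption("spark.submit.deployMode").getOrElse("client")
// equivalent: val deployMode = conf.get("spark.submit.deployMode", "client")

// Per the diff above, register the driver's tmp dir for deletion only in client mode.
if (isDriver && deployMode == "client") {
  // cleanup registration would go here (sketch only)
}
```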





[GitHub] spark pull request #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

2017-08-29 Thread lgrcyanny
Github user lgrcyanny commented on a diff in the pull request:

https://github.com/apache/spark/pull/19079#discussion_r135942521
  
--- Diff: core/src/main/scala/org/apache/spark/SparkEnv.scala ---
@@ -393,7 +393,7 @@ object SparkEnv extends Logging {
     // Add a reference to tmp dir created by driver, we will delete this tmp dir when stop() is
     // called, and we only need to do it for driver. Because driver may run as a service, and if we
     // don't delete this tmp dir when sc is stopped, then will create too many tmp dirs.
-    if (isDriver) {
+    if (isDriver && conf.getOption("spark.submit.deployMode").getOrElse("client") == "client") {
--- End diff --

Ok, thanks, I will change it.





[GitHub] spark pull request #19079: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

2017-08-29 Thread lgrcyanny
GitHub user lgrcyanny opened a pull request:

https://github.com/apache/spark/pull/19079

[SPARK-21859][CORE] Fix SparkFiles.get failed on driver in yarn-cluster and 
yarn-client mode

## What changes were proposed in this pull request?
When using SparkFiles.get to get a file on the driver in yarn-client or yarn-cluster mode, it reports a file-not-found exception.
The exception only happens on the driver; SparkFiles.get on executors works fine.
We can reproduce the bug as follows:
```scala
import java.io.File
import scala.io.Source
import org.apache.spark.{SparkConf, SparkContext, SparkFiles}

val conf = new SparkConf().setAppName("SparkFilesTest")
val sc = new SparkContext(conf)
def testOnDriver(fileName: String): Unit = {
  val file = new File(SparkFiles.get(fileName))
  if (!file.exists()) {
    println(s"$file not exist")
  } else {
    // print the file content on the driver
    val content = Source.fromFile(file).getLines().mkString("\n")
    println(s"File content: ${content}")
  }
}
// the output will be "file not exist"
```

```python
import os
from pyspark import SparkConf, SparkContext, SparkFiles

conf = SparkConf().setAppName("test files")
sc = SparkContext(appName="spark files test")

def test_on_driver(filename):
    file = SparkFiles.get(filename)
    print("file path: {}".format(file))
    if os.path.exists(file):
        with open(file) as f:
            lines = f.readlines()
            print(lines)
    else:
        print("file doesn't exist")
        run_command("ls .")  # run_command: user-defined helper (not shown) that runs a shell command
```
The output will be "file not exist".

## How was this patch tested?

Tested with integration tests and manual tests: submit the demo case in yarn-cluster and yarn-client mode, and verify the test result. The testing commands are:
```
./bin/spark-submit --master yarn-cluster --files README.md --class "testing.SparkFilesTest" testing.jar
./bin/spark-submit --master yarn-client --files README.md --class "testing.SparkFilesTest" testing.jar
./bin/spark-submit --master yarn-cluster --files README.md test_get_files.py
./bin/spark-submit --master yarn-client --files README.md test_get_files.py
```


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/lgrcyanny/spark fix-yarn-files-problem

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/19079.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #19079


commit 3f0e4a88bdb7156b5db7cfb56cd079d4b0de3a5b
Author: lgrcyanny 
Date:   2017-05-07T12:51:55Z

[SPARK-21859][CORE] Fix SparkFiles.get failed on driver in yarn-cluster and 
yarn-client mode

When using SparkFiles.get to get a file on the driver in yarn-client or yarn-cluster mode, it reports a file-not-found exception.
The exception only happens on the driver; SparkFiles.get on executors works fine.
We can reproduce the bug as follows:
```scala
import java.io.File
import scala.io.Source
import org.apache.spark.{SparkConf, SparkContext, SparkFiles}

val conf = new SparkConf().setAppName("SparkFilesTest")
val sc = new SparkContext(conf)
def testOnDriver(fileName: String): Unit = {
  val file = new File(SparkFiles.get(fileName))
  if (!file.exists()) {
    println(s"$file not exist")
  } else {
    // print the file content on the driver
    val content = Source.fromFile(file).getLines().mkString("\n")
    println(s"File content: ${content}")
  }
}
// the output will be "file not exist"
```

```python
import os
from pyspark import SparkConf, SparkContext, SparkFiles

conf = SparkConf().setAppName("test files")
sc = SparkContext(appName="spark files test")

def test_on_driver(filename):
    file = SparkFiles.get(filename)
    print("file path: {}".format(file))
    if os.path.exists(file):
        with open(file) as f:
            lines = f.readlines()
            print(lines)
    else:
        print("file doesn't exist")
        run_command("ls .")  # run_command: user-defined helper (not shown) that runs a shell command
```
The output will be "file not exist".

Tested with integration tests and manual tests: submit the demo case in yarn-cluster and yarn-client mode, and verify the test result.

```
./bin/spark-submit --master yarn-cluster --files README.md --class "testing.SparkFilesTest" testing.jar
./bin/spark-submit --master yarn-client --files README.md --class "testing.SparkFilesTest" testing.jar
./bin/spark-submit --master yarn-cluster --files README.md test_get_files.py
./bin/spark-submit --master yarn-client --files README.md test_get_files.py
```

Change-Id: I22034f99f571a451b862c1806b7f9350c6133c95





[GitHub] spark pull request #19076: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

2017-08-29 Thread lgrcyanny
Github user lgrcyanny closed the pull request at:

https://github.com/apache/spark/pull/19076





[GitHub] spark pull request #19076: [SPARK-21859][CORE] Fix SparkFiles.get failed on ...

2017-08-29 Thread lgrcyanny
GitHub user lgrcyanny opened a pull request:

https://github.com/apache/spark/pull/19076

[SPARK-21859][CORE] Fix SparkFiles.get failed on driver in yarn-cluster and 
yarn-client mode

## What changes were proposed in this pull request?
When using SparkFiles.get to get a file on the driver in yarn-client or yarn-cluster mode, it reports a file-not-found exception.
The exception only happens on the driver; getting files on executors is OK.

We can reproduce the bug as follows:
```scala
import java.io.File
import scala.io.Source
import org.apache.spark.SparkFiles

// `logging` here is the user's own logger
def testOnDriver(fileName: String): Unit = {
  val file = new File(SparkFiles.get(fileName))
  if (!file.exists()) {
    logging.info(s"$file not exist")
  } else {
    // print the file content on the driver
    val content = Source.fromFile(file).getLines().mkString("\n")
    logging.info(s"File content: ${content}")
  }
}
// the output will be "file not exist"
```

```python
import os
from pyspark import SparkConf, SparkContext, SparkFiles

conf = SparkConf().setAppName("test files")
sc = SparkContext(appName="spark files test")

def test_on_driver(filename):
    file = SparkFiles.get(filename)
    print("file path: {}".format(file))
    if os.path.exists(file):
        with open(file) as f:
            lines = f.readlines()
            print(lines)
    else:
        print("file doesn't exist")
        run_command("ls .")  # run_command: user-defined helper (not shown) that runs a shell command
```
## How was this patch tested?
Tested with integration tests and manual tests: submit the demo case in yarn-cluster and yarn-client mode, and verify the test result. The integration test commands are as follows:

```shell
./bin/spark-submit --master yarn-cluster --files README.md --class "testing.SparkFilesTest" testing.jar
./bin/spark-submit --master yarn-client --files README.md --class "testing.SparkFilesTest" testing.jar
./bin/spark-submit --master yarn-cluster --files README.md test_get_files.py
./bin/spark-submit --master yarn-client --files README.md test_get_files.py
```

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/lgrcyanny/spark fix-yarn-files-problem

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/19076.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #19076


commit 5af0b9f5d0b226891410b00ab75327b61b96dcdd
Author: lgrcyanny 
Date:   2017-05-07T12:51:55Z

[SPARK-21859][CORE] Fix SparkFiles.get failed on driver in yarn-cluster and 
yarn-client mode

When using SparkFiles.get to get a file on the driver in yarn-client or yarn-cluster mode, it reports a file-not-found exception.
The exception only happens on the driver; getting files on executors is OK.

We can reproduce the bug as follows:
```scala
import java.io.File
import scala.io.Source
import org.apache.spark.SparkFiles

// `logging` here is the user's own logger
def testOnDriver(fileName: String): Unit = {
  val file = new File(SparkFiles.get(fileName))
  if (!file.exists()) {
    logging.info(s"$file not exist")
  } else {
    // print the file content on the driver
    val content = Source.fromFile(file).getLines().mkString("\n")
    logging.info(s"File content: ${content}")
  }
}
// the output will be "file not exist"
```

```python
import os
from pyspark import SparkConf, SparkContext, SparkFiles

conf = SparkConf().setAppName("test files")
sc = SparkContext(appName="spark files test")

def test_on_driver(filename):
    file = SparkFiles.get(filename)
    print("file path: {}".format(file))
    if os.path.exists(file):
        with open(file) as f:
            lines = f.readlines()
            print(lines)
    else:
        print("file doesn't exist")
        run_command("ls .")  # run_command: user-defined helper (not shown) that runs a shell command
```

Submit the demo case in yarn-cluster and yarn-client mode, and verify the test result.

```
./bin/spark-submit --master yarn-cluster --files README.md --class "testing.SparkFilesTest" testing.jar
./bin/spark-submit --master yarn-client --files README.md --class "testing.SparkFilesTest" testing.jar
./bin/spark-submit --master yarn-cluster --files README.md test_get_files.py
./bin/spark-submit --master yarn-client --files README.md test_get_files.py
```

Change-Id: Ice7d43fc5ac18fbc229911533d06063ea1f17c5b



