GitHub user tgravescs opened a pull request:
https://github.com/apache/spark/pull/128
[SPARK-1198] Allow pipes tasks to run in different sub-directories
This works as is on Linux/Mac/etc., but doesn't cover Windows, since it
creates the symlinks with ln -sf. Putting this up for comments on that
point. Should we perhaps create some classes for doing shell commands,
Linux vs. Windows, or is there some other way we want to handle this? I
assume we are still supporting JDK 1.6?
Also, should I update the Java API for pipes to expose this parameter?
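As a rough sketch of the approach described above (not the actual patch; the
method names createSymlink and setupTaskWorkingDir are made up for
illustration), the per-task directory could be populated by shelling out to
ln -sf, which keeps things JDK 1.6 compatible since
java.nio.file.Files.createSymbolicLink only exists from JDK 7 on:

    import java.io.File

    // Create a symlink via `ln -sf` so this also works on JDK 1.6,
    // which has no java.nio.file.Files.createSymbolicLink.
    def createSymlink(src: File, dst: File): Unit = {
      val proc = new ProcessBuilder(
        "ln", "-sf", src.getAbsolutePath, dst.getAbsolutePath).start()
      if (proc.waitFor() != 0) {
        throw new java.io.IOException("Failed to symlink " + src + " to " + dst)
      }
    }

    // Link everything from the current (distributed) directory into a
    // task-specific sub-directory before launching the piped command there.
    def setupTaskWorkingDir(currentDir: File, taskWorkingDir: File): Unit = {
      taskWorkingDir.mkdirs()
      Option(currentDir.listFiles()).getOrElse(Array.empty[File]).foreach { f =>
        createSymlink(f, new File(taskWorkingDir, f.getName))
      }
    }

On the user side this would presumably surface as an extra parameter on
RDD.pipe (hypothetically something like rdd.pipe(command, env,
separateWorkingDir = true), name to be decided), which is what the Java API
question above is about.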
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/tgravescs/spark SPARK1198
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/128.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #128
----
commit 1ab49ca90b7cae82efa26e018d9d285c948bf25c
Author: Thomas Graves <[email protected]>
Date: 2014-03-12T14:11:46Z
Add support for running pipe tasks in separate directories
commit 6b783bdb5e09b7c96cbb76111876fbb6c9ca9a6f
Author: Thomas Graves <[email protected]>
Date: 2014-03-12T14:47:13Z
style fixes
----