Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2795#issuecomment-59021297
Jenkins, test this please.
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2784#issuecomment-59013875
Hey Aaron,
I increased the interval because it is just "noise" anyway. We don't intend
to use Akka's failure detector because we hav
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/2520#discussion_r18816759
--- Diff: project/SparkBuild.scala ---
@@ -170,6 +178,24 @@ object SparkBuild extends PomBuild {
}
+object YARNCommon {
+ lazy
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2782#issuecomment-59008076
Hey Patrick, you are right about that. We can make TaskContext an interface
if we only allow TaskContextHelper.get() instead of TaskContext.get(). And then
maybe I
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2782#issuecomment-5896
Jenkins, retest this please.
---
GitHub user ScrapCodes opened a pull request:
https://github.com/apache/spark/pull/2782
SPARK-3874, Provide stable TaskContext API
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/ScrapCodes/spark-1 SPARK-3874/stable-tc
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2673#issuecomment-58851300
@pwendell I don't see an easy way with the maven-shade-plugin either. Do you?
One way is to include a fake dependency and then ask it to shade that across
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2673#issuecomment-58631368
@pwendell I tried the maven-shade-plugin as a sort of effective-pom
generator, but that does not happen unless we have dependencies apart from the
project's i
GitHub user ScrapCodes opened a pull request:
https://github.com/apache/spark/pull/2750
[SPARK-2924] Required by scala 2.11, only one function/ctor amongst overridden alternatives can have default argument.
You can merge this pull request into a Git repository by running
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/1685#discussion_r18629324
--- Diff:
streaming/src/test/scala/org/apache/spark/streaming/InputStreamsSuite.scala ---
@@ -144,59 +142,6 @@ class InputStreamsSuite extends
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/2520#discussion_r18574479
--- Diff: project/SparkBuild.scala ---
@@ -170,6 +178,24 @@ object SparkBuild extends PomBuild {
}
+object YARNCommon {
+ lazy
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2520#issuecomment-58328853
I did not check up on the maven part; it looks okay. Apart from the two
comments above, LGTM.
---
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/2520#discussion_r18570223
--- Diff: project/SparkBuild.scala ---
@@ -170,6 +178,24 @@ object SparkBuild extends PomBuild {
}
+object YARNCommon {
+ lazy
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/2520#discussion_r18570229
--- Diff: project/SparkBuild.scala ---
@@ -170,6 +178,24 @@ object SparkBuild extends PomBuild {
}
+object YARNCommon {
+ lazy
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2673#issuecomment-58138496
Well, sorry for the confusion. For the published bits used now, one can
easily exclude and include another dependency as long as it is only a few
deps. But once we
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2673#issuecomment-58005087
I will have to add a similar thing for
http://maven.apache.org/plugins/maven-deploy-plugin/deploy-file-mojo.html. But
I am not sure about the repository url field
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2673#issuecomment-58004659
@pwendell Take a look whenever you get time. It would be good if we could
publish https://github.com/ScrapCodes/effective-pom-plugin.
---
GitHub user ScrapCodes opened a pull request:
https://github.com/apache/spark/pull/2673
Build changes to publish effective pom.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/ScrapCodes/spark-1 build-changes-effective-pom
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/1685#issuecomment-57979326
LGTM, I have tested it locally by running the test suites (only the relevant
ones). @pwendell Can you trigger Jenkins here? It should be okay to merge.
---
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/1685#issuecomment-57976544
Jenkins, retest this please.
---
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2520#issuecomment-5790
Hey, I will check this patch very soon. I have the impression that these
changes to SparkBuild are not needed. Even if they are, then something
needs fixing in
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2615#issuecomment-57597737
Ahh wait, I would still need to alter the maven-install-plugin, because there
has to be some way to tell the install plugin that it has to install at a
location which has _2.10
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2615#issuecomment-57597447
I think you meant to try `replaceFile` when you said maven-shade-plugin. Let
me try that as well.
---
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2615#issuecomment-57593822
So now, to try this patch, just `mvn install` [this
plugin](https://github.com/ScrapCodes/scala-install-plugin). After that, look at
the published poms.
---
GitHub user ScrapCodes reopened a pull request:
https://github.com/apache/spark/pull/2357
[SPARK-3437][BUILD] Support crossbuilding in maven. With new
scala-install-plugin.
Since this plugin is not deployed anywhere, anyone trying this patch
has to publish it locally by
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2615#issuecomment-57590353
And this https://github.com/ScrapCodes/scala-install-plugin plugin takes
care of publishing correct poms too.
---
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2615#issuecomment-57590286
Hey Patrick, thanks for looking at this. I did not say it is not possible.
I just said the best (easiest) way I could come up with was to modify the
maven-install-plugin
GitHub user ScrapCodes opened a pull request:
https://github.com/apache/spark/pull/2615
Adjust build system and tests to work with scala 2.11+ repl port.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/ScrapCodes/spark-1 scala
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/771#discussion_r18204443
--- Diff:
core/src/main/scala/org/apache/spark/deploy/master/FileSystemPersistenceEngine.scala
---
@@ -79,11 +80,9 @@ private[spark] class
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/771#issuecomment-57271768
Because they were private[spark]. It is very inconvenient for someone to
write their own recovery mode with all that private[spark]. Plus, this felt like
developer-facing
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2158#issuecomment-57126791
@marmbrus Mind taking a look ?
---
Github user ScrapCodes closed the pull request at:
https://github.com/apache/spark/pull/2357
---
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2357#issuecomment-57126589
Turns out this is not very useful either.
---
Github user ScrapCodes closed the pull request at:
https://github.com/apache/spark/pull/2318
---
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/2425#discussion_r18083933
--- Diff: core/src/test/scala/org/apache/spark/CacheManagerSuite.scala ---
@@ -94,7 +94,7 @@ class CacheManagerSuite extends FunSuite with
BeforeAndAfter
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2294#issuecomment-56118218
Can you rebase onto the tip of master and add tests in the version 1.2 section:
https://github.com/apache/spark/blob/master/project/MimaExcludes.scala#L37
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2403#issuecomment-56015430
I guess we will have to exclude those in MimaExcludes.scala. If this
happens too often, then *maybe* we can ignore toString for all classes where
possible.
---
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/2425#discussion_r17713594
--- Diff: core/src/main/java/org/apache/spark/TaskContext.java ---
@@ -0,0 +1,234 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/2425#discussion_r17713233
--- Diff: core/src/main/java/org/apache/spark/TaskContext.java ---
@@ -0,0 +1,234 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/2425#discussion_r17713126
--- Diff: core/src/main/java/org/apache/spark/TaskContext.java ---
@@ -0,0 +1,234 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/2425#discussion_r17712697
--- Diff: core/src/main/java/org/apache/spark/TaskContext.java ---
@@ -0,0 +1,234 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under
Github user ScrapCodes closed the pull request at:
https://github.com/apache/spark/pull/1688
---
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2426#issuecomment-55869795
There is nothing like spark-env.cmd in the codebase. Plus, Emacs is my editor
of choice too, and I add those excludes to my global gitignore simply because I
cannot go and
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2426#issuecomment-55868873
You can have a global gitignore:
https://help.github.com/articles/ignoring-files. I am not sure how many editors
and such we are going to support. Mind closing this PR
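The global-gitignore suggestion above can be sketched in a few commands. This is a minimal example: `core.excludesfile` is the real git setting, but the file name `~/.gitignore_global` is an arbitrary convention, and the Emacs patterns are just illustrative excludes of the kind mentioned in the thread.

```shell
# Point git at a machine-wide excludes file (any path works).
git config --global core.excludesfile "$HOME/.gitignore_global"

# Put editor-specific patterns there instead of each project's .gitignore.
# '*~' = Emacs backup files, '\#*\#' = auto-save files, '.#*' = lock files.
printf '%s\n' '*~' '\#*\#' '.#*' >> "$HOME/.gitignore_global"

# Verify which file git will consult:
git config --global core.excludesfile
```

Patterns in this file apply to every repository for the current user, which is why per-editor noise does not need to live in the project's own `.gitignore`.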
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2425#issuecomment-55867575
@rxin One side question: the Java8ApiSuite(s) don't compile; it looks like we
have been overlooking them for a while. Maybe we could just remove them?
---
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/2425#discussion_r17653450
--- Diff: core/src/main/java/org/apache/spark/TaskContext.java ---
@@ -0,0 +1,176 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under
GitHub user ScrapCodes opened a pull request:
https://github.com/apache/spark/pull/2425
[SPARK-3543] Write TaskContext in Java and expose it through a static
accessor.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/ScrapCodes
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2404#issuecomment-55693512
So I guess it came from the commit,
https://github.com/apache/spark/commit/da33acb8b681eca5e787d546fe922af76a151398.
But this seems to be present in your branch. So
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2404#issuecomment-55693199
It happens because, in the git diff, the script compares the PR branch with
master; if the PR is not rebased to the tip of master, false reporting will
happen.
---
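The stale-branch situation described above can be reproduced in a scratch repository. All names here are hypothetical stand-ins (a real PR would rebase onto `origin/master` and force-push the branch):

```shell
set -e
# Hypothetical scratch repo standing in for a fork.
tmp=$(mktemp -d); cd "$tmp"
git init -q repo; cd repo
git symbolic-ref HEAD refs/heads/master   # fix the branch name regardless of git defaults
git config user.email dev@example.com; git config user.name dev
echo base > base.txt; git add base.txt; git commit -qm 'base'

# The PR branch adds one change...
git checkout -qb pr-branch
echo fix > fix.txt; git add fix.txt; git commit -qm 'pr change'

# ...meanwhile master moves on, so a diff against stale master misleads.
git checkout -q master
echo more > more.txt; git add more.txt; git commit -qm 'master moved on'

# Rebasing replays the PR commit on top of the current master tip,
# so the branch's diff contains only the PR's own change again.
git checkout -q pr-branch
git rebase -q master
git log --oneline master..pr-branch   # only the PR's own commit
```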
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2285#issuecomment-55687221
Yes, it should fix this problem.
---
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2357#issuecomment-55571483
Hi @srowen, like I said, we can run a plugin before maven install, no
question about that. But maven install gets a copy of STATE (somehow via
Guice). Altering
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/449#discussion_r17530603
--- Diff: bin/run-example ---
@@ -21,18 +21,25 @@ SCALA_VERSION=2.10
FWDIR="$(cd `dirname $0`/..; pwd)"
export SPARK_HO
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2387#issuecomment-55563583
This is also a duplicate of #2301.
---
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/2386#discussion_r17530450
--- Diff: bin/spark-shell ---
@@ -29,7 +29,7 @@ esac
set -o posix
## Global script variables
-FWDIR="$(cd "`dirname "
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/2386#discussion_r17530408
--- Diff: bin/spark-shell ---
@@ -29,7 +29,7 @@ esac
set -o posix
## Global script variables
-FWDIR="$(cd "`dirname "
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2358#issuecomment-55560989
Yeah, thanks @ash211, I will get rid of that commit.
---
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2357#issuecomment-9135
I am anyway trying more options so as not to need to modify the maven-install-plugin.
---
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2357#issuecomment-6738
You are right about that, but we are forking it because we want a modified
install plugin. If it were possible to run a plugin just before install, and the
install plugin
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2358#issuecomment-6582
Hey @pwendell, I can remove the commit once you confirm it works.
---
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2357#issuecomment-55362622
It will not conflict with that; it works the same way as the maven-install-plugin.
(Accidentally deleted my previous post. :/)
---
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2357#issuecomment-55362572
It will not conflict with that. One can simply replace the maven-install-plugin
with this one, and it works the exact same way.
---
GitHub user ScrapCodes opened a pull request:
https://github.com/apache/spark/pull/2358
[SPARK-2182] Scalastyle rule blocking (non keyboard typeable) unicode operators.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2318#issuecomment-55244394
#2357 does this with the help of a custom maven plugin.
---
GitHub user ScrapCodes opened a pull request:
https://github.com/apache/spark/pull/2357
[SPARK-3437][BUILD] Support crossbuilding in maven. With new
scala-install-plugin.
Since this plugin is not deployed anywhere, for anyone trying this patch
has to publish it locally by cloning
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2285#issuecomment-55077314
Jenkins, retest this please.
---
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2318#issuecomment-55075420
Hey all,
Please correct me if I am wrong; my conclusion is that this also will not
help, because the published pom is not appropriate (since it does not have dependency
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2329#issuecomment-55074587
Jenkins, retest this please.
---
GitHub user ScrapCodes opened a pull request:
https://github.com/apache/spark/pull/2331
Minor - Fix trivial compilation warnings.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/ScrapCodes/spark-1 compilation-warn
Alternatively
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2329#issuecomment-54933430
test this please.
---
GitHub user ScrapCodes opened a pull request:
https://github.com/apache/spark/pull/2329
[SPARK-3452] Maven build should skip publishing artifacts people shouldn't depend on
Publish local in maven term is `install`
and publish otherwise is `d
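The truncated description above is drawing the standard Maven distinction between local and remote publishing. A brief sketch of the two commands it refers to (these need an actual Maven project to run; the skip property shown belongs to the maven-deploy-plugin):

```shell
mvn install   # "publish local": build and copy artifacts into ~/.m2/repository
mvn deploy    # "publish": additionally upload artifacts to the remote
              # repository configured in <distributionManagement>

# A module can opt out of remote publication; the maven-deploy-plugin
# honours the maven.deploy.skip property:
mvn deploy -Dmaven.deploy.skip=true
```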
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/2318#discussion_r17282239
--- Diff: assembly/pom.xml ---
@@ -26,7 +26,7 @@
org.apache.spark
- spark-assembly_2.10
+ spark-assembly
--- End diff
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2229#issuecomment-54800273
This patch is likely the most prone to merge conflicts; it would be good to
merge it soon.
---
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2285#issuecomment-54797397
I will let @JoshRosen take a look too, since he reported this issue as well.
---
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2229#issuecomment-54794153
test this please.
---
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2318#issuecomment-54792890
@pwendell If this is okay to do, then there is no need for a special maven
plugin for this task.
---
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2317#issuecomment-54792700
Yay, merged it!
---
GitHub user ScrapCodes opened a pull request:
https://github.com/apache/spark/pull/2318
[SPARK-3437] Support crossbuilding in maven.
This patch gets rid of _2.10 in the artifact ids and adapts the install
plugin to include it instead.
This will make it easy to cross-build for
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2285#issuecomment-54783013
@rxin The test failed as expected. I will fix it once you confirm the code
change.
---
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2317#issuecomment-54781961
@JoshRosen You will have to merge (since I have never merged a patch
before). Jenkins, retest this please.
---
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2317#issuecomment-54777992
@JoshRosen Please take a look? I will take care of removing the false
positives in my other PR once this is merged.
---
GitHub user ScrapCodes opened a pull request:
https://github.com/apache/spark/pull/2317
[HOTFIX] for hotfixes, a left-over version change. It should make mima happy.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/ScrapCodes
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2285#issuecomment-54777281
test this please.
---
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2311#issuecomment-54777212
You will have to rebase your patch for tests to pass.
---
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2055#issuecomment-54776785
@pwendell This patch will lead to logging of messages like
```
14/09/08 10:23:39.639 WARN AppClient$ClientActor: Received unexpected actor
system event
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/2285#discussion_r17162225
--- Diff: dev/mima ---
@@ -25,11 +25,15 @@ FWDIR="$(cd `dirname $0`/..; pwd)"
cd "$FWDIR"
echo -e "q\n" | sbt
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/2285#discussion_r17161635
--- Diff: project/SparkBuild.scala ---
@@ -187,7 +187,7 @@ object OldDeps {
Some("org.apache.spark" % fullId % "1.0.0")
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/2285#discussion_r17161831
--- Diff: dev/mima ---
@@ -25,12 +25,16 @@ FWDIR="$(cd `dirname $0`/..; pwd)"
cd "$FWDIR"
echo -e "q\n" | sbt
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/2285#discussion_r17161813
--- Diff: dev/mima ---
@@ -25,12 +25,16 @@ FWDIR="$(cd `dirname $0`/..; pwd)"
cd "$FWDIR"
echo -e "q\n" | sbt
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2194#issuecomment-54598665
@rxin There is a reason, and a (workaround-type) fix for this in #2285.
---
GitHub user ScrapCodes opened a pull request:
https://github.com/apache/spark/pull/2285
Fix for false positives reported by mima on PR 2194.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/ScrapCodes/spark-1 mima-fix
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/2229#discussion_r17158127
--- Diff: sbt/sbt-launch-lib.bash ---
@@ -180,7 +180,7 @@ run() {
${SBT_OPTS:-$default_sbt_opts} \
$(get_mem_opts $sbt_mem
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2194#issuecomment-54465038
I am looking at this. The Mima check should have excluded those methods.
---
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/2229#discussion_r17097618
--- Diff: sbt/sbt-launch-lib.bash ---
@@ -190,5 +190,5 @@ runAlternateBoot() {
local bootpropsfile="$1"
shift
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/2229#discussion_r17097578
--- Diff: bin/compute-classpath.sh ---
@@ -63,7 +63,7 @@ else
assembly_folder="$ASSEMBLY_DIR"
fi
-num_jars=$(ls "$
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2229#issuecomment-54272480
retest this please
---
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/2204#discussion_r17032605
--- Diff:
yarn/common/src/test/scala/org/apache/spark/deploy/yarn/ClientBaseSuite.scala
---
@@ -232,6 +233,15 @@ class ClientBaseSuite extends FunSuite
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2229#issuecomment-54126820
Tested it by having the spark directory named "Apache Spark"; all scripts
seemed to work.
---
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/2234#issuecomment-54125511
@WangTaoTheTonic Yeah. Mind closing this?
---
Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/1905#discussion_r16975033
--- Diff: dev/change-version-to-2.11.sh ---
@@ -0,0 +1,20 @@
+#!/usr/bin/env bash
+
+#
+# Licensed to the Apache Software Foundation (ASF
Github user ScrapCodes commented on the pull request:
https://github.com/apache/spark/pull/845#issuecomment-54099382
@nikhils05 Can you close this PR?
---