Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1685#issuecomment-53904400
I guess someone needs to publish the shaded jars first?
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1905#issuecomment-52544191
@ScrapCodes, some questions:
- How do you plan to resolve the protobuf version requirement/conflict
(akka 2.3 specifically requires protobuf 2.5, hadoop1
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1704#issuecomment-52127666
Jenkins, test this please.
---
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1704#issuecomment-51942842
Had conflict with recent merge of 9038d9. Addressed other code comments as
well.
---
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1704#issuecomment-51706912
The latest patches merge cleanly now
---
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1704#issuecomment-51706733
Had conflict with recent commit 74d6f622
---
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1704#issuecomment-51700285
Added, please restart the test.
---
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1704#issuecomment-51570698
@ScrapCodes OK, I can do the change and re-submit. But just to be clear:
that will still break binary compatibility.
---
Github user avati commented on a diff in the pull request:
https://github.com/apache/spark/pull/1704#discussion_r15980530
--- Diff: core/src/main/scala/org/apache/spark/ui/JettyUtils.scala ---
@@ -85,15 +85,28 @@ private[spark] object JettyUtils extends Logging {
path
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1704#issuecomment-51570315
@ScrapCodes - we have to break ABI compatibility (of some API) no matter
how we fix this. So isn't it better to eliminate all default params just for
better hy
Github user avati commented on a diff in the pull request:
https://github.com/apache/spark/pull/1704#discussion_r15980421
--- Diff:
streaming/src/main/scala/org/apache/spark/streaming/StreamingContext.scala ---
@@ -85,22 +85,46 @@ class StreamingContext private[streaming
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1813#issuecomment-51549053
The shade plug-in works in many cases, but not in all. From what I understand,
if the dependency is direct, the shade plugin works fine. In the case of
protobuf it was indirect.
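For context, dependency relocation with the shade plugin is typically configured along these lines (a hypothetical sketch, not the actual Spark build change; package names are illustrative):

```
<!-- Hypothetical sketch: relocating protobuf classes with maven-shade-plugin.
     Relocation rewrites bytecode references inside the shaded jar, which is
     why it only helps when the conflicting classes actually end up in that
     jar; a transitive (indirect) dependency resolved at runtime through
     another artifact is not rewritten, which is the limitation noted above. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <relocations>
      <relocation>
        <pattern>com.google.protobuf</pattern>
        <shadedPattern>org.spark_project.protobuf</shadedPattern>
      </relocation>
    </relocations>
  </configuration>
</plugin>
```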
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1685#issuecomment-51298746
The latest updated patches depend on published packages built by
https://github.com/avati/spark-shaded scripts
---
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1702#issuecomment-51261806
ping.
Looks like a retest is needed here?
---
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1704#issuecomment-51261696
If the API is not "too critical" it might be worth pushing the change and
being done with it, by breaking the API. Any other option will have a burden
of some sort to be ca
Github user avati commented on a diff in the pull request:
https://github.com/apache/spark/pull/996#discussion_r15774561
--- Diff: assembly/pom.xml ---
@@ -26,7 +26,7 @@
org.apache.spark
- spark-assembly_2.10
+ spark-assembly_${scala.binary.version
Github user avati commented on a diff in the pull request:
https://github.com/apache/spark/pull/996#discussion_r15770017
--- Diff: assembly/pom.xml ---
@@ -26,7 +26,7 @@
org.apache.spark
- spark-assembly_2.10
+ spark-assembly_${scala.binary.version
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1702#issuecomment-51092475
It is not clear how the failure is related to this patch.
---
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1702#issuecomment-51012850
PING
Any update on this PR?
---
GitHub user avati opened a pull request:
https://github.com/apache/spark/pull/1749
[SPARK-1997] mllib - upgrade to breeze 0.8.1
Signed-off-by: Anand Avati
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/avati/spark SPARK-1997
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1704#issuecomment-50953179
On Fri, Aug 1, 2014 at 6:52 PM, Patrick Wendell
wrote:
> Is this a comprehensive list of cases that need to be addressed? One issue
> is that thi
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1709#issuecomment-50941761
The failures seem odd and unrelated to the patch itself.
---
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1709#issuecomment-50926464
@marmbrus - this latest patch fixes the Scala 2.11 compilation breakage
---
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1709#issuecomment-50924789
Yeah, I was thinking:
def notNull: AttributeReference = a.withNullability(false)
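The suggestion above can be sketched as a minimal, self-contained analogue of the catalyst-style DSL (the types here are simplified, hypothetical stand-ins, not the real catalyst classes):

```scala
// Minimal sketch of the idea: an implicit wrapper exposing `notNull`
// via `withNullability(false)`. `AttributeReference` is a simplified
// stand-in for the real catalyst class of the same name.
case class AttributeReference(name: String, nullable: Boolean) {
  def withNullability(n: Boolean): AttributeReference = copy(nullable = n)
}

object DslSketch {
  implicit class RichAttribute(val a: AttributeReference) extends AnyVal {
    def notNull: AttributeReference = a.withNullability(false)
  }

  def main(args: Array[String]): Unit = {
    val id = AttributeReference("id", nullable = true)
    println(id.notNull.nullable) // prints "false"
  }
}
```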
---
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1709#issuecomment-50923602
So, it looks like with:
a.withNullability (in
https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/dsl
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1709#issuecomment-50919584
> It is possibly a change with type inference / implicit conversions?
>
Likely. What is the expected type which is supposed to have 'def at()',
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1709#issuecomment-50919009
This is the exact failure. The error is actually a missing at() method,
encountered in Scala 2.11:
[ERROR]
/Users/avati/work/spark/sql/catalyst/src/test/scala/org
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1709#issuecomment-50918557
> Mind closing this PR if its not an issue then? Thanks!
>
The description of the cause (scala.NotNull trait removal in 2.11) might be
wrong, but
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/940#issuecomment-50910905
> I reverted the change in #1718 <https://github.com/apache/spark/pull/1718>
> and asked @marmbrus <https://github.com/marmbrus> to take look at th
Github user avati closed the pull request at:
https://github.com/apache/spark/pull/1708
---
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1708#issuecomment-50909364
#1704 fixed. Closing this.
---
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/940#issuecomment-50899294
> I think Jenkins should be fine but the assembly jar is broken. Is it
right?
>
I think so, just like commons-math3
---
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/940#issuecomment-50897230
Yes, either #1701 or #1369. We are already broken till they are committed.
On Fri, Aug 1, 2014 at 8:19 AM, Xiangrui Meng
wrote:
> Ah, I see. Te
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1708#issuecomment-50896977
Actually this is related to #1704. There are too many patches in my working
tree and I failed to see the link. #1704 caused this "problem".
I need to cre
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/940#issuecomment-50894442
@mengxr Based on the dependency graph, I am guessing we will now have a
jar-hell problem with scalalogging-slf4j 2.1.2 (needed by breeze 0.8.1) vs
1.0.1 (used by sql
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1703#issuecomment-50893549
Thanks @mengxr
---
Github user avati closed the pull request at:
https://github.com/apache/spark/pull/1703
---
Github user avati commented on a diff in the pull request:
https://github.com/apache/spark/pull/1701#discussion_r15683518
--- Diff: sql/core/pom.xml ---
@@ -83,6 +83,16 @@
scalacheck_${scala.binary.version}
test
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/940#issuecomment-50850842
> But is it needed for the v1.1 release? Spark v1.1 doesn't support Scala
> 2.11.
>
No, I guess not. I didn't realize Spark 1.1 was not yet
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/940#issuecomment-50850441
> Yes, it is already a problem with breeze 0.7. But we didn't realized that
> hadoop 2.3 depends on commons-math3 in the Spark v1.0 release. If there is
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1709#issuecomment-50850247
> These tests shouldn't be using scala NotNull, but catalyst's .notNull
>
<https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/o
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/940#issuecomment-50849972
@mengxr looking at the dependency graphs of breeze 0.7 and 0.8.1, it
appears that both versions depend on commons-math3:3.2. If hadoop
2.3 and 2.4 depend on
GitHub user avati opened a pull request:
https://github.com/apache/spark/pull/1711
[SPARK-1812] upgrade to scala-maven-plugin 3.2.0
Needed for Scala 2.11 compiler-interface
Signed-off-by: Anand Avati
You can merge this pull request into a Git repository by running
GitHub user avati opened a pull request:
https://github.com/apache/spark/pull/1709
[SPARK-1812] sql/catalyst - remove scala.NotNull related tests
scala.NotNull Trait is removed in 2.11
Signed-off-by: Anand Avati
You can merge this pull request into a Git repository by
GitHub user avati opened a pull request:
https://github.com/apache/spark/pull/1708
[SPARK-1812] streaming - fix parameters to StreamingContext constructor
Encountered while compiling against Scala 2.11, possibly unrelated to Scala
version (and is an existing issue)
Signed
Github user avati commented on a diff in the pull request:
https://github.com/apache/spark/pull/996#discussion_r15679675
--- Diff: assembly/pom.xml ---
@@ -26,7 +26,7 @@
org.apache.spark
- spark-assembly_2.10
+ spark-assembly_${scala.binary.version
Github user avati closed the pull request at:
https://github.com/apache/spark/pull/1649
---
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1649#issuecomment-50833567
Splitting into multiple smaller PRs. Closing.
---
GitHub user avati opened a pull request:
https://github.com/apache/spark/pull/1704
[SPARK-1812] remove default args to overloaded methods
Not supported in Scala 2.11. Split them into separate methods instead.
You can merge this pull request into a Git repository by running
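The fix pattern this PR describes can be sketched as follows (a hedged illustration with made-up method names, not the actual Spark code): a default argument on an overloaded method is replaced by an explicit forwarding overload.

```scala
// Hypothetical illustration of the SPARK-1812 fix pattern: instead of a
// default argument on a method that is also overloaded (problematic when
// cross-building for Scala 2.11), provide an explicit overload that
// forwards the former default value.
object OverloadSketch {
  // Before (schematically):
  //   def connect(host: String, port: Int = 8080): String
  //   ...plus other `connect` overloads; defaults and overloading mix badly.
  // After: no default args, an explicit forwarding overload instead.
  def connect(host: String): String = connect(host, 8080)
  def connect(host: String, port: Int): String = s"$host:$port"
}
```

The forwarding overload keeps source compatibility for callers that relied on the default, at the cost of one extra method in the binary interface.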
GitHub user avati opened a pull request:
https://github.com/apache/spark/pull/1703
[SPARK-1812] mllib - upgrade to breeze 0.8.1
Scala 2.11 packages not available for the current version (0.7)
Signed-off-by: Anand Avati
You can merge this pull request into a Git repository
GitHub user avati opened a pull request:
https://github.com/apache/spark/pull/1702
[SPARK-1812] core - upgrade to json4s-jackson 3.2.10
Scala 2.11 packages not available for the current version (3.2.6)
Signed-off-by: Anand Avati
You can merge this pull request into a Git
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1685#issuecomment-50829945
On Thu, Jul 31, 2014 at 1:50 AM, Sean Owen wrote:
> This is the bit that can't be committed now, right? is there a subset of
> the update that works
GitHub user avati opened a pull request:
https://github.com/apache/spark/pull/1701
SPARK-1812: upgrade dependency to scala-logging 2.1.2
Scala 2.11 packages not available for the current version (1.0.1)
Signed-off-by: Anand Avati
You can merge this pull request into a Git
GitHub user avati opened a pull request:
https://github.com/apache/spark/pull/1685
[SPARK-1812] akka 2.3.4
Upgrade to akka 2.3.4
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/avati/spark SPARK-1812-akka-2.3
Alternatively you
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1649#issuecomment-50720340
I will split this up into multiple PRs.
I will start with a small pull request for akka upgrade to 2.3. However as
Sean mentioned, I guess we are stuck on the
Github user avati commented on a diff in the pull request:
https://github.com/apache/spark/pull/1649#discussion_r15598196
--- Diff: pom.xml ---
@@ -1015,7 +1016,6 @@
hadoop-2.2
2.2.0
-2.5.0
--- End diff --
I was not aware
Github user avati commented on a diff in the pull request:
https://github.com/apache/spark/pull/1649#discussion_r15596377
--- Diff: pom.xml ---
@@ -1015,7 +1016,6 @@
hadoop-2.2
2.2.0
-2.5.0
--- End diff --
I'm not
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/1649#issuecomment-50643077
On Wed, Jul 30, 2014 at 3:14 AM, Sean Owen wrote:
> I wonder if you can pull out the changes that will also work in Scala
> 2.10? some of it looks lik
Github user avati commented on a diff in the pull request:
https://github.com/apache/spark/pull/1649#discussion_r15595520
--- Diff: pom.xml ---
@@ -1015,7 +1016,6 @@
hadoop-2.2
2.2.0
-2.5.0
--- End diff --
On Wed, Jul 30
GitHub user avati opened a pull request:
https://github.com/apache/spark/pull/1649
[SPARK-1812] [WIP] Scala 2.11 support
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/avati/spark scala-2.11
Alternatively you can review and
Github user avati commented on the pull request:
https://github.com/apache/spark/pull/996#issuecomment-50439713
@ScrapCodes @mateiz looks like there are some parallel efforts here
(github.com/avati/spark/commits/scala-2.11). It is true some upstream artifacts
are pending (from other