Builds are failing

2016-02-22 Thread Iulian Dragoș
Just in case you missed this: https://issues.apache.org/jira/browse/SPARK-13431 Builds are failing with 'Method code too large' in the "shading" step with Maven. iulian

Re: pull request template

2016-02-19 Thread Iulian Dragoș
It's a good idea. I would also add the spec for the PR title; I always get the order of the Jira ID and the component wrong. Moreover, CONTRIBUTING.md is also lacking them. Any reason not to add it there? I can open PRs for both, but maybe you want to keep that info on the wiki instead. iulian On

Re: build error: code too big: specialStateTransition(int, IntStream)

2016-01-28 Thread Iulian Dragoș
wrote: > After this change: > [SPARK-12681] [SQL] split IdentifiersParser.g into two files > > the biggest file under > sql/catalyst/src/main/antlr3/org/apache/spark/sql/catalyst/parser is > SparkSqlParser.g > > Maybe split SparkSqlParser.g up as well ? > > On

build error: code too big: specialStateTransition(int, IntStream)

2016-01-28 Thread Iulian Dragoș
Hi, Has anyone seen this error? The code of method specialStateTransition(int, IntStream) is exceeding the 65535 bytes limit (SparkSqlParser_IdentifiersParser.java:39907). The error is in ANTLR-generated files and it’s (according to Stack Overflow) due to state explosion in the parser (or lexer).

Re: Unable to compile and test Spark in IntelliJ

2016-01-26 Thread Iulian Dragoș
On Tue, Jan 19, 2016 at 6:06 AM, Hyukjin Kwon wrote: > Hi all, > > I usually have been working with Spark in IntelliJ. > > Before this PR, > https://github.com/apache/spark/commit/7cd7f2202547224593517b392f56e49e4c94cabc > for > `[SPARK-12575][SQL] Grammar parity with

Re: Removing the Mesos fine-grained mode

2016-01-20 Thread Iulian Dragoș
any, but let me see if I can pull some logs > together in the next couple days. > > On Tue, Jan 19, 2016 at 10:08 AM, Iulian Dragoș < > iulian.dra...@typesafe.com> wrote: > >> It would be good to get to the bottom of this. >> >> Adam, could you share the

Re: Removing the Mesos fine-grained mode

2016-01-19 Thread Iulian Dragoș
e takes 2.7h. Again, the fine and > coarse-grained > > execution tests are on the exact same machines, exact same dataset, and > only > > changing spark.mesos.coarse to true/false. > > > > Let me know if there's anything else I can provide here. > > > > Thanks

Re: [VOTE] Release Apache Spark 1.6.0 (RC4)

2015-12-23 Thread Iulian Dragoș
+1 (non-binding) Tested Mesos deployments (client and cluster-mode, fine-grained and coarse-grained). Things look good . iulian On Wed, Dec 23, 2015 at 2:35 PM, Sean Owen wrote: > Docker integration

Re: Update to Spark Mesos docs possibly? LIBPROCESS_IP needs to be set for client mode

2015-12-16 Thread Iulian Dragoș
8.56.50 > > 3. Set correct hostname/ip in mesos configuration - see Nikolaos answer > > > > Cheers, > Aaron > > On Wed, Dec 16, 2015 at 11:00 AM, Iulian Dragoș > <iulian.dra...@typesafe.com> wrote: > > Hi Aaron, > > > > I never had to use that va

Re: Update to Spark Mesos docs possibly? LIBPROCESS_IP needs to be set for client mode

2015-12-16 Thread Iulian Dragoș
> >> Possible solutions - on slave node with public IP 192.168.56.50 > >> > >> 1. Set > >> > >>export LIBPROCESS_IP=192.168.56.50 > >>export SPARK_LOCAL_IP=192.168.56.50 > >> > >> 2. Ensure your hostname resol

Re: Update to Spark Mesos docs possibly? LIBPROCESS_IP needs to be set for client mode

2015-12-16 Thread Iulian Dragoș
Hi Aaron, I never had to use that variable. What is it for? On Wed, Dec 16, 2015 at 2:00 PM, Aaron wrote: > In going through running various Spark jobs, both Spark 1.5.2 and the > new Spark 1.6 SNAPSHOTs, on a Mesos cluster (currently 0.25), we > noticed that is in order
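Pieced together from the replies quoted in this thread, the workaround amounts to two environment variables. A hedged sketch; the IP is the example value used in the thread (a slave node's public address), not something to copy verbatim:

```shell
# Set before launching the driver in client mode on a Mesos cluster.
# LIBPROCESS_IP is read by the Mesos libprocess layer, SPARK_LOCAL_IP by Spark;
# both should be the node's externally reachable address.
export LIBPROCESS_IP=192.168.56.50
export SPARK_LOCAL_IP=192.168.56.50
echo "LIBPROCESS_IP=$LIBPROCESS_IP"
echo "SPARK_LOCAL_IP=$SPARK_LOCAL_IP"
```

The alternative mentioned in the thread is to make the hostname resolve to the correct public IP instead of setting the variables.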

Re: [VOTE] Release Apache Spark 1.6.0 (RC2)

2015-12-15 Thread Iulian Dragoș
Thanks for the heads up. On Tue, Dec 15, 2015 at 11:40 PM, Michael Armbrust wrote: > This vote is canceled due to the issue with the incorrect version. This > issue will be fixed by https://github.com/apache/spark/pull/10317 > > We can wait a little bit for a fix to >

Re: [VOTE] Release Apache Spark 1.6.0 (RC2)

2015-12-15 Thread Iulian Dragoș
-1 (non-binding) Cluster mode on Mesos is broken (regression compared to 1.5.2). It seems to be related to the way SPARK_HOME is handled. In the driver logs I see: I1215 15:00:39.411212 28032 exec.cpp:134] Version: 0.25.0 I1215 15:00:39.413512 28037 exec.cpp:208] Executor registered on slave

Re: How to debug Spark source using IntelliJ/ Eclipse

2015-12-07 Thread Iulian Dragoș
What errors do you see? I’m using Eclipse and things work pretty much as described (I’m using Scala 2.11 so there’s a slight difference for that, but if you’re fine using Scala 2.10 it should be good to go). One little difference: the sbt command is no longer in the sbt directory; instead run:

Re: Removing the Mesos fine-grained mode

2015-11-23 Thread Iulian Dragoș
On Sat, Nov 21, 2015 at 3:37 AM, Adam McElwee wrote: > I've used fine-grained mode on our mesos spark clusters until this week, > mostly because it was the default. I started trying coarse-grained because > of the recent chatter on the mailing list about wanting to move the

Re: Removing the Mesos fine-grained mode

2015-11-20 Thread Iulian Dragoș
ain mode operates in terms of the way one would >> define a Mesos framework. That said, with dyn-allocation and Mesos support >> for both resource reservation, oversubscription and revocation, I think the >> direction is clear that the coarse mode is the proper way forward, and &

Removing the Mesos fine-grained mode

2015-11-19 Thread Iulian Dragoș
Hi all, Mesos is the only cluster manager that has a fine-grained mode, but it's more often than not problematic, and it's a maintenance burden. I'd like to suggest removing it in the 2.0 release. A few reasons: - code/maintenance complexity. The two modes duplicate a lot of functionality (and
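For context, the mode being proposed for removal is selected by a single configuration flag (the benchmark messages earlier in this thread flip only this setting). A minimal spark-defaults.conf fragment; the master URL is a placeholder:

```
# conf/spark-defaults.conf
spark.master        mesos://zk://host:2181/mesos
# false = fine-grained mode (proposed for removal), true = coarse-grained
spark.mesos.coarse  true
```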

Re: Mesos cluster dispatcher doesn't respect most args from the submit req

2015-11-17 Thread Iulian Dragoș
Hi Jo, I agree that there's something fishy with the cluster dispatcher, I've seen some issues like that. I think it actually tries to send all properties as part of `SPARK_EXECUTOR_OPTS`, which may not be everything that's needed:

Building Spark w/ 1.8 and binary incompatibilities

2015-10-19 Thread Iulian Dragoș
Hey all, tl;dr: I built Spark with Java 1.8 even though my JAVA_HOME pointed to 1.7. Then it failed with binary incompatibilities. I couldn’t find any mention of this in the docs, so it might be a known thing, but it’s definitely too easy to do the wrong thing. The problem is that Maven is
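The pitfall described above — the build silently using a different JDK than JAVA_HOME points at — can be caught with a quick pre-build check. A hedged sketch (paths are whatever your system has; this only compares, it doesn't fix anything):

```shell
# Compare the java resolved via PATH against the one JAVA_HOME implies.
# If they disagree, the build toolchain may not use the JDK you expect.
path_java=$(command -v java || echo none)
home_java="${JAVA_HOME:-unset}/bin/java"
echo "java on PATH:   $path_java"
echo "JAVA_HOME java: $home_java"
if [ "$path_java" != "$home_java" ]; then
  echo "WARNING: PATH and JAVA_HOME disagree"
fi
```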

Re: Scala 2.11 builds broken/ Can the PR build run also 2.11?

2015-10-12 Thread Iulian Dragoș
On Fri, Oct 9, 2015 at 10:34 PM, Patrick Wendell wrote: > I would push back slightly. The reason we have the PR builds taking so > long is death by a million small things that we add. Doing a full 2.11 > compile is order minutes... it's a nontrivial increase to the build

Re: Scala 2.11 builds broken/ Can the PR build run also 2.11?

2015-10-09 Thread Iulian Dragoș
O] BUILD SUCCESS >> [INFO] >> >> [INFO] Total time: 17:49 min >> >> FYI >> >> On Thu, Oct 8, 2015 at 6:50 AM, Ted Yu <yuzhih...@gmail.com> wrote: >> >>> Interesting

Scala 2.11 builds broken/ Can the PR build run also 2.11?

2015-10-08 Thread Iulian Dragoș
Since Oct. 4 the build fails on 2.11 with the dreaded [error] /home/ubuntu/workspace/Apache Spark (master) on 2.11/core/src/main/scala/org/apache/spark/rpc/netty/NettyRpcEnv.scala:310: no valid targets for annotation on value conf - it is discarded unused. You may specify targets with

Re: Automatically deleting pull request comments left by AmplabJenkins

2015-08-14 Thread Iulian Dragoș
On Fri, Aug 14, 2015 at 4:21 AM, Josh Rosen rosenvi...@gmail.com wrote: Prototype is at https://github.com/databricks/spark-pr-dashboard/pull/59 On Wed, Aug 12, 2015 at 7:51 PM, Josh Rosen rosenvi...@gmail.com wrote: *TL;DR*: would anyone object if I wrote a script to auto-delete pull

Re: non-deprecation compiler warnings are upgraded to build errors now

2015-07-25 Thread Iulian Dragoș
this is the most common one (if not the only one). iulian On Fri, Jul 24, 2015 at 10:24 AM, Iulian Dragoș iulian.dra...@typesafe.com wrote: On Thu, Jul 23, 2015 at 6:08 AM, Reynold Xin r...@databricks.com wrote: Hi all, FYI, we just merged a patch that fails a build if there is a scala compiler

Re: non-deprecation compiler warnings are upgraded to build errors now

2015-07-24 Thread Iulian Dragoș
On Thu, Jul 23, 2015 at 6:08 AM, Reynold Xin r...@databricks.com wrote: Hi all, FYI, we just merged a patch that fails a build if there is a scala compiler warning (if it is not deprecation warning). I’m a bit confused, since I see quite a lot of warnings in semi-legitimate code. For

Re: Spark 1.5.0-SNAPSHOT broken with Scala 2.11

2015-06-29 Thread Iulian Dragoș
On Mon, Jun 29, 2015 at 3:02 AM, Alessandro Baretta alexbare...@gmail.com wrote: I am building the current master branch with Scala 2.11 following these instructions: Building for Scala 2.11 To produce a Spark package compiled with Scala 2.11, use the -Dscala-2.11 property:

Various forks

2015-06-25 Thread Iulian Dragoș
Could someone point me to the source of the Spark fork used to build genjavadoc-plugin? Even more important would be to know the reasoning behind this fork. Ironically, this hinders my attempts at removing another fork, the Spark REPL fork (and the upgrade to Scala 2.11.7). See here

Re: [VOTE] Release Apache Spark 1.4.0 (RC2)

2015-05-26 Thread Iulian Dragoș
I tried 1.4.0-rc2 binaries on a 3-node Mesos cluster, everything seemed to work fine, both spark-shell and spark-submit. Cluster mode deployment also worked. +1 (non-binding) iulian On Tue, May 26, 2015 at 4:44 AM, jameszhouyi yiaz...@gmail.com wrote: Compiled: git clone

Why use lib_managed for the Sbt build?

2015-05-21 Thread Iulian Dragoș
I’m trying to understand why sbt is configured to pull all libs under lib_managed. - it seems like unnecessary duplication (I will have those libraries under ~/.m2, via Maven, anyway) - every time I call make-distribution I lose lib_managed (via mvn clean install) and have to wait to

Re: Problem building master on 2.11

2015-05-19 Thread Iulian Dragoș
There's an open PR to fix it. If you could try it and report back on the PR it'd be great. More likely to get in fast. https://github.com/apache/spark/pull/6260 On Mon, May 18, 2015 at 6:43 PM, Fernando O. fot...@gmail.com wrote: I just noticed I sent this to users instead of dev: --

Re: Intellij Spark Source Compilation

2015-05-11 Thread Iulian Dragoș
Oh, I see. So then try to run one build on the command line first (or try sbt avro:generate, though I’m not sure it’s enough). I just noticed that I have an additional source folder target/scala-2.10/src_managed/main/compiled_avro for spark-streaming-flume-sink. I guess I built the project once and
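The suggestion above, made concrete as a hedged sketch — the `avro:generate` task name comes from the message itself, but the `streaming-flume-sink` project name and the `build/sbt` launcher are assumptions about the checkout of that era:

```
# Run once from the Spark checkout so the IDE picks up the generated
# sources under target/scala-2.10/src_managed/main/compiled_avro.
./build/sbt streaming-flume-sink/avro:generate
```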

Re: Intellij Spark Source Compilation

2015-05-09 Thread Iulian Dragoș
On Sat, May 9, 2015 at 12:29 AM, rtimp dolethebobdol...@gmail.com wrote: Hello, I'm trying to compile the master branch of the spark source (25889d8) in intellij. I followed the instructions in the wiki https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools, namely I

Re: Speeding up Spark build during development

2015-05-05 Thread Iulian Dragoș
I'm probably the only Eclipse user here, but it seems I have the best workflow :) At least for me things work as they should: once I imported projects in the workspace I can build and run/debug tests from the IDE. I only go to sbt when I need to re-create projects or I want to run the full test

Update Wiki Developer instructions

2015-05-04 Thread Iulian Dragoș
I'd like to update the information about using Eclipse to develop on the Spark project found on this page: https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=38572224 I don't see any way to edit this page (I created an account). Since it's a wiki, I assumed it's supposed to be

Re: Update Wiki Developer instructions

2015-05-04 Thread Iulian Dragoș
change if it is significant enough to need discussion. If it's trivial, just post it here and someone can take care of it. On Mon, May 4, 2015 at 2:32 PM, Iulian Dragoș iulian.dra...@typesafe.com wrote: I'd like to update the information about using Eclipse to develop on the Spark project

Re: Unit tests

2015-02-10 Thread Iulian Dragoș
9, 2015 at 5:47:59 AM, Iulian Dragoș ( iulian.dra...@typesafe.com) wrote: Hi Patrick, Thanks for the heads up. I was trying to set up our own infrastructure for testing Spark (essentially, running `run-tests` every night) on EC2. I stumbled upon a number of flaky tests, but none of them look

Re: Unit tests

2015-02-09 Thread Iulian Dragoș
Hi Patrick, Thanks for the heads up. I was trying to set up our own infrastructure for testing Spark (essentially, running `run-tests` every night) on EC2. I stumbled upon a number of flaky tests, but none of them look similar to anything in Jira with the flaky-test tag. I wonder if there's