IntelliJ IDEA cannot recognize the MLlib package

2014-08-03 Thread jun
Hi, I have started my Spark exploration in IntelliJ IDEA in local mode and want to focus on the MLlib part. But when I put some example code in IDEA, it cannot recognize the mllib package. It looks like this: import org.apache.spark.SparkContext import org.apache.spark.mllib.recommendation.ALS

Re: IntelliJ IDEA cannot recognize the MLlib package

2014-08-03 Thread jun
Sorry, the color is missing: mllib appears in red and the import statement is greyed out. import org.apache.spark.mllib.recommendation.ALS At 2014-08-03 05:03:31, jun kit...@126.com wrote: Hi, I have started my Spark exploration in IntelliJ IDEA in local mode and want to focus on the MLlib part. But when I

Re: IntelliJ IDEA cannot recognize the MLlib package

2014-08-03 Thread Sean Owen
You missed the mllib artifact? That would certainly explain it! All I see is core. On Sun, Aug 3, 2014 at 10:03 AM, jun kit...@126.com wrote: Hi, I have started my Spark exploration in IntelliJ IDEA in local mode and want to focus on the MLlib part. But when I put some example code in IDEA, it
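
As Sean suggests, the usual cause is that only the spark-core artifact is declared as a dependency. A minimal build.sbt sketch that also pulls in the MLlib artifact (the project name and version number are assumptions; match the version to the Spark release in use):

    // build.sbt -- minimal sketch, assuming Spark 1.0.x on Scala 2.10
    name := "mllib-sandbox"
    scalaVersion := "2.10.4"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core"  % "1.0.2",
      "org.apache.spark" %% "spark-mllib" % "1.0.2"
    )

After re-importing the sbt project into IntelliJ, import org.apache.spark.mllib.recommendation.ALS should resolve and the red highlighting should disappear.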

Re: -1s on pull requests?

2014-08-03 Thread Nicholas Chammas
On Mon, Jul 21, 2014 at 4:44 PM, Kay Ousterhout k...@eecs.berkeley.edu wrote: This also happens when something accidentally gets merged after the tests have started but before tests have passed. Some improvements to SparkQA https://github.com/SparkQA could help with this. May I suggest:

Re: I would like to contribute

2014-08-03 Thread Josh Rosen
The Contributing to Spark guide on the Spark Wiki provides a good overview of how to start contributing: https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark On August 3, 2014 at 5:14:23 PM, pritish (prit...@nirvana-international.com) wrote: Hi We would like to

Re: -1s on pull requests?

2014-08-03 Thread Patrick Wendell
1. Include the commit hash in the "tests have started/completed" messages, so that it's clear exactly what code is or has been tested in each test cycle. Great idea - I think this is easy to do given the current architecture. We already have access to the commit ID in the same script

Compiling Spark master (6ba6c3eb) with sbt/sbt assembly

2014-08-03 Thread Larry Xiao
On the latest pull today (6ba6c3ebfe9a47351a50e45271e241140b09bf10) I hit an assembly problem. $ ./sbt/sbt assembly Using /usr/lib/jvm/java-7-oracle as default JAVA_HOME. Note, this will be overridden by -java-home if it is set. [info] Loading project definition from ~/spark/project/project [info]

Re: Low Level Kafka Consumer for Spark

2014-08-03 Thread Patrick Wendell
I'll let TD chime in on this one, but I'm guessing this would be a welcome addition. It's great to see community effort on adding new streams/receivers; adding a Java API for receivers was something we did specifically to allow this :) - Patrick On Sat, Aug 2, 2014 at 10:09 AM, Dibyendu
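
For reference, the receiver API Patrick mentions lets a third-party source plug into Spark Streaming by extending Receiver. A minimal Scala sketch of the pattern (the class name and the dummy data source are illustrative, not the proposed Kafka consumer itself):

    import org.apache.spark.storage.StorageLevel
    import org.apache.spark.streaming.receiver.Receiver

    // Skeleton of a custom receiver; a real Kafka consumer would open
    // connections and track offsets in onStart/onStop.
    class DummyStringReceiver
      extends Receiver[String](StorageLevel.MEMORY_AND_DISK_2) {

      def onStart(): Unit = {
        // Pull data on a background thread so onStart returns quickly.
        new Thread("dummy-receiver") {
          override def run(): Unit = receive()
        }.start()
      }

      def onStop(): Unit = {
        // Nothing to clean up; the receive loop exits once isStopped() is true.
      }

      private def receive(): Unit = {
        while (!isStopped()) {
          store("sample record") // store() hands one record to Spark
          Thread.sleep(100)
        }
      }
    }

Such a receiver is attached with ssc.receiverStream(new DummyStringReceiver), where ssc is a StreamingContext.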

Re: Scala 2.11 external dependencies

2014-08-03 Thread Patrick Wendell
Hey Anand, Thanks for looking into this - it's great to see momentum towards Scala 2.11, and I'd love it if this landed in Spark 1.2. For the external dependencies, it would be good to create a sub-task of SPARK-1812 to track our efforts to encourage other projects to upgrade. In certain cases (e.g.
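
For projects that are willing to upgrade, sbt cross-building is the usual way to publish artifacts for both Scala versions. A minimal sketch of the relevant build.sbt setting (the version numbers here are assumptions):

    // build.sbt -- publish artifacts for both Scala 2.10 and 2.11
    crossScalaVersions := Seq("2.10.4", "2.11.2")

Running sbt tasks with the '+' prefix (for example '+ publish') then builds and publishes one artifact per listed Scala version.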

Re: -1s on pull requests?

2014-08-03 Thread Nicholas Chammas
On Sun, Aug 3, 2014 at 11:29 PM, Patrick Wendell pwend...@gmail.com wrote: Nick - Any interest in doing these? This is all doable from within the Spark repo itself because our QA harness scripts are in there: https://github.com/apache/spark/blob/master/dev/run-tests-jenkins If not, could you