Hi,
I have started my Spark exploration in IntelliJ IDEA in local mode and want to
focus on the MLlib part,
but when I put some example code in IDEA, it cannot recognize the mllib package.
It looks like this:
import org.apache.spark.SparkContext
import org.apache.spark.mllib.recommendation.ALS
Sorry, the colors don't come through here: "mllib" is shown in red and the
import line is greyed out:
import org.apache.spark.mllib.recommendation.ALS
You missed the mllib artifact? That would certainly explain it! All I
see is core.
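If the missing spark-mllib artifact is indeed the cause, a minimal sketch of the fix for an sbt build follows; the version string here is an assumption, so match it to the Spark version you actually run against:

// build.sbt -- minimal sketch; "1.0.2" is an assumed version, use your own
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "1.0.2",
  "org.apache.spark" %% "spark-mllib" % "1.0.2"  // without this, org.apache.spark.mllib won't resolve
)

After adding the dependency and reimporting the project, IDEA should resolve the import and the red highlighting on mllib should go away.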
On Mon, Jul 21, 2014 at 4:44 PM, Kay Ousterhout k...@eecs.berkeley.edu
wrote:
This also happens when something accidentally gets merged after the tests
have started but before tests have passed.
Some improvements to SparkQA https://github.com/SparkQA could help with
this. May I suggest:
The Contributing to Spark guide on the Spark Wiki provides a good overview of
how to start contributing:
https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark
On August 3, 2014 at 5:14:23 PM, pritish (prit...@nirvana-international.com)
wrote:
Hi
We would like to
1. Include the commit hash in the "tests have started/completed" messages,
so that it's clear exactly what code is/has been tested for each test cycle.
Great idea - I think this is easy to do given the current architecture. We
already have access to the commit ID in the same script
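For illustration only - the exact wording below is hypothetical, not what SparkQA actually posts - the suggestion amounts to building the status message along these lines:

// Hypothetical sketch of the proposed status message (Scala, for illustration)
val commitId = "6ba6c3ebfe9a47351a50e45271e241140b09bf10"  // example value: the commit under test
val message = s"QA tests have started for this PR at commit ${commitId.take(7)}."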
On the latest pull today (6ba6c3ebfe9a47351a50e45271e241140b09bf10) I hit an
assembly problem:
$ ./sbt/sbt assembly
Using /usr/lib/jvm/java-7-oracle as default JAVA_HOME.
Note, this will be overridden by -java-home if it is set.
[info] Loading project definition from ~/spark/project/project
[info]
I'll let TD chime in on this one, but I'm guessing this would be a welcome
addition. It's great to see community effort on adding new
streams/receivers; adding a Java API for receivers was something we did
specifically to allow this :)
- Patrick
On Sat, Aug 2, 2014 at 10:09 AM, Dibyendu
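As a concrete illustration of the receiver API mentioned above, here is a minimal sketch of a custom receiver against the Spark 1.x Receiver class; the data source (one dummy record per second) is made up for the example:

import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.receiver.Receiver

// Minimal custom receiver sketch: emits one dummy record per second.
class DummyReceiver extends Receiver[String](StorageLevel.MEMORY_AND_DISK_2) {
  def onStart(): Unit = {
    // Receive on a separate thread so onStart() returns immediately, as the API expects.
    new Thread("dummy-receiver") {
      override def run(): Unit = {
        while (!isStopped()) {
          store("tick at " + System.currentTimeMillis())  // hand one record to Spark
          Thread.sleep(1000)
        }
      }
    }.start()
  }
  def onStop(): Unit = {}  // the receiving thread sees isStopped() and exits on its own
}

Hook it up with ssc.receiverStream(new DummyReceiver), where ssc is a StreamingContext.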
Hey Anand,
Thanks for looking into this - it's great to see momentum towards Scala
2.11, and I'd love it if this landed in Spark 1.2.
For the external dependencies, it would be good to create a sub-task of
SPARK-1812 to track our efforts encouraging other projects to upgrade. In
certain cases (e.g.
On Sun, Aug 3, 2014 at 11:29 PM, Patrick Wendell pwend...@gmail.com wrote:
Nick - Any interest in doing these? This is all doable from within the
Spark repo itself because our QA harness scripts are in there:
https://github.com/apache/spark/blob/master/dev/run-tests-jenkins
If not, could you