Thanks, Josh! Looking forward to your patch! Meanwhile, I've tried to
change it manually and can confirm that it works fine.
On Thu, Nov 28, 2013 at 8:11 PM, Josh Rosen rosenvi...@gmail.com wrote:
This is a bug. The str() is there because I want to convert objects to
strings like Java's
Hi All,
I am trying to do collaborative filtering with MLbase. I am using spark
0.8.0
I have some basic questions.
1) I am using maven and added dependency to my pom
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.9.3</artifactId>
Hi Matei,
Good to hear from you. The stack trace is below. I launched the instances
with --spark-version=0.8.0 and verified that the version was correct by
launching spark-shell. Also verified that the version I've got in my
project is 0.8.0. Nothing else should have changed, as the scripts I
Hi Aslan,
You'll need to link against the spark-mllib artifact. The method we have
currently for collaborative filtering is ALS.
Documentation is available here -
http://spark.incubator.apache.org/docs/latest/mllib-guide.html
We're working on a more complete ALS tutorial, and will link to it
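For reference, the spark-mllib dependency for the build discussed above might look like the following sketch, assuming the same Scala 2.9.3 suffix and the 0.8.0 release mentioned in the thread (the published artifact's version string may carry an "-incubating" suffix, so check the actual coordinates in Maven Central):

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-mllib_2.9.3</artifactId>
  <version>0.8.0</version>
</dependency>
```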
I use a SparkListener to collect info about failures in tasks related to my
RDD.
To do so, for every stage submitted I verify whether the stage is for an RDD
that is a dependency of my target RDD (including the target RDD itself).
Then, for every task ending, I check if the task is for a stage I
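The approach described above could be sketched roughly as follows. This is not the poster's actual code: `dependsOnTarget` is a hypothetical helper encapsulating the lineage check they describe, and the listener field names follow the current SparkListener API, which may differ slightly in Spark 0.8:

```scala
import org.apache.spark.scheduler._

// Sketch: watch stages that belong to a target RDD's lineage and
// report failed tasks in those stages.
// `dependsOnTarget` is a hypothetical predicate: given a stage's id,
// does that stage compute the target RDD or one of its dependencies?
class TargetRddFailureListener(dependsOnTarget: Int => Boolean)
    extends SparkListener {

  private val watchedStages = scala.collection.mutable.Set[Int]()

  override def onStageSubmitted(s: SparkListenerStageSubmitted): Unit =
    if (dependsOnTarget(s.stageInfo.stageId))
      watchedStages += s.stageInfo.stageId

  override def onTaskEnd(t: SparkListenerTaskEnd): Unit =
    if (watchedStages.contains(t.stageId) && t.taskInfo.failed)
      println(s"Task ${t.taskInfo.taskId} failed in watched stage ${t.stageId}")
}
```

The listener would then be registered with `sc.addSparkListener(...)` before running any jobs on the target RDD.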
I think this might be an issue with the tutorial — try asking the Mesosphere
folks who created it.
Matei
On Nov 28, 2013, at 9:23 PM, om prakash pandey pande...@gmail.com wrote:
Dear Sir/Madam,
I have been trying to run Apache Spark over Mesos and have been following
the below tutorial.
The full context isn't much -- this is the first thing I do in my main
method (assign a value to sc), and it throws this error.
On Fri, Nov 29, 2013 at 10:38 AM, Walrus theCat walrusthe...@gmail.com wrote:
Hi Matei,
Good to hear from you. The stack trace is below. I launched the
instances
I opened a pull request containing a fix and regression test:
https://github.com/apache/incubator-spark/pull/218
On Fri, Nov 29, 2013 at 5:18 AM, Andrei faithlessfri...@gmail.com wrote:
Thanks, Josh! Looking forward to your patch! Meanwhile, I've tried to
change it manually and can confirm
From the configuration page:
'To set a system property for configuring Spark, you need to either pass it
with a -D flag to the JVM (for example java -Dspark.cores.max=5 MyProgram)
or call System.setProperty in your code *before* creating your Spark
context, as follows:'
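That pattern from the configuration page can be sketched in Scala like this (the commented-out master URL and app name are hypothetical placeholders, not from the thread):

```scala
// Per the configuration docs: set the system property *before*
// the SparkContext is constructed, or it won't take effect.
System.setProperty("spark.cores.max", "5")
// val sc = new SparkContext("spark://master:7077", "MyApp")  // hypothetical cluster URL
println(System.getProperty("spark.cores.max"))  // prints "5"
```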
Since running spark-shell
I am sure you have already checked this, but any chance the classpath has
v0.7.x jars in it?
On Nov 29, 2013 4:40 PM, Walrus theCat walrusthe...@gmail.com wrote:
The full context isn't much -- this is the first thing I do in my main
method (assign a value to sc), and it throws this error.
On