Hi Ankur
If you are using standalone mode, your config is wrong. You should use
export SPARK_DAEMON_MEMORY=xxx in conf/spark-env.sh. At least it works on
my Spark 1.3.0 standalone-mode machine.
BTW, SPARK_DRIVER_MEMORY is used in YARN mode, and it looks like
standalone mode doesn't use it.
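For example, a minimal conf/spark-env.sh sketch (the 2g value is only an
illustrative assumption, not a recommendation from this thread):

    # conf/spark-env.sh -- read by the standalone master/worker scripts
    # Memory for the standalone daemons (master and worker) themselves.
    # 2g is an example value; tune it for your machine.
    export SPARK_DAEMON_MEMORY=2g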
Great work!
On May 30, 2014 10:15 PM, Ian Ferreira ianferre...@hotmail.com wrote:
Congrats
--
From: Dean Wampler deanwamp...@gmail.com
Sent: 5/30/2014 6:53 AM
To: user@spark.apache.org
Subject: Re: Announcing Spark 1.0.0
Hi Sandeep
I think you should use testRatings.mapToPair instead of testRatings.map.
So the code should be something like this (following the standard MLlib
ALS example):

JavaPairRDD<Integer, Integer> usersProducts = training.mapToPair(
    new PairFunction<Rating, Integer, Integer>() {
      public Tuple2<Integer, Integer> call(Rating r) {
        // key each Rating by its (user, product) pair
        return new Tuple2<Integer, Integer>(r.user(), r.product());
      }
    });
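For reference, this assumes the usual Spark 1.x Java API imports (Rating
here is MLlib's recommendation Rating):

import scala.Tuple2;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.function.PairFunction;
import org.apache.spark.mllib.recommendation.Rating;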
Hi Prasad
Sorry for missing your reply.
https://gist.github.com/thegiive/10791823
Here it is.
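In case the gist moves, a minimal sketch of the Spark dependency for a
Maven pom.xml (the 1.0.0 version is just an example; the gist above is
the authoritative answer):

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.0.0</version>
</dependency>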
Wisely Chen
On Fri, Apr 4, 2014 at 11:57 PM, Prasad ramachandran.pra...@gmail.com wrote:
Hi Wisely,
Could you please post your pom.xml here.
Thanks
Hi Praveen
What is your config for spark.local.dir?
Do all your workers have this dir, and do they all have the right
permissions on it?
I think this is the reason for your error.
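As a quick sanity check, a sketch you could run on each worker (the path
/mnt/spark-tmp and the 'spark' user are hypothetical examples):

    # create the scratch dir and make it writable by the user running Spark
    mkdir -p /mnt/spark-tmp
    chown spark:spark /mnt/spark-tmp
    # then point Spark at it, e.g. spark.local.dir=/mnt/spark-tmp in your SparkConf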
Wisely Chen
On Mon, Apr 14, 2014 at 9:29 PM, Praveen R prav...@sigmoidanalytics.com wrote:
Had below error