Re: Understanding Spark Memory distribution

2015-03-30 Thread giive chen
Hi Ankur, if you are using standalone mode, your config is wrong. You should use export SPARK_DAEMON_MEMORY=xxx in conf/spark-env.sh. At least, that works on my Spark 1.3.0 standalone-mode machine. BTW, SPARK_DRIVER_MEMORY is used in YARN mode, and it looks like standalone mode doesn't use it.
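
For reference, a minimal spark-env.sh sketch of the setting recommended above; the 2g value is an illustrative assumption, not taken from the thread.

    # conf/spark-env.sh -- illustrative value only.
    # Per the Spark docs, SPARK_DAEMON_MEMORY sizes the standalone master and
    # worker daemons themselves; the reply above recommends it for standalone mode.
    export SPARK_DAEMON_MEMORY=2g
    # The reply above describes SPARK_DRIVER_MEMORY as applying to YARN mode,
    # so it is left unset here.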

RE: Announcing Spark 1.0.0

2014-05-30 Thread giive chen
Great work! On May 30, 2014, 10:15 PM, Ian Ferreira (ianferre...@hotmail.com) wrote: Congrats

Re: Java RDD structure for Matrix predict?

2014-05-27 Thread giive chen
Hi Sandeep, I think you should use testRatings.mapToPair instead of testRatings.map. So the code should be: JavaPairRDD<Integer, Integer> usersProducts = training.mapToPair( new PairFunction<Rating, Integer, Integer>() { public Tuple2<Integer, Integer>
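
For context, a minimal, self-contained sketch of the mapToPair pattern being suggested (class, method, and variable names are illustrative assumptions; depending on the Spark version, the resulting pair RDD may still need converting before it can be passed to MatrixFactorizationModel.predict):

    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.function.PairFunction;
    import org.apache.spark.mllib.recommendation.Rating;
    import scala.Tuple2;

    public class UserProductPairs {
        // Build (user, product) key pairs from the test ratings.
        // mapToPair returns a JavaPairRDD, which a plain map does not.
        static JavaPairRDD<Integer, Integer> toUserProductPairs(JavaRDD<Rating> testRatings) {
            return testRatings.mapToPair(
                new PairFunction<Rating, Integer, Integer>() {
                    @Override
                    public Tuple2<Integer, Integer> call(Rating r) {
                        return new Tuple2<Integer, Integer>(r.user(), r.product());
                    }
                });
        }
    }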

Re: Error reading HDFS file using spark 0.9.0 / hadoop 2.2.0 - incompatible protobuf 2.5 and 2.4.1

2014-04-15 Thread giive chen
Hi Prasad, sorry for missing your reply. Here it is: https://gist.github.com/thegiive/10791823 Wisely Chen On Fri, Apr 4, 2014 at 11:57 PM, Prasad (ramachandran.pra...@gmail.com) wrote: Hi Wisely, could you please post your pom.xml here? Thanks
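
The gist's contents are not reproduced in the thread. As an illustration only, a pom.xml for this combination usually pairs Spark 0.9.0-incubating with hadoop-client 2.2.0 and pins protobuf-java to the 2.5.x line that Hadoop 2.2.0 requires; the versions below are assumptions about the usual shape of the fix, not the actual gist:

    <!-- Illustrative dependency section; the real pom.xml is in the gist above. -->
    <dependencies>
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>0.9.0-incubating</version>
      </dependency>
      <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>2.2.0</version>
      </dependency>
      <dependency>
        <groupId>com.google.protobuf</groupId>
        <artifactId>protobuf-java</artifactId>
        <version>2.5.0</version>
      </dependency>
    </dependencies>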

Re: Lost an executor error - Jobs fail

2014-04-14 Thread giive chen
Hi Praveen, what is your config for spark.local.dir? Do all your workers have this dir, and does every worker have the right permissions on it? I think this is the reason for your error. Wisely Chen On Mon, Apr 14, 2014 at 9:29 PM, Praveen R (prav...@sigmoidanalytics.com) wrote: Had below error
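
A minimal sketch of the checks being suggested, assuming spark.local.dir points at /mnt/spark and the worker process runs as a user named spark (both the path and the user are assumptions, not from the thread):

    # Run on every worker node.
    # 1) The directory named by spark.local.dir must exist:
    mkdir -p /mnt/spark
    # 2) The user running the Spark worker must be able to write to it:
    chown -R spark:spark /mnt/spark
    chmod 755 /mnt/spark
    # 3) Point Spark at it, e.g. in conf/spark-defaults.conf:
    #    spark.local.dir  /mnt/spark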