Re: how to set spark.executor.memory and heap size

2014-07-07 Thread Alex Gaudio
Hi All, This is a bit late, but I found it helpful. Piggy-backing on Wang Hao's comment, spark will ignore the spark.executor.memory setting if you add it to SparkConf via: conf.set("spark.executor.memory", "1g") What you actually should do depends on how you run spark. I found some official
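For the spark-submit route the message alludes to, a sketch of passing memory on the command line instead of in code (the master URL, jar path, class name, and sizes below are the illustrative values used elsewhere in this thread, not prescriptions):

```shell
# Sketch: memory flags given to spark-submit apply before executors launch,
# so they are not subject to the conf.set ordering problem described above.
./bin/spark-submit \
  --master spark://127.0.0.1:7077 \
  --class SimpleApp \
  --executor-memory 1g \
  --driver-memory 512m \
  target/scala-2.10/simple-project_2.10-1.0.jar
```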

Re: how to set spark.executor.memory and heap size

2014-06-13 Thread Hao Wang
Hi, Laurent You could set spark.executor.memory and heap size by the following methods: 1. in your conf/spark-env.sh: export SPARK_WORKER_MEMORY=38g export SPARK_JAVA_OPTS="-XX:-UseGCOverheadLimit -XX:+UseConcMarkSweepGC -Xmx2g -XX:MaxPermSize=256m" 2. you could also add modification for
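Reconstructed as a conf/spark-env.sh fragment (the values are the ones quoted in this message; SPARK_JAVA_OPTS was later deprecated in favor of spark.executor.extraJavaOptions, so treat this as a Spark 0.9/1.0-era sketch):

```shell
# conf/spark-env.sh -- sourced when Spark processes start on this node.
# Total memory this worker may hand out to executors:
export SPARK_WORKER_MEMORY=38g
# Extra JVM options for Spark processes; note -Xmx caps each heap,
# so it must not exceed SPARK_WORKER_MEMORY per executor:
export SPARK_JAVA_OPTS="-XX:-UseGCOverheadLimit -XX:+UseConcMarkSweepGC -Xmx2g -XX:MaxPermSize=256m"
```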

Re: how to set spark.executor.memory and heap size

2014-06-12 Thread Laurent T
Hi, Can you give us a little more insight on how you used that file to solve your problem? We're having the same OOM as you were and haven't been able to solve it yet. Thanks

Re: how to set spark.executor.memory and heap size

2014-04-24 Thread wxhsdp
thank you, i added setJars, but nothing changes: val conf = new SparkConf() .setMaster("spark://127.0.0.1:7077") .setAppName("Simple App") .set("spark.executor.memory", "1g") .setJars(Seq("target/scala-2.10/simple-project_2.10-1.0.jar")) val sc = new SparkContext(conf)

Re: Re: how to set spark.executor.memory and heap size

2014-04-24 Thread qinwei
try the complete path qinwei From: wxhsdp Date: 2014-04-24 14:21 To: user Subject: Re: how to set spark.executor.memory and heap size thank you, i add setJars, but nothing changes val conf = new SparkConf() .setMaster("spark://127.0.0.1:7077") .setAppName("Simple App")

Re: Re: how to set spark.executor.memory and heap size

2014-04-24 Thread wxhsdp
i tried, but no effect Qin Wei wrote: try the complete path qinwei From: wxhsdp Date: 2014-04-24 14:21 To: user Subject: Re: how to set spark.executor.memory and heap size thank you, i add setJars, but nothing changes val conf = new SparkConf()

Re: how to set spark.executor.memory and heap size

2014-04-24 Thread wxhsdp
i think maybe it's a problem with reading a local file: val logFile = "/home/wxhsdp/spark/example/standalone/README.md" val logData = sc.textFile(logFile).cache() if i replace the above code with val logData = sc.parallelize(Array(1,2,3,4)).cache() the job can complete successfully. can't i read a

Re: how to set spark.executor.memory and heap size

2014-04-24 Thread Adnan Yaqoob
You need to use the proper URL format: file://home/wxhsdp/spark/example/standalone/README.md On Thu, Apr 24, 2014 at 1:29 PM, wxhsdp wxh...@gmail.com wrote: i think maybe it's a problem with reading a local file: val logFile = "/home/wxhsdp/spark/example/standalone/README.md" val logData =

Re: how to set spark.executor.memory and heap size

2014-04-24 Thread Adnan Yaqoob
Sorry wrong format: file:///home/wxhsdp/spark/example/standalone/README.md An extra / is needed at the start. On Thu, Apr 24, 2014 at 1:46 PM, Adnan Yaqoob nsyaq...@gmail.com wrote: You need to use proper url format: file://home/wxhsdp/spark/example/standalone/README.md On Thu, Apr 24,
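The reason the slash count matters: in a URI, the two slashes after the scheme introduce an authority (host) component, so with file://home/... the first path segment "home" is swallowed as the host rather than kept in the path. A stdlib-only sketch of the parsing:

```scala
import java.net.URI

// Two slashes: "home" is parsed as the authority (host), not the path.
val two = new URI("file://home/wxhsdp/spark/example/standalone/README.md")
println(two.getAuthority) // home
println(two.getPath)      // /wxhsdp/spark/example/standalone/README.md

// Three slashes: empty authority, so the absolute path survives intact.
val three = new URI("file:///home/wxhsdp/spark/example/standalone/README.md")
println(three.getAuthority) // null
println(three.getPath)      // /home/wxhsdp/spark/example/standalone/README.md
```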

Re: how to set spark.executor.memory and heap size

2014-04-24 Thread wxhsdp
thanks for your reply, adnan. i tried val logFile = "file:///home/wxhsdp/spark/example/standalone/README.md" i think three slashes are needed after file: it's just the same as val logFile = "home/wxhsdp/spark/example/standalone/README.md" the error remains :(

Re: how to set spark.executor.memory and heap size

2014-04-24 Thread Arpit Tak
Hi, You should be able to read it; file:// or file:/// is not even required for reading locally, just the path is enough.. what error message are you getting on spark-shell while reading for local? Also read the same from an hdfs file ... put your README file there and read, it works in both ways..

Re: how to set spark.executor.memory and heap size

2014-04-24 Thread wxhsdp
hi arpit, on spark shell, i can read the local file properly, but when i use sbt run, the error occurs. the sbt error message is at the beginning of the thread Arpit Tak wrote: Hi, You should be able to read it; file:// or file:/// is not even required for reading locally, just the path is enough..

Re: how to set spark.executor.memory and heap size

2014-04-24 Thread Arpit Tak
Ok fine, try like this, i tried and it works.. specify the spark path also in the constructor... and also export SPARK_JAVA_OPTS="-Xms300m -Xmx512m -XX:MaxPermSize=1g" import org.apache.spark.SparkContext import org.apache.spark.SparkContext._ object SimpleApp { def main(args:
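Assembled from the suggestions in this thread, a sketch of what the full SimpleApp might look like (the master URL, jar name, memory value, and file path are the ones quoted earlier in the thread, and this assumes a Spark 1.x dependency on the classpath, so treat it as illustrative, not canonical):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SimpleApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setMaster("spark://127.0.0.1:7077")   // standalone master from the thread
      .setAppName("Simple App")
      .set("spark.executor.memory", "1g")    // must fit within SPARK_WORKER_MEMORY
      // ship the application jar to the executors; without this,
      // distributed actions launched via `sbt run` can fail:
      .setJars(Seq("target/scala-2.10/simple-project_2.10-1.0.jar"))
    val sc = new SparkContext(conf)

    val logData = sc.textFile("file:///home/wxhsdp/spark/example/standalone/README.md").cache()
    println("lines: " + logData.count())
    sc.stop()
  }
}
```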

Re: how to set spark.executor.memory and heap size

2014-04-24 Thread wxhsdp
it seems that it's nothing about settings. i tried the take action and found it's ok, but the error occurs when i try count and collect: val a = sc.textFile("any file") a.take(n).foreach(println) // ok a.count() // failed a.collect() // failed val b = sc.parallelize(Array(1,2,3,4))

Re: how to set spark.executor.memory and heap size

2014-04-24 Thread wxhsdp
anyone knows the reason? i've googled a bit, and found some guys had the same problem, but with no replies... -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/how-to-set-spark-executor-memory-and-heap-size-tp4719p4796.html Sent from the Apache Spark User

Re: how to set spark.executor.memory and heap size

2014-04-24 Thread wxhsdp
i noticed that error occurs at org.apache.hadoop.io.WritableUtils.readCompressedStringArray(WritableUtils.java:183) at org.apache.hadoop.conf.Configuration.readFields(Configuration.java:2378) at

Re: how to set spark.executor.memory and heap size

2014-04-24 Thread YouPeng Yang
Hi I am also curious about this question. Isn't the textFile function supposed to read an hdfs file? In this case the file it reads is on the local filesystem. Is there any way to tell the local filesystem apart from hdfs in the textFile function? Besides, the OOM
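On the local-vs-HDFS question: textFile hands the string to the Hadoop FileSystem API, so the URI scheme selects the filesystem, and a bare path falls back to the configured default (fs.default.name in Hadoop 1.x, fs.defaultFS later), which is the local filesystem unless a Hadoop config says otherwise. A sketch, given an existing SparkContext sc (the namenode host and port are placeholders):

```scala
// Given an existing SparkContext `sc`:
sc.textFile("file:///home/wxhsdp/README.md")              // local filesystem, explicit
sc.textFile("hdfs://namenode:9000/user/wxhsdp/README.md") // HDFS, explicit
sc.textFile("/home/wxhsdp/README.md")                     // no scheme: resolved against the default fs
```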

how to set spark.executor.memory and heap size

2014-04-23 Thread wxhsdp
hi, i'm testing SimpleApp.scala in standalone mode with only one pc, so i have one master and one local worker on the same pc, with a rather small input file size (4.5K). i got the java.lang.OutOfMemoryError: Java heap space error. here are my settings: spark-env.sh: export

Re: how to set spark.executor.memory and heap size

2014-04-23 Thread wxhsdp
by the way, codes run ok in spark shell -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/how-to-set-spark-executor-memory-and-heap-size-tp4719p4720.html Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: how to set spark.executor.memory and heap size

2014-04-23 Thread Adnan Yaqoob
When I was testing spark, I faced this issue; it is not related to memory shortage, it is because your configuration is not correct. Try to pass your current jar to the SparkContext with SparkConf's setJars function and try again. On Thu, Apr 24, 2014 at 8:38 AM, wxhsdp