Re: memory issue on standalone master

2014-08-07 Thread maddenpj
It looks like your Java heap space is too low: -Xmx512m. It's only getting 0.5 GB
of RAM; try bumping that up.
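
For example (just a sketch, assuming Spark 1.0+ where bin/pyspark passes these
flags through to spark-submit, and using the master host/port from your log):

    ./bin/pyspark --master spark://ip-10-123-146-183:7077 \
      --driver-memory 4g \
      --executor-memory 4g

The same can be set via spark.driver.memory / spark.executor.memory in
conf/spark-defaults.conf; 4g is only an example, size it to your data.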






Re: memory issue on standalone master

2014-08-07 Thread Baoqiang Cao
My problem was that I didn’t know how to raise that setting. For what it’s worth, it was
solved by editing spark-env.sh.
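
In case it helps someone else, a minimal sketch of that kind of edit (the
variable names are the standalone-mode ones from spark-env.sh.template; the
sizes are just examples for a 60G machine):

    # conf/spark-env.sh
    export SPARK_DAEMON_MEMORY=2g    # heap for the standalone master/worker daemons (default 512m)
    export SPARK_WORKER_MEMORY=48g   # total memory a worker may hand out to executors

followed by ./sbin/stop-master.sh and ./sbin/start-master.sh so the new
settings are picked up.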

Thanks anyway!

Baoqiang Cao
Blog: http://baoqiang.org
Email: bqcaom...@gmail.com




On Aug 7, 2014, at 3:27 PM, maddenpj madde...@gmail.com wrote:

 It looks like your Java heap space is too low: -Xmx512m. It's only getting 0.5 GB
 of RAM; try bumping that up.
 
 
 



memory issue on standalone master

2014-08-06 Thread BQ
Hi There,

I'm just starting to use Spark and have hit a rookie problem. I'm running in standalone
mode with the master only, and here is what I did:

./sbin/start-master.sh 

./bin/pyspark

When I ran the wordcount.py example on a fairly large input file, I got the
out-of-memory error excerpted below. The machine has 60 GB of RAM, yet in my log
file I found this:

Spark Command: java -cp
::/home/ubuntu/spark/conf:/home/ubuntu/spark/assembly/target/scala-2.10/spark-assembly-1.1.0-SNAPSHOT-hadoop1.0.4.jar
-XX:MaxPermSize=128m -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m
org.apache.spark.deploy.master.Master --ip ip-10-123-146-183 --port 7077
--webui-port 8080

Any help please?

..
14/08/07 02:47:06 INFO PythonRDD: Times: total = 7008, boot = 10, init = 106, finish = 6892
14/08/07 02:47:06 ERROR Executor: Exception in task 2.0 in stage 0.0 (TID 122)
java.lang.OutOfMemoryError: Java heap space
    at com.esotericsoftware.kryo.io.Output.require(Output.java:142)
    at com.esotericsoftware.kryo.io.Output.writeBytes(Output.java:220)
    at com.esotericsoftware.kryo.io.Output.writeBytes(Output.java:206)
    at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ByteArraySerializer.write(DefaultArraySerializers.java:29)
    at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ByteArraySerializer.write(DefaultArraySerializers.java:18)
    at com.esotericsoftware.kryo.Kryo.writeObjectOrNull(Kryo.java:549)
    at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.write(DefaultArraySerializers.java:312)
    at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.write(DefaultArraySerializers.java:293)
    at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:568)
    at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:148)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:209)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:744)
14/08/07 02:47:06 ERROR Executor: Exception in task 7.0 in stage 0.0 (TID 127)
java.lang.OutOfMemoryError: Java heap space
    at com.esotericsoftware.kryo.io.Output.require(Output.java:142)
    at com.esotericsoftware.kryo.io.Output.writeBytes(Output.java:220)
    at com.esotericsoftware.kryo.io.Output.writeBytes(Output.java:206)
    at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ByteArraySerializer.write(DefaultArraySerializers.java:29)
    at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ByteArraySerializer.write(DefaultArraySerializers.java:18)
    at com.esotericsoftware.kryo.Kryo.writeObjectOrNull(Kryo.java:549)
    at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.write(DefaultArraySerializers.java:312)
    at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.write(DefaultArraySerializers.java:293)
    at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:568)
    at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:148)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:209)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:744)




