Re: Spark submit OutOfMemory Error in local mode

2017-08-29 Thread muthu
Are you getting the OutOfMemory on the driver or on the executor? A typical cause of OOM in Spark is too few tasks for a job: each task then has to hold a larger slice of the data in memory.
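A minimal sketch of that remedy (not from the thread; the input path and the count of 200 are placeholders, and a SparkContext sc is assumed, as in the spark-shell). Repartitioning spreads the same data across more, smaller tasks:

    val raw = sc.textFile("hdfs:///data/events")                // may arrive with only a few partitions
    val wide = raw.repartition(200)                             // 200 smaller tasks instead of a few big ones
    val counts = wide.map(w => (w, 1)).reduceByKey(_ + _, 200)  // explicit task count for the shuffle stage
    counts.count()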

Re: Spark submit OutOfMemory Error in local mode

2017-08-22 Thread Naga G
>> On 22.08.2017 at 20:16, shitijkuls <kulshreshth...@gmail.com> wrote: >> Any help here will be appreciated.

Re: Spark submit OutOfMemory Error in local mode

2017-08-22 Thread u...@moosheimer.com
With kind regards / best regards, Kay-Uwe Moosheimer > On 22.08.2017 at 20:16, shitijkuls <kulshreshth...@gmail.com> wrote: > Any help here will be appreciated.

Re: Spark submit OutOfMemory Error in local mode

2017-08-22 Thread shitijkuls
Any help here will be appreciated.

OutOfMemory error with Spark ML 1.5 logreg example

2015-09-07 Thread Zoltán Tóth
Hi, When I execute the Spark ML Logistic Regression example in pyspark I run into an OutOfMemory exception. I'm wondering if any of you experienced the same or have a hint about how to fix this. The interesting bit is that I only get the exception when I try to write the result DataFrame into a Parquet file.
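Not a confirmed fix for this thread, but the usual first thing to rule out with an OOM like this is plain heap size. A sketch (the 4g values and the script name logreg_example.py are placeholders):

    spark-submit --driver-memory 4g --executor-memory 4g logreg_example.py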

Re: OutOfMemory error with Spark ML 1.5 logreg example

2015-09-07 Thread Zoltán Tóth
Aaand, the error! :)

Exception in thread "org.apache.hadoop.hdfs.PeerCache@4e000abf" Exception: java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in thread "org.apache.hadoop.hdfs.PeerCache@4e000abf"
Exception in thread "Thread-7" Exception: java.lang.OutOfMemoryError thrown

Re: OutOfMemory error with Spark ML 1.5 logreg example

2015-09-07 Thread Zoltán Zvara
Hey, I'd try to debug and profile ResolvedDataSource. As far as I know, your write will be performed by the JVM. On Mon, Sep 7, 2015 at 4:11 PM Tóth Zoltán wrote: > Unfortunately I'm getting the same error: > The other interesting things are that: > - the parquet files got
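One general way to follow that advice and see what is actually filling the heap (standard JDK tooling, not something from the thread; <pid> stands for the driver or executor process id):

    jmap -histo:live <pid> | head -n 20   # top heap consumers by class
    jstack <pid>                          # thread dump, handy when thread creation itself fails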

Re: OutOfMemory error with Spark ML 1.5 logreg example

2015-09-07 Thread boci
Hi, Can you try using the save method instead of write? e.g. out_df.save("path", "parquet") b0c1
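For context, the two call styles on a Spark 1.3-1.5 DataFrame (Scala syntax with a placeholder df and path; the thread itself is PySpark, where both calls read the same):

    df.save("hdfs:///tmp/out", "parquet")  // older generic API suggested above, deprecated since 1.4
    df.write.parquet("hdfs:///tmp/out")    // the DataFrameWriter API the original post used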

Re: OutOfMemory error with Spark ML 1.5 logreg example

2015-09-07 Thread Tóth Zoltán
Unfortunately I'm getting the same error. The other interesting things are that:
- the parquet files actually got written to HDFS (also with .write.parquet())
- the application gets stuck in the RUNNING state for good even after the error is thrown
15/09/07 10:01:10 INFO spark.ContextCleaner:

Re: OutOfMemory error with Spark ML 1.5 logreg example

2015-09-07 Thread Zsolt Tóth
Hi, I ran your example on Spark 1.4.1 and 1.5.0-rc3. It succeeds on 1.4.1 but throws the OOM on 1.5.0. Do any of you know which PR introduced this issue? Zsolt

Re: Spark Streaming updateStateByKey throws OutOfMemory Error

2015-04-23 Thread Sourav Chandra
updateStateByKey based on the received message type and finally stores into Redis. After running for a few seconds the executor process gets killed with an OutOfMemory error. The code snippet is below:
NoOfReceiverInstances = 1
val kafkaStreams = (1 to NoOfReceiverInstances).map

Re: Spark Streaming updateStateByKey throws OutOfMemory Error

2015-04-23 Thread Sourav Chandra
on the received message type and finally stores into Redis. After running for a few seconds the executor process gets killed with an OutOfMemory error. The code snippet is below:
NoOfReceiverInstances = 1
val kafkaStreams = (1 to NoOfReceiverInstances).map

Re: Spark Streaming updateStateByKey throws OutOfMemory Error

2015-04-22 Thread Tathagata Das
the executor process gets killed with an OutOfMemory error. The code snippet is below:
NoOfReceiverInstances = 1
val kafkaStreams = (1 to NoOfReceiverInstances).map(
  _ => KafkaUtils.createStream(ssc, ZKQuorum, ConsumerGroup, TopicsMap)
)
val updateFunc = (values: Seq

Re: Spark Streaming updateStateByKey throws OutOfMemory Error

2015-04-22 Thread Sourav Chandra
and finally stores into Redis. After running for a few seconds the executor process gets killed with an OutOfMemory error. The code snippet is below:
NoOfReceiverInstances = 1
val kafkaStreams = (1 to NoOfReceiverInstances).map(
  _ => KafkaUtils.createStream(ssc, ZKQuorum, ConsumerGroup

Spark Streaming updateStateByKey throws OutOfMemory Error

2015-04-21 Thread Sourav Chandra
Hi, We are building a Spark Streaming application which reads from Kafka, does updateStateByKey based on the received message type, and finally stores into Redis. After running for a few seconds the executor process gets killed with an OutOfMemory error. The code snippet is below
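Not this thread's resolution, but a common memory trap with updateStateByKey is state that only ever grows. A minimal sketch of an update function that evicts idle keys by returning None (names and the socket source are illustrative; the real app reads from Kafka):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val ssc = new StreamingContext(new SparkConf().setAppName("state-demo"), Seconds(5))
    ssc.checkpoint("hdfs:///tmp/checkpoints")     // updateStateByKey requires a checkpoint directory

    // Illustrative eviction policy: drop a key as soon as a batch brings it no events.
    // Production code would more likely keep a timestamp in the state and expire on a timeout.
    val updateFunc = (values: Seq[Long], state: Option[Long]) =>
      if (values.isEmpty) None                    // returning None removes the key from the state store
      else Some(state.getOrElse(0L) + values.sum)

    val pairs = ssc.socketTextStream("localhost", 9999).map(line => (line, 1L))
    pairs.updateStateByKey(updateFunc).print()
    ssc.start()
    ssc.awaitTermination()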

Re: Spark Streaming updateStateByKey throws OutOfMemory Error

2015-04-21 Thread Olivier Girardot
and finally stores into Redis. After running for a few seconds the executor process gets killed with an OutOfMemory error. The code snippet is below:
NoOfReceiverInstances = 1
val kafkaStreams = (1 to NoOfReceiverInstances).map(
  _ => KafkaUtils.createStream(ssc, ZKQuorum

OutOfMemory error in Spark Core

2015-01-15 Thread Anand Mohan
the app crashes on a lost executor which itself failed due to an OutOfMemory error, as below. This looks almost identical to https://issues.apache.org/jira/browse/SPARK-4885, even though we are seeing this error in Spark 1.1.
2015-01-15 20:12:51,653 [handle-message-executor-13] ERROR

Re: OutOfMemory error in Spark Core

2015-01-15 Thread Akhil Das
(including partitioning in reduceByKey) and 4. joining a couple of MySQL tables using JdbcRdd. Of late, we are seeing major instabilities where the app crashes on a lost executor which itself failed due to an OutOfMemory error as below. This looks almost identical to https://issues.apache.org

Re: OutOfMemory Error

2014-08-20 Thread MEETHU MATHEW
machine learning algorithms on Spark. I am working on a 3-node cluster, with each node having 5GB of memory. Whenever I work with a slightly larger number of records, I end up with an OutOfMemory Error. The problem is that even if the number of records is only slightly higher, the intermediate result from a transformation

RE: OutOfMemory Error

2014-08-20 Thread Shao, Saisai
/configuration.html Thanks, Jerry
From: MEETHU MATHEW [mailto:meethu2...@yahoo.co.in] Sent: Wednesday, August 20, 2014 4:48 PM To: Akhil Das; Ghousia Cc: user@spark.apache.org Subject: Re: OutOfMemory Error
Hi, How do I increase the heap size? What is the difference between Spark executor memory and heap
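For reference, and as general Spark behavior rather than a quote from this thread: spark.executor.memory is precisely what sizes each executor JVM's heap (its -Xmx), so raising one is raising the other. A minimal sketch (the 4g value is a placeholder):

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("heap-demo")
      .set("spark.executor.memory", "4g")  // becomes each executor JVM's maximum heap
    val sc = new SparkContext(conf)
    // The driver's heap must be fixed before its JVM starts, so set it at submit
    // time instead: spark-submit --driver-memory 2g ...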

Re: OutOfMemory Error

2014-08-19 Thread Ghousia
to a new huge value, resulting in an OutOfMemory Error. On Mon, Aug 18, 2014 at 12:34 PM, Akhil Das <ak...@sigmoidanalytics.com> wrote: I believe spark.shuffle.memoryFraction is the one you are looking for. spark.shuffle.memoryFraction: fraction of Java heap to use for aggregation and cogroups
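A sketch of raising that setting at submit time (0.4 is a placeholder and my_app.jar a hypothetical artifact; the Spark 1.x default was 0.2, and the setting was later subsumed by unified memory management):

    spark-submit --conf spark.shuffle.memoryFraction=0.4 my_app.jar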

Re: OutOfMemory Error

2014-08-18 Thread Akhil Das
node cluster, with each node having 5GB of memory. Whenever I work with a slightly larger number of records, I end up with an OutOfMemory Error. The problem is that even if the number of records is only slightly higher, the intermediate result from a transformation is huge and this results in an OutOfMemory Error

Re: OutOfMemory Error

2014-08-18 Thread Ghousia
to implement machine learning algorithms on Spark. I am working on a 3-node cluster, with each node having 5GB of memory. Whenever I work with a slightly larger number of records, I end up with an OutOfMemory Error. The problem is that even if the number of records is only slightly higher, the intermediate result

Re: OutOfMemory Error

2014-08-18 Thread Akhil Das
trying to implement machine learning algorithms on Spark. I am working on a 3-node cluster, with each node having 5GB of memory. Whenever I work with a slightly larger number of records, I end up with an OutOfMemory Error. The problem is that even if the number of records is only slightly higher, the intermediate

OutOfMemory Error

2014-08-17 Thread Ghousia Taj
Hi, I am trying to implement machine learning algorithms on Spark. I am working on a 3-node cluster, with each node having 5GB of memory. Whenever I work with a slightly larger number of records, I end up with an OutOfMemory Error. The problem is that even if the number of records is only slightly higher
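Not from this thread's replies, but the standard way to keep a huge intermediate result from living purely in memory is to persist it with a disk-spilling storage level. A sketch (the path and the split-and-parse step stand in for the real transformation; a SparkContext sc is assumed, as in the spark-shell):

    import org.apache.spark.storage.StorageLevel

    val records = sc.textFile("hdfs:///data/records")
    val intermediate = records
      .map(line => line.split(",").map(_.toDouble))    // stand-in for the real feature transform
      .persist(StorageLevel.MEMORY_AND_DISK_SER)       // blocks spill to disk instead of OOM-ing
    println(intermediate.count())                      // first action materializes (and may spill)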