Re: Out of memory issue

2020-11-20 Thread Russell Spitzer
Well, if the system doesn't change, then the data must be different. The exact exception probably won't be helpful, since it only tells us the last allocation that failed. My guess is that your ingestion changed and there is either slightly more data now than previously, or it's skewed differently.
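As a rough sketch of how to test the skew hypothesis (the DataFrame name and input path are placeholders, not from the thread), counting records per partition will show whether a few partitions carry most of the data:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("skew-check").getOrCreate()

    // Placeholder input; point this at the batch that triggers the OOM.
    val df = spark.read.parquet("/path/to/input")

    // Count rows per partition: a handful of very large partitions is the
    // classic signature of skew that can blow up a single executor's heap.
    val perPartition = df.rdd
      .mapPartitionsWithIndex { case (idx, rows) => Iterator((idx, rows.size)) }
      .collect()

    perPartition.sortBy(-_._2).take(10).foreach { case (idx, n) =>
      println(s"partition $idx -> $n rows")
    }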

Re: Out of memory issue

2020-11-20 Thread Amit Sharma
Please help. Thanks, Amit. On Mon, Nov 9, 2020 at 4:18 PM Amit Sharma wrote: > Please find below the exact exception: > > Exception in thread "streaming-job-executor-3" java.lang.OutOfMemoryError: > Java heap space > at java.util.Arrays.copyOf(Arrays.java:3332) > at >

Re: Out of memory issue

2020-11-09 Thread Amit Sharma
Please find below the exact exception:
Exception in thread "streaming-job-executor-3" java.lang.OutOfMemoryError: Java heap space
    at java.util.Arrays.copyOf(Arrays.java:3332)
    at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:124)
    at

Re: Out of memory issue

2020-11-09 Thread Amit Sharma
Can you please help? Thanks, Amit. On Sun, Nov 8, 2020 at 1:35 PM Amit Sharma wrote: > Hi, I am using a 16-node Spark cluster with the below config: > 1. Executor memory 8 GB > 2. 5 cores per executor > 3. Driver memory 12 GB. > > We have a streaming job. We do not normally see a problem, but sometimes we get

Out of memory issue

2020-11-08 Thread Amit Sharma
Hi, I am using a 16-node Spark cluster with the below config:
1. Executor memory 8 GB
2. 5 cores per executor
3. Driver memory 12 GB
We have a streaming job. We do not normally see a problem, but sometimes we get a heap memory (OOM) exception on executor-1. I do not understand: if the data size is the same and this job
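For reference, a minimal sketch of how the configuration described above maps onto Spark settings (the numbers are the ones from the thread, not recommendations); note that driver memory has to be passed to spark-submit, since it cannot be changed after the driver JVM has started:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("streaming-job")
      .config("spark.executor.memory", "8g")  // 1. executor memory 8 GB
      .config("spark.executor.cores", "5")    // 2. 5 cores per executor
      // 3. driver memory 12 GB: supply via `spark-submit --driver-memory 12g`
      //    (or spark-defaults.conf); it has no effect once the driver is running.
      .getOrCreate()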

Re: Kafka spark structure streaming out of memory issue

2020-08-13 Thread Srinivas V
It depends on how much memory is available and how much data you are processing. Please provide the data size and cluster details to help. On Fri, Aug 14, 2020 at 12:54 AM km.santanu wrote: > Hi, I am using Kafka stateless structured streaming. I have enabled a watermark of > 1 hour. After long

Kafka spark structure streaming out of memory issue

2020-08-13 Thread km.santanu
Hi, I am using Kafka stateless structured streaming. I have enabled a watermark of 1 hour. After running for about 2 hours, my job terminates automatically. Checkpointing has been enabled. I am computing an average over the input data. Can you please suggest how to avoid the out-of-memory error?
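The post doesn't include the job itself; below is a rough sketch of the pattern it describes (Kafka source, 1-hour watermark, averaging, checkpointing), with the broker, topic, field names, and paths invented for illustration. The watermark is what bounds how long windowed-aggregation state is kept:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    val spark = SparkSession.builder().appName("kafka-avg").getOrCreate()

    // Hypothetical topic and payload: a numeric measure plus the Kafka timestamp.
    val input = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "events")
      .load()
      .select(col("timestamp"), col("value").cast("string").cast("double").as("measure"))

    // The 1-hour watermark lets Spark discard per-window state older than the watermark.
    val averages = input
      .withWatermark("timestamp", "1 hour")
      .groupBy(window(col("timestamp"), "10 minutes"))
      .agg(avg("measure").as("avg_measure"))

    averages.writeStream
      .outputMode("append")  // append requires the watermark on a windowed aggregation
      .format("parquet")
      .option("path", "/tmp/averages")
      .option("checkpointLocation", "/tmp/checkpoint")
      .start()
      .awaitTermination()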

RE: Out Of Memory issue

2016-10-31 Thread Kürşat Kurt
Any idea about this? From: Kürşat Kurt [mailto:kur...@kursatkurt.com] Sent: Sunday, October 30, 2016 7:59 AM To: 'Jörn Franke' <jornfra...@gmail.com> Cc: 'user@spark.apache.org' <user@spark.apache.org> Subject: RE: Out Of Memory issue Hi Jörn; I am reading 300.000 l

Out Of Memory issue

2016-10-29 Thread Kürşat Kurt
Hi; While training a NaiveBayes classifier, I am getting an OOM. What is wrong with these parameters? Here is the spark-submit command: ./spark-submit --class main.scala.Test1 --master local[*] --driver-memory 60g /home/user1/project_2.11-1.0.jar PS: The OS is Ubuntu 14.04 and the system has
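The training code itself isn't shown in the thread; here is a minimal sketch of NaiveBayes training with spark.ml, under the assumption that the input can be loaded as labeled feature vectors (the libsvm path below is hypothetical):

    import org.apache.spark.ml.classification.NaiveBayes
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("Test1")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical input: label + sparse features in LibSVM format.
    val data = spark.read.format("libsvm").load("/home/user1/training.libsvm")

    val Array(train, test) = data.randomSplit(Array(0.8, 0.2), seed = 42L)

    val model = new NaiveBayes()
      .setSmoothing(1.0)
      .fit(train)

    val predictions = model.transform(test)
    val accuracy = predictions.filter($"label" === $"prediction").count().toDouble / test.count()
    println(s"accuracy = $accuracy")

With local[*], the driver and executor threads share the single 60 GB heap, which is one reason feature dimensionality can matter as much as row count here.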

RE: Out of memory issue

2016-01-06 Thread Ewan Leith
-Original Message- From: babloo80 [mailto:bablo...@gmail.com] Sent: 06 January 2016 03:44 To: user@spark.apache.org Subject: Out of memory issue Hello there, I have a Spark job that reads 7 parquet files (8 GB, 3 x 16 GB, 3 x 14 GB) in different stages of execution and creates a result parquet file of 9 GB

Re: Out of memory issue

2016-01-06 Thread Muthu Jayakumar
core-site.xml). > > It's definitely worth setting spark.memory.fraction and > parquet.memory.pool.ratio and trying again. > > Ewan > > -Original Message- > From: babloo80 [mailto:bablo...@gmail.com] > Sent: 06 January 2016 03:44 > To: user@spark.apache.org > Subject: Out o
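A minimal sketch of the two settings suggested above (the values are placeholders for tuning, not recommendations from the thread): spark.memory.fraction is a Spark conf, while parquet.memory.pool.ratio is read from the Hadoop configuration (e.g. core-site.xml) and can also be set programmatically:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("parquet-job")
      // Fraction of the heap (minus a 300 MB reserve) given to Spark's unified
      // execution/storage memory; lowering it leaves more headroom for other
      // allocations such as Parquet buffers.
      .config("spark.memory.fraction", "0.4")
      .getOrCreate()

    // Parquet's memory manager reads this ratio from the Hadoop configuration.
    spark.sparkContext.hadoopConfiguration.setDouble("parquet.memory.pool.ratio", 0.3)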

Out of memory issue

2016-01-05 Thread babloo80
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Out-of-memory-issue-tp25888.html