Hello,
My application is launched by SparkLauncher, and I am trying to collect all logs
(from the launcher, the application, and Spark core(?)) into one log file.
I can see all of them on the console (using the Eclipse IDE) but can't get them
written to the log file. With the (simplified) code below,
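One way to get everything into a single file is to treat the launched Spark job as an ordinary child process and pump its output into the same file the launcher logs to. The sketch below is pure JDK (no Spark classes), and the class and method names are my own for illustration; newer SparkLauncher releases also expose redirect methods (e.g. redirectOutput/redirectError), so check the API of the version you are on.

```java
import java.io.*;
import java.nio.file.*;

public class LaunchLog {
    // Merge a child process's stdout and stderr into one log file,
    // alongside the parent's own log lines.
    public static void runAndLog(String[] cmd, Path logFile) throws Exception {
        try (BufferedWriter log = Files.newBufferedWriter(logFile,
                StandardOpenOption.CREATE, StandardOpenOption.APPEND)) {
            log.write("[launcher] starting: " + String.join(" ", cmd));
            log.newLine();
            Process p = new ProcessBuilder(cmd)
                    .redirectErrorStream(true)   // fold stderr into stdout
                    .start();
            try (BufferedReader out = new BufferedReader(
                    new InputStreamReader(p.getInputStream()))) {
                String line;
                while ((line = out.readLine()) != null) {
                    log.write("[child] " + line);   // tag child output
                    log.newLine();
                }
            }
            log.write("[launcher] exit code: " + p.waitFor());
            log.newLine();
        }
    }

    public static void main(String[] args) throws Exception {
        Path f = Files.createTempFile("launcher", ".log");
        // Any child process works the same way a SparkLauncher-started
        // driver would; "java -version" is just a stand-in here.
        runAndLog(new String[]{"java", "-version"}, f);
        System.out.println(new String(Files.readAllBytes(f)));
    }
}
```

With SparkLauncher specifically, launch() hands you the same java.lang.Process, so the reading loop above applies unchanged.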
Hi Nimmi,
Can you send us the Spark parameters, including the memory overhead? I'm
assuming you are running on YARN.
Example:
[4] - 864GB
--num-executors 32
--executor-memory 21G
--executor-cores 4
--conf spark.yarn.executor.memoryOverhead=3000
The parameter spark.yarn.executor.memoryOverhead is explained as
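For reference, the example settings above translate into roughly the following cluster-wide memory ask (assuming the overhead value is in MB, which is how spark.yarn.executor.memoryOverhead is interpreted; the class and method names below are just for illustration):

```java
public class ExecutorSizing {
    // Memory YARN must grant each executor container:
    // JVM heap (--executor-memory) plus the off-heap overhead (in MB).
    static double perContainerGB(double heapGB, int overheadMB) {
        return heapGB + overheadMB / 1024.0;
    }

    public static void main(String[] args) {
        int numExecutors = 32;      // --num-executors 32
        double heapGB = 21.0;       // --executor-memory 21G
        int overheadMB = 3000;      // spark.yarn.executor.memoryOverhead=3000

        double perContainer = perContainerGB(heapGB, overheadMB); // ~23.9 GB
        double total = numExecutors * perContainer;               // ~765.75 GB

        System.out.println("per container GB: " + perContainer);
        System.out.println("total GB: " + total);
    }
}
```

So 32 executors at ~23.9 GB each come to roughly 766 GB, which would fit within an 864 GB budget with headroom to spare.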
I get the following error on the executors while running my Spark job. I am
reading data from a database; the data contains UTF-8 strings.
Iterator t.next().getString(row.fieldIndex("short_name"));
ERROR org.apache.spark.util.SparkUncaughtExceptionHandler - Uncaught
exception in thread Thread[Executor
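The stack trace is cut off above, but since the data is UTF-8 text from a database, one thing worth ruling out is malformed UTF-8 bytes blowing up inside the executor. A minimal sketch (class and method names are mine, not Spark's) that decodes strictly and flags bad values instead of crashing:

```java
import java.nio.ByteBuffer;
import java.nio.charset.*;

public class Utf8Check {
    // Decode strictly: returns the string, or null if the bytes are not
    // valid UTF-8 (the default decoder would silently replace bad bytes).
    static String decodeUtf8OrNull(byte[] bytes) {
        CharsetDecoder dec = StandardCharsets.UTF_8.newDecoder()
                .onMalformedInput(CodingErrorAction.REPORT)
                .onUnmappableCharacter(CodingErrorAction.REPORT);
        try {
            return dec.decode(ByteBuffer.wrap(bytes)).toString();
        } catch (CharacterCodingException e) {
            // Invalid UTF-8: log and skip/repair the row rather than
            // letting the exception kill the executor thread.
            return null;
        }
    }

    public static void main(String[] args) {
        byte[] good = "short_name".getBytes(StandardCharsets.UTF_8);
        byte[] bad = {(byte) 0xC3, (byte) 0x28}; // truncated 2-byte sequence
        System.out.println(decodeUtf8OrNull(good)); // short_name
        System.out.println(decodeUtf8OrNull(bad));  // null
    }
}
```

Running a check like this over the raw bytes (or over a sample of the source table) would confirm or eliminate bad encoding as the cause before digging into the executor logs.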
Hi, we have a Spark Streaming job to which we send requests through our UI
using Kafka. It processes them and returns a response. We are getting the
error below, and the streaming job is no longer processing any requests.
Listener StreamingJobProgressListener threw an exception
java.util.NoSuchElementException: key
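A "java.util.NoSuchElementException: key ..." from StreamingJobProgressListener typically means the listener looked up a batch key that had already been dropped from its internal map; if this matches a known Spark issue for your version, an upgrade is usually the real fix. Purely for illustration (the class and helper names are hypothetical), here is the failure mode and a defensive alternative in plain Java:

```java
import java.util.*;

public class BatchLookup {
    // Mimics a strict map lookup (like Scala's Map.apply, which is what
    // throws inside the listener): fails when the key is absent.
    static long requireBatch(Map<String, Long> batches, String key) {
        Long v = batches.get(key);
        if (v == null) {
            throw new NoSuchElementException("key not found: " + key);
        }
        return v;
    }

    // Defensive variant: the caller decides what a missing batch means.
    static OptionalLong findBatch(Map<String, Long> batches, String key) {
        Long v = batches.get(key);
        return v == null ? OptionalLong.empty() : OptionalLong.of(v);
    }

    public static void main(String[] args) {
        Map<String, Long> batches = new HashMap<>();
        batches.put("batch-1", 42L);
        System.out.println(findBatch(batches, "batch-1")); // present
        System.out.println(findBatch(batches, "batch-2")); // empty, no throw
    }
}
```

Since the exception is inside Spark's own listener code, this pattern only explains the symptom; sharing the full stack trace and your Spark version would help pin down the actual bug.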