Hi Ted,
Any thoughts on this?
I am getting the same kind of error when I kill a worker on one of the
machines.
Even after killing the worker with kill -9, the executor still shows up in
the Spark UI with a negative active-task count.
All the tasks on that worker start to fail with the exception below.
This is what I am getting in the executor logs:
16/03/29 10:49:00 ERROR DiskBlockObjectWriter: Uncaught exception while
reverting partial writes to file
Can you show the stack trace?
The log message came from
DiskBlockObjectWriter#revertPartialWritesAndClose().
Unfortunately, the method doesn't throw an exception, making it a bit hard
for the caller to know about the disk-full condition.
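Since revertPartialWritesAndClose() only logs the IOException instead of
propagating it, one indirect workaround is to check the free space on the
executor's local directory yourself. A minimal Java sketch (class and method
names are hypothetical, not part of Spark; a real check would target the
directories configured via spark.local.dir):

```java
import java.io.File;

// Hypothetical helper: because the writer swallows the IOException,
// a caller can only detect "disk nearly full" indirectly, e.g. by
// checking usable space on the local/scratch directory before writing.
public class DiskSpaceCheck {

    // Returns true if the directory has at least minFreeBytes available.
    static boolean hasFreeSpace(File dir, long minFreeBytes) {
        // File.getUsableSpace() reports bytes available to this JVM.
        return dir.getUsableSpace() >= minFreeBytes;
    }

    public static void main(String[] args) {
        // Stand-in for a Spark local dir; 1 byte is a placeholder threshold.
        File tmp = new File(System.getProperty("java.io.tmpdir"));
        System.out.println(hasFreeSpace(tmp, 1L) ? "space ok" : "disk full");
    }
}
```

This only mitigates the symptom; the underlying issue is that the caller of
revertPartialWritesAndClose() never sees the failure.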
On Thu, Mar 31, 2016 at 11:32 AM, Abhishek Anand wrote:
Hi,
Why is it that when the disk is full on one of my workers, the executor on
that worker becomes unresponsive and the jobs on that worker fail with the
following exception:
16/03/29 10:49:00 ERROR DiskBlockObjectWriter: Uncaught exception while
reverting partial writes to file