This looks like the Spark application is running into an abnormal state.
From the stack trace, it appears the driver could not send requests to the
AM (ApplicationMaster). Can you please check whether the AM is reachable,
and whether there are any other exceptions besides this one?
From my past tests, Spark's dynamic allocation may run into some corner
cases.
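For reference, dynamic allocation on YARN also depends on the external
shuffle service. Here is a minimal sketch of the settings I would expect
(property names are from the Spark docs; the executor counts are just
example values to adjust for your cluster):

  # spark-defaults.conf
  spark.dynamicAllocation.enabled        true
  spark.shuffle.service.enabled          true
  spark.dynamicAllocation.minExecutors   1
  spark.dynamicAllocation.maxExecutors   10

  # yarn-site.xml on every NodeManager (restart NodeManagers afterwards)
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle,spark_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
    <value>org.apache.spark.network.yarn.YarnShuffleService</value>
  </property>

The spark-<version>-yarn-shuffle.jar from the Spark 2.0 distribution also
needs to be on the NodeManager classpath; it is worth double-checking that
an old shuffle jar from an earlier Spark install is not still being loaded.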
Hello all,
I am running Hadoop 2.6.4 with Spark 2.0 and I have been trying to get
dynamic allocation to work, without success. I was able to get it to work
with Spark 1.6.1, however.
When I issue the command spark-shell --master yarn --deploy-mode client,
this is the error I see:
16/08/24 00:05:40