Hi everybody,
      I am trying to run a PySpark job. After it has been running for many 
days, I see the following failures:
  ResultStage 46047 has failed the maximum allowable number of times: 4. Most 
recent failure reason: org.apache.spark.shuffle.FetchFailedException: Failure 
while fetching StreamChunkId{streamId=1657105713045, chunkIndex=0}: 
 java.lang.RuntimeException: Executor is not registered 
(appId=application_1637073699733_1081, execId=2825)
I googled it but did not find any helpful information.
Do you have any suggestions?
Best regards.