Dear fellow Spark users,

I have a multithreaded Java program that launches multiple Spark jobs in parallel through the *SparkLauncher* API. It also monitors these jobs and keeps an audit table updated with information such as job start/end time, current state, tracking URL, etc. To get this information I use the *YarnClient.getApplicationReport(ApplicationId)* API, where the ApplicationId is retrieved through *SparkAppHandle.getAppId()*. However, I have noticed that sometimes my program fails with:
*org.apache.hadoop.yarn.exceptions.ApplicationNotFoundException: Application with id 'application_1490281320520_107048' doesn't exist in RM.*

This is an intermittent problem, and most of the time the program runs successfully. When it does fail, though, it fails two or three times in a row; after repeated re-runs it returns to normal. Has anyone else faced this issue? Any help is greatly appreciated. Thank you so much for your valuable time!

Tariq, Mohammad
about.me/mti
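In case it helps frame the discussion: since the failure is transient (the report lookup succeeds on later runs), one workaround is to treat *ApplicationNotFoundException* as retryable and wrap the *getApplicationReport* call in a small retry helper. The sketch below is a generic, self-contained illustration of that idea; *retryWithBackoff* is a hypothetical utility of mine, not part of the SparkLauncher or YarnClient APIs, and the simulated lookup in *main* stands in for the real YARN call.

```java
import java.util.concurrent.Callable;

public class RetryExample {

    // Hypothetical helper: re-invokes the task up to maxAttempts times,
    // sleeping a little longer before each retry (linear backoff).
    // In the real program the task would call
    // yarnClient.getApplicationReport(appId) and the catch clause would
    // match ApplicationNotFoundException specifically.
    static <T> T retryWithBackoff(Callable<T> task, int maxAttempts, long sleepMillis)
            throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return task.call();
            } catch (Exception e) {
                last = e;
                if (attempt < maxAttempts) {
                    Thread.sleep(sleepMillis * attempt);
                }
            }
        }
        throw last; // all attempts exhausted
    }

    public static void main(String[] args) throws Exception {
        // Simulate a lookup that fails twice before succeeding, like a
        // report that is briefly not visible in the ResourceManager.
        final int[] calls = {0};
        String state = retryWithBackoff(() -> {
            if (++calls[0] < 3) {
                throw new IllegalStateException("application not found in RM yet");
            }
            return "RUNNING";
        }, 5, 10L);
        System.out.println(state + " after " + calls[0] + " attempts");
    }
}
```

This only papers over the symptom, of course; if the report genuinely disappears from the RM the retry will exhaust its attempts and rethrow the last exception, so the audit update can still be flagged as failed.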