Hi

My Spark application crashed and showed the following information:

LogType:stdout
Log Upload Time:Wed Jun 29 14:38:03 -0700 2016
LogLength:1096
Log Contents:
#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGILL (0x4) at pc=0x00007f67baa0d221, pid=12207, tid=140083473176320
#
# JRE version: Java(TM) SE Runtime Environment (7.0_67-b01) (build 1.7.0_67-b01)
# Java VM: Java HotSpot(TM) 64-Bit Server VM (24.65-b04 mixed mode linux-amd64 compressed oops)
# Problematic frame:
# C  [libcaffe.so.1.0.0-rc3+0x786221]  sgemm_kernel+0x21
#
# Failed to write core dump. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
#
# An error report file with more information is saved as:
# /yarn/nm/usercache/ubuntu/appcache/application_1467236060045_0001/container_1467236060045_0001_01_000003/hs_err_pid12207.log

However, I am not able to find
"/yarn/nm/usercache/ubuntu/appcache/application_1467236060045_0001/container_1467236060045_0001_01_000003/hs_err_pid12207.log"
afterwards; the file is deleted automatically after the Spark application finishes.
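
As far as I can tell, this happens because YARN's NodeManager deletes the container's working directory (the appcache path above) once the application completes, and the hs_err file lives inside it. One workaround I have read about but not yet tried is to delay that cleanup on each NodeManager through yarn-site.xml, so the file can be copied out before deletion (I believe the NodeManagers must be restarted for this to take effect):

    <property>
      <!-- Keep finished containers' directories around for 10 minutes
           so files like hs_err_pid*.log can be copied out first. -->
      <name>yarn.nodemanager.delete.debug-delay-sec</name>
      <value>600</value>
    </property>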


Apart from that, is there a recommended way to retain the report file? I am running Spark on YARN.
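
One other idea I am considering, also untested, is to have the JVM write the error report outside the container directory in the first place, using the HotSpot -XX:ErrorFile option passed through spark.executor.extraJavaOptions. The /var/log/spark-crash path below is only a placeholder; it would have to exist and be writable by the YARN user on every worker node:

    # placeholder path: must exist and be writable on every NodeManager host
    spark-submit \
      --master yarn \
      --conf "spark.executor.extraJavaOptions=-XX:ErrorFile=/var/log/spark-crash/hs_err_pid%p.log" \
      ...

HotSpot substitutes the crashing process's pid for %p, which would match the hs_err_pid12207.log naming in the log above.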

Regards
Prateek


