Hi, I am using Spark 1.0.2 on a Mesos cluster. After I run my job, when I try to look at the detailed application stats via the history server (port 18080), the stats don't show up for some of the jobs, even though the job completed successfully and the event logs were written to the log folder. The log from the history server run is attached below; it looks like the server hits a parsing error when reading the EVENT_LOG file (which I have not modified). Notably, the line that says "Malformed line" appears to have the beginning of the first classpath entry cut off (instead of "...amd64", it starts at "d64"). Does the history server have any string buffer limitation that could cause this? I also want to point out that the problem does not happen every time: on some runs the app details do show up, but it is quite unpredictable.
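For what it's worth, the replay seems to parse the event log one line at a time as JSON, so a line whose beginning is missing would fail in exactly this way. Below is a minimal illustration of the failure mode, using Python's json module rather than Spark's json4s/Jackson stack; the fragment is a made-up stand-in for the truncated classpath line, not the real log content:

```python
import json

# Hypothetical leftover fragment, standing in for the truncated event-log
# line: the leading '{"Event":...' part is gone, so what remains starts
# mid-string and is not a valid JSON token.
truncated_line = 'd64/jre/lib/jsse.jar"}'

def try_parse(line):
    """Parse one event-log line as JSON; return None if it is malformed."""
    try:
        return json.loads(line)
    except json.JSONDecodeError as e:
        # Mirrors the history server's behavior: log the bad line, skip it.
        print(f"Malformed line (column {e.colno}): {line[:40]}")
        return None

# An intact line parses fine; the truncated one fails.
try_parse('{"Event":"SparkListenerLogStart"}')
try_parse(truncated_line)
```

This is only an analogy for what the "Unrecognized token 'd64'" message suggests is happening, not the actual Spark code path.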
The same job, when I ran it with Spark 1.0.1 in standalone mode (i.e. without a history server), showed up on the application details page. I am not sure whether this is a problem with the history server itself or specific to version 1.0.2. Is there a way to fix this, as I would like to use the application details? Thanks.

14/09/11 20:50:55 ERROR ReplayListenerBus: Exception in parsing Spark event log file:/mapr/applogs_spark_mesos/spark_test-1410468489529/EVENT_LOG_1
com.fasterxml.jackson.core.JsonParseException: Unrecognized token 'd64': was expecting
 at [Source: java.io.StringReader@2d51a56a; line: 1, column: 4]
        at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1524)
        at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:557)
        at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._reportInvalidToken(ReaderBasedJsonParser.java:2042)
        at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._handleOddValue(ReaderBasedJsonParser.java:1412)
        at com.fasterxml.jackson.core.json.ReaderBasedJsonParser.nextToken(ReaderBasedJsonParser.java:679)
        at com.fasterxml.jackson.databind.ObjectMapper._initForReading(ObjectMapper.java:3024)
        at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:2971)
        at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2091)
        at org.json4s.jackson.JsonMethods$class.parse(JsonMethods.scala:19)
        at org.json4s.jackson.JsonMethods$.parse(JsonMethods.scala:44)
        at org.apache.spark.scheduler.ReplayListenerBus$$anonfun$replay$2$$anonfun$apply$1.apply(ReplayListenerBus.scala:71)
        at org.apache.spark.scheduler.ReplayListenerBus$$anonfun$replay$2$$anonfun$apply$1.apply(ReplayListenerBus.scala:69)
        at scala.collection.Iterator$class.foreach(Iterator.scala:727)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        at org.apache.spark.scheduler.ReplayListenerBus$$anonfun$replay$2.apply(ReplayListenerBus.scala:69)
        at org.apache.spark.scheduler.ReplayListenerBus$$anonfun$replay$2.apply(ReplayListenerBus.scala:55)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:34)
        at org.apache.spark.scheduler.ReplayListenerBus.replay(ReplayListenerBus.scala:55)
        at org.apache.spark.deploy.history.HistoryServer.org$apache$spark$deploy$history$HistoryServer$$renderSparkUI(HistoryServer.scala:182)
        at org.apache.spark.deploy.history.HistoryServer$$anonfun$checkForLogs$3.apply(HistoryServer.scala:149)
        at org.apache.spark.deploy.history.HistoryServer$$anonfun$checkForLogs$3.apply(HistoryServer.scala:146)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at org.apache.spark.deploy.history.HistoryServer.checkForLogs(HistoryServer.scala:146)
        at org.apache.spark.deploy.history.HistoryServer$$anon$1$$anonfun$run$1.apply$mcV$sp(HistoryServer.scala:77)
        at org.apache.spark.deploy.history.HistoryServer$$anon$1$$anonfun$run$1.apply(HistoryServer.scala:74)
        at org.apache.spark.deploy.history.HistoryServer$$anon$1$$anonfun$run$1.apply(HistoryServer.scala:74)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1160)
        at org.apache.spark.deploy.history.HistoryServer$$anon$1.run(HistoryServer.scala:73)

ReplayListenerBus: Malformed line: d64/jre/lib/jsse.jar:/usr/lib/jvm/java-7-openjdk-amd64/jre/lib/jce.jar:/usr/lib/jvm/java-7-openjdk-amd64/jre/lib/charsets.jar:/usr/lib/jvm/java-7-openjdk-amd64/jre/lib/rhino.jar:/usr/lib/jvm/java-7-openjdk-amd64/jre/lib/jfr.jar:/usr/lib/jvm/java-7-openjdk-amd64/jre/classes","file.encoding":"ISO-8859-1","user.timezone":"Etc/UTC","java.specification.vendor":"Oracle Corporation","sun.java.launcher":"SUN_STANDARD","os.version":"3.13.0-32-generic","sun.os.patch.level":"unknown","java.vm.specification.vendor":"Oracle Corporation","user.country":"US","sun.jnu.encoding":"ISO-8859-1","user.language":"en","java.vendor.url":"http://java.oracle.com/","java.awt.printerjob":"sun.print.PSPrinterJob","java.awt.graphicsenv":"sun.awt.X11GraphicsEnvironment","awt.toolkit":"sun.awt.X11.XToolkit","os.name":"Linux","java.vm.vendor":"Oracle Corporation","java.vendor.url.bug":"http://bugreport.sun.com/bugreport/","user.name":"spark_user","java.vm.name":"OpenJDK 64-Bit Server VM","sun.java.command":"org.apache.spark.deploy.SparkSubmit target/scala-2.10/spark_test_2.10-1.0.jar --master mesos://192.168.100.1:5050 --class SparkTest /mapr/input/input.txt /mapr/spark_io/output","java.home":"/usr/lib/jvm/java-7-openjdk-amd64/jre","java.version":"1.7.0_65","sun.io.unicode.encoding":"UnicodeLittle"},"Classpath Entries":{"/opt/spark-1.0.2/assembly/target/scala-2.10/spark-assembly-1.0.2-hadoop2.0.0-mr1-cdh4.4.0.jar":"System Classpath","/opt/spark-1.0.2/conf":"System Classpath","http://192.168.100.1:46351/jars/spark_test_2.10-1.0.jar":"Added By User"}}

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/History-server-ERROR-ReplayListenerBus-Exception-in-parsing-Spark-event-log-tp14033.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.