Hi Andrew,

I'm running something close to the present master (I compiled several days
ago) but am having some trouble viewing history.

I set "spark.eventLog.enabled" to true, but continually receive the error
message (via the web UI) "Application history not found...No event logs
found for application ml-pipeline in
file:/tmp/spark-events/ml-pipeline-1408117588599".  I tried two fixes:

-I manually set "spark.eventLog.dir" to a path beginning with "file:///",
believing that perhaps the problem was an invalid protocol specification.

-I inspected /tmp/spark-events manually and noticed that each job directory
(and the files therein) was owned by the user who launched the job and
was not world-readable.  Since I run Spark as a dedicated Spark user, I
made the files world-readable, but I still receive the same "Application
history not found" error.
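
For reference, here's roughly what my permission fix looked like, sketched
against a scratch copy of the directory layout (the EVENT_LOG_1 filename is
just a stand-in for whatever the event logger actually wrote):

```shell
# Recreate the layout from the error message in a scratch directory, then
# open up permissions the way I did on the real /tmp/spark-events.
events=$(mktemp -d)
mkdir "$events/ml-pipeline-1408117588599"
touch "$events/ml-pipeline-1408117588599/EVENT_LOG_1"  # stand-in log file
chmod -R a+rX "$events"  # world-readable; directories also world-traversable
ls -lR "$events"
```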

Is there a configuration step I may be missing?

-Brad


On Thu, Aug 14, 2014 at 7:33 PM, Andrew Or <and...@databricks.com> wrote:

> Hi SK,
>
> Not sure if I understand you correctly, but here is how the user normally
> uses the event logging functionality:
>
> After setting "spark.eventLog.enabled" and optionally
> "spark.eventLog.dir", the user runs his/her Spark application and calls
> sc.stop() at the end of it. Then he/she goes to the standalone Master UI
> (under http://<master-url>:8080 by default) and clicks on the application
> under the Completed Applications table. This will link to the Spark UI of
> the finished application in its completed state, under a path that looks
> like "http://<master-url>:8080/history/<app-Id>". It won't be on
> "http://localhost:4040" anymore because the port is now freed for new
> applications to bind their SparkUIs to. To access the file that stores the
> raw statistics, go to the file specified in "spark.eventLog.dir". This is
> by default "/tmp/spark-events", though in Spark 1.0.1 it may be in HDFS
> under the same path.
>
> I could be misunderstanding what you mean by the stats being buried in the
> console output, because the events are not logged to the console but to a
> file in "spark.eventLog.dir". For all of this to work, of course, you have
> to run Spark in standalone mode (i.e. with master set to
> spark://<master-url>:7077). In other modes, you will need to use the
> history server instead.
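>
> In spark-defaults.conf terms, that setup looks something like this (the
> master URL and log path are placeholders to adapt to your cluster):
>
>     spark.master             spark://<master-url>:7077
>     spark.eventLog.enabled   true
>     spark.eventLog.dir       file:///tmp/spark-events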
>
> Does this make sense?
> Andrew
>
>
> 2014-08-14 18:08 GMT-07:00 SK <skrishna...@gmail.com>:
>
>> More specifically, as indicated by Patrick above, in 1.0+, apps will have
>> persistent state so that the UI can be reloaded. Is there a way to enable
>> this feature in 1.0.1?
>>
>> thanks
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-webUI-application-details-page-tp3490p12157.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>>
>
