Where is the history server running? Is it running on the same node as the
logs directory?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-webUI-application-details-page-tp3490p21374.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Hi,
I don't have any history server running. As SK already pointed out in a
previous post, the history server seems to be required only in Mesos or YARN
mode, not in standalone mode.
https://spark.apache.org/docs/1.1.1/monitoring.html
If Spark is run on Mesos or YARN, it is still possible to
Hi,
I've a similar problem. I want to see the detailed logs of Completed
Applications so I've set in my program :
set("spark.eventLog.enabled", "true").
set("spark.eventLog.dir", "file:/tmp/spark-events")
but when I click on the application in the web UI, I get a page with the
message:
Application history
Perhaps you need to set this in your spark-defaults.conf so that it's
already set when your slave/worker processes start.
-Joe
On 1/25/15, 6:50 PM, ilaxes ila...@hotmail.com wrote:
Hi,
I've a similar problem. I want to see the detailed logs of Completed
Applications so I've set in my program
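A minimal sketch of the spark-defaults.conf entries Joe is suggesting, using the file: path from the quoted post (the directory must exist and be writable before the worker processes start):

```
# conf/spark-defaults.conf -- read when the driver/worker JVMs start
spark.eventLog.enabled   true
spark.eventLog.dir       file:/tmp/spark-events
```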
How did you specify the HDFS path? When I put
spark.eventLog.dir hdfs://crosby.research.intel-research.net:54310/tmp/spark-events
in my spark-defaults.conf file, I receive the following error:
An error occurred while calling
None.org.apache.spark.api.java.JavaSparkContext.
Hi All,
@Andrew
Thanks for the tips. I just built the master branch of Spark last
night, but am still having problems viewing history through the
standalone UI. I dug into the Spark job events directories as you
suggested, and I see at a minimum 'SPARK_VERSION_1.0.0' and
'EVENT_LOG_1'; for
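In Spark 1.0-era standalone event logging, each application's directory under spark.eventLog.dir holds marker files like the ones Brad lists above (SPARK_VERSION_* and EVENT_LOG_*), plus an APPLICATION_COMPLETE marker for applications that shut down cleanly; the UI only rebuilds history for completed ones. A small stdlib-only sketch to check which markers are present (the exact file names here are assumptions based on the listing above):

```python
import os

def check_event_log_dir(app_dir):
    """Report which Spark 1.0-style event-log marker files are present."""
    entries = set(os.listdir(app_dir))
    has_version = any(e.startswith("SPARK_VERSION_") for e in entries)
    has_log = any(e.startswith("EVENT_LOG_") for e in entries)
    # Without this marker, the standalone master won't rebuild the UI.
    has_complete = "APPLICATION_COMPLETE" in entries
    return {"version": has_version, "log": has_log, "complete": has_complete}

if __name__ == "__main__":
    import tempfile
    d = tempfile.mkdtemp()
    for name in ("SPARK_VERSION_1.0.0", "EVENT_LOG_1"):
        open(os.path.join(d, name), "w").close()
    print(check_event_log_dir(d))  # 'complete' is False: history won't show
```

If 'complete' comes back False, the application likely never called sc.stop(), which matches the "Application history not found" symptom.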
I was able to recently solve this problem for standalone mode. For this mode,
I did not use a history server. Instead, I set spark.eventLog.dir (in
conf/spark-defaults.conf) to a directory in HDFS (basically this directory
should be in a place that is writable by the master and accessible globally
Have a look at the history server; it looks like you have enabled the history
server on your local machine and not on the remote server.
http://people.apache.org/~tdas/spark-1.0.0-rc11-docs/monitoring.html
Thanks
Best Regards
On Tue, Aug 26, 2014 at 7:01 AM, SK skrishna...@gmail.com wrote:
Hi,
I am
I have already tried setting the history server and accessing it on
master-url:18080 as per the link. But the page does not list any completed
applications. As I mentioned in my previous mail, I am running Spark in
standalone mode on the cluster (as well as on my local machine). According
to the
Hi,
I am able to access the Application details web page from the master UI page
when I run Spark in standalone mode on my local machine. However, I am not
able to access it when I run Spark on our private cluster. The Spark master
runs on one of the nodes in the cluster. I am able to access the
Hi Andrew,
I'm running something close to the present master (I compiled several days
ago) but am having some trouble viewing history.
I set spark.eventLog.dir to true, but continually receive the error
message (via the web UI): "Application history not found... No event logs
found for application"
Hi,
Ok, I was specifying --master local. I changed that to --master
spark://localhostname:7077 and am now able to see the completed
applications. It provides summary stats about runtime and memory usage,
which is sufficient for me at this time.
However it doesn't seem to archive the info in
@Brad
Your configuration looks alright to me. We parse both file:/ and
file:/// the same way so that shouldn't matter. I just tried this on the
latest master and verified that it works for me. Can you dig into the
directory /tmp/spark-events/ml-pipeline-1408117588599 to make sure that
it's not
Hi,
I am using Spark 1.0.1. But I am still not able to see the stats for
completed apps on port 4040 - only for running apps. Is this feature
supported or is there a way to log this info to some file? I am interested
in stats about the total # of executors, total runtime, and total memory
used by
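The stats SK is after can in principle be recovered from the event log itself: Spark writes it as newline-delimited JSON, and (assuming the 1.x event format, where application start/end events carry a millisecond "Timestamp" field) total runtime can be computed with a short stdlib-only sketch like this:

```python
import json

def app_runtime_ms(event_log_lines):
    """Compute application runtime from Spark event-log JSON lines.

    Assumes the Spark 1.x JSON event format, where the application
    start/end events carry a millisecond "Timestamp" field.
    """
    start = end = None
    for line in event_log_lines:
        event = json.loads(line)
        if event.get("Event") == "SparkListenerApplicationStart":
            start = event["Timestamp"]
        elif event.get("Event") == "SparkListenerApplicationEnd":
            end = event["Timestamp"]
    if start is None or end is None:
        return None
    return end - start

# Synthetic two-line log for illustration:
log = [
    '{"Event": "SparkListenerApplicationStart", "Timestamp": 1000}',
    '{"Event": "SparkListenerApplicationEnd", "Timestamp": 61000}',
]
print(app_runtime_ms(log))  # 60000
```

Executor and memory figures would come from other event types in the same file; this only illustrates the parsing approach.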
If I understand you correctly, setting event logging in SPARK_JAVA_OPTS
should achieve what you want. I'm logging to HDFS, but according to the
config page http://spark.apache.org/docs/latest/configuration.html a
folder should be possible as well.
Example with all other settings
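The quoted example was cut off in the archive; a hedged sketch of what such a spark-env.sh entry might look like in the (now deprecated) -D system-property style, with the HDFS host and port as placeholders:

```
# conf/spark-env.sh (deprecated style; prefer spark-defaults.conf)
SPARK_JAVA_OPTS="-Dspark.eventLog.enabled=true \
  -Dspark.eventLog.dir=hdfs://<namenode-host>:<port>/spark-events"
```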
Hi all,
As Simon explained, you need to set spark.eventLog.enabled to true.
I'd like to add that the usage of SPARK_JAVA_OPTS to set spark
configurations is deprecated. I'm sure many of you have noticed this from
the scary warning message we print out. :) The recommended and supported
way of
I set spark.eventLog.enabled to true in
$SPARK_HOME/conf/spark-defaults.conf and also configured the logging to a
file as well as console in log4j.properties. But I am not able to get the
statistics logged to a file. On the console there are a lot of log
messages along with the stats - so
Hi SK,
Not sure if I understand you correctly, but here is how the user normally
uses the event logging functionality:
After setting spark.eventLog.enabled and optionally spark.eventLog.dir,
the user runs his/her Spark application and calls sc.stop() at the end of
it. Then he/she goes to the
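The workflow Andrew describes can be sketched roughly as follows (paths, port numbers, and the master URL are placeholders, not values from this thread):

```
# 1. conf/spark-defaults.conf
spark.eventLog.enabled  true
spark.eventLog.dir      file:/tmp/spark-events

# 2. Run the application against the standalone master; it must call
#    sc.stop() so the event log is finalized as complete.
./bin/spark-submit --master spark://<master-host>:7077 my_app.py

# 3. Reload the standalone master UI (default port 8080) and open the
#    application under "Completed Applications".
```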
This will be a feature in Spark 1.0 but is not yet released. In 1.0 Spark
applications can persist their state so that the UI can be reloaded after
they have completed.
- Patrick
On Sun, Mar 30, 2014 at 10:30 AM, David Thomas dt5434...@gmail.com wrote:
Is there a way to see 'Application