Hi,

I am working on a streaming application.
I tried to configure the history server to persist the application's events
to the Hadoop file system (HDFS), but it is not logging any events.
I am running Apache Spark 1.4.1 (PySpark) on Ubuntu 14.04 with three nodes.
Here is my configuration:
File: /usr/local/spark/conf/spark-defaults.conf    # on all three nodes
spark.eventLog.enabled true
spark.eventLog.dir hdfs://master-host:port/usr/local/hadoop/spark_log
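
As far as I understand, this directory has to exist on HDFS before the
application starts, so I created it beforehand (the path matches the
configuration above; the permissive mode 1777 is just to rule out
permission problems on my side):

hdfs dfs -mkdir -p /usr/local/hadoop/spark_log
hdfs dfs -chmod 1777 /usr/local/hadoop/spark_log    # writable by the user submitting jobs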

# On the master node
export SPARK_HISTORY_OPTS="-Dspark.history.fs.logDirectory=hdfs://host:port/usr/local/hadoop/spark_log"
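
That export goes in /usr/local/spark/conf/spark-env.sh (at least that is
where I put it), and I then start the history server on the master with the
standard script:

/usr/local/spark/sbin/start-history-server.sh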

Can someone give me a list of steps to configure the history server correctly?

Thanks and regards,
b.bhavesh




