Hi Charles,

We are using Ambari for Hadoop/Spark service management, versioning, and monitoring in the cluster.
For real-time monitoring of Spark jobs and of cluster hosts (disks, memory, CPU, network) we use Graphite + Grafana + collectd + Spark metrics: http://www.hammerlab.org/2015/02/27/monitoring-spark-with-graphite-and-grafana/

BR,
Arkadiusz Bicz

On Thu, Jan 21, 2016 at 5:33 AM, charles li <charles.up...@gmail.com> wrote:
> I've put up a thread before: pre-install third-party Python packages on the spark
> cluster
>
> Currently I use Fabric to manage my cluster, but it's not enough for me,
> and I believe there is a much better way to manage and monitor the cluster.
>
> I believe there really exist some open-source management tools that provide a
> web UI allowing me to [what I need exactly]:
>
> monitor each cluster machine's state in real time: memory, network, disk
> list all the services and packages on each machine
> install / uninstall / upgrade / downgrade packages through a web UI
> start / stop / restart services on each machine
>
> Great thanks
>
> --
> --------------------------------------
> a spark lover, a quant, a developer and a good man.
>
> http://github.com/litaotao
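For anyone following this thread: the Spark-to-Graphite piece of the stack above is wired up through Spark's built-in GraphiteSink. A minimal sketch of conf/metrics.properties, assuming a Graphite/carbon instance is reachable at a host named graphite-host on the default plaintext port 2003 (both are placeholders for your own environment):

```properties
# Send metrics from all Spark components (driver, executors, master, workers)
# to Graphite via the built-in sink.
*.sink.graphite.class=org.apache.spark.metrics.sink.GraphiteSink
*.sink.graphite.host=graphite-host
*.sink.graphite.port=2003
*.sink.graphite.period=10
*.sink.graphite.unit=seconds
# Optional prefix so Spark metrics are grouped under one Graphite subtree.
*.sink.graphite.prefix=spark

# Expose JVM memory/GC metrics from each component as well.
*.source.jvm.class=org.apache.spark.metrics.source.JvmSource
```

With this in place (and spark.metrics.conf pointing at the file, or the file deployed as conf/metrics.properties on each node), Grafana dashboards can then be built on top of the Graphite data source; collectd covers the host-level disk/CPU/network series separately.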