Re: best practice : how to manage your Spark cluster ?

2016-01-21 Thread Arkadiusz Bicz
Hi Charles, We are using Ambari for Hadoop / Spark service management, versioning and monitoring in the cluster. For real-time monitoring of Spark jobs and of the cluster hosts, disks, memory, CPU and network we use Graphite + Grafana + collectd + Spark metrics.
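
For reference, the Spark side of that wiring is just a Graphite sink in conf/metrics.properties; a minimal sketch (the host name and prefix below are placeholders, adjust for your own setup):

    # conf/metrics.properties -- minimal Graphite sink sketch (assumed host/prefix)
    *.sink.graphite.class=org.apache.spark.metrics.sink.GraphiteSink
    *.sink.graphite.host=graphite.example.com
    *.sink.graphite.port=2003
    *.sink.graphite.period=10
    *.sink.graphite.unit=seconds
    *.sink.graphite.prefix=spark

    # also export JVM metrics from the driver and executors
    driver.source.jvm.class=org.apache.spark.metrics.source.JvmSource
    executor.source.jvm.class=org.apache.spark.metrics.source.JvmSource

With that in place, Grafana dashboards can be built on top of the Graphite data alongside the collectd host metrics.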

best practice : how to manage your Spark cluster ?

2016-01-20 Thread charles li
I posted a thread before: pre-install 3rd-party Python packages on a Spark cluster. Currently I use *Fabric* to manage my cluster, but it's not enough for me, and I believe there is a much better way to *manage and monitor* the cluster. I believe there really exist some open-source management tools for this.
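
For context, this is roughly what my Fabric setup looks like today; a minimal fabfile.py sketch assuming Fabric 1.x, with hypothetical host names, user and package:

    # fabfile.py -- minimal Fabric 1.x sketch (host names, user and package are assumptions)
    from fabric.api import env, run, sudo, parallel

    env.hosts = ['spark-worker-1', 'spark-worker-2']  # assumed worker host names
    env.user = 'hadoop'                               # assumed SSH user

    @parallel
    def install_numpy():
        # pre-install a 3rd-party Python package on every worker
        sudo('pip install numpy')

    @parallel
    def disk_usage():
        # ad-hoc check of disk usage on each worker
        run('df -h')

Running "fab install_numpy" or "fab disk_usage" fans the command out to every host in parallel, but that is about all it gives me: no history, no dashboards, no health view of the cluster.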