Re: Measuring Performance in Spark
Are there any tools like Ganglia that I can use to measure performance in Spark, or do I need to do it myself? Thanks!
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Measuring-Performance-in-Spark-tp17376p17836.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
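On the Ganglia angle specifically: Spark has a built-in metrics system, configured through conf/metrics.properties, and it ships a GangliaSink (only in builds made with the spark-ganglia-lgpl profile, because of Ganglia's LGPL license). A minimal sketch, with a placeholder host and the default gmond port:

```properties
# conf/metrics.properties -- report metrics from all Spark instances to Ganglia.
# The host below is a placeholder for your own gmond endpoint.
*.sink.ganglia.class=org.apache.spark.metrics.sink.GangliaSink
*.sink.ganglia.host=gmond.example.com
*.sink.ganglia.port=8649
*.sink.ganglia.period=10
*.sink.ganglia.unit=seconds
```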
Re: Measuring Performance in Spark
Hi Mahsa,

Use SPM: http://sematext.com/spm/. See http://blog.sematext.com/2014/10/07/apache-spark-monitoring/ .

Otis
--
Monitoring * Alerting * Anomaly Detection * Centralized Log Management
Solr & Elasticsearch Support * http://sematext.com/

On Fri, Oct 31, 2014 at 1:00 PM, mahsa mahsa.han...@gmail.com wrote:
> Is there any tools like Ganglia that I can use to get performance on Spark or I need to do it myself? Thanks!
Re: Measuring Performance in Spark
Oh, this is awesome! Exactly what I needed. Thank you, Otis!
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Measuring-Performance-in-Spark-tp17376p17839.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Re: Measuring Performance in Spark
One approach would be to write equivalent pure MapReduce and Spark jobs (e.g. word count, filter, join, groupBy) and benchmark them against each other. Another would be to pick something that runs on top of MapReduce/Spark and benchmark that instead (e.g. Hive vs. Spark SQL).

Thanks
Best Regards

On Mon, Oct 27, 2014 at 10:52 PM, mahsa mahsa.han...@gmail.com wrote:
> Hi, I want to test the performance of MapReduce, and Spark on a program, find the bottleneck, calculating the performance of each part of the program and etc. I was wondering if there is tool for the measurement like Galia and etc. to help me in this regard. Thanks!
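The benchmarking approach above can be sketched as a small timing harness. The harness is plain Python so it runs anywhere; `word_count_job` is a hypothetical local stand-in for a real job -- on a cluster you would replace its body with a Spark action (e.g. an RDD `count()` or `collect()`), since Spark transformations are lazy and only actions trigger work.

```python
import time

def benchmark(name, fn, runs=3):
    """Call fn several times and report the best wall-clock time."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        timings.append(time.perf_counter() - start)
    best = min(timings)
    print(f"{name}: best of {runs} runs = {best:.4f}s")
    return best

# Hypothetical stand-in for a Spark job; replace the body with a real
# Spark action when measuring on a cluster.
def word_count_job():
    lines = ["spark measures performance", "spark on hadoop"] * 10000
    counts = {}
    for line in lines:
        for word in line.split():
            counts[word] = counts.get(word, 0) + 1
    return counts

benchmark("word count (local stand-in)", word_count_job)
```

Taking the best of several runs helps discount warm-up and cache effects (e.g. JVM/JIT warm-up), which matter a lot when comparing MapReduce and Spark jobs.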
Re: Measuring Performance in Spark
Thanks Akhil. So there is no existing tool I can use, right? My program overloads some operators for operations on images, and I need the results to be accurate. I will try the approach you suggested. Thanks.
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Measuring-Performance-in-Spark-tp17376p17507.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Measuring Performance in Spark
Hi, I want to test the performance of MapReduce and Spark on a program, find the bottleneck, and measure the performance of each part of the program. I was wondering if there is a tool, like Ganglia, to help me with this kind of measurement. Thanks!
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Measuring-Performance-in-Spark-tp17376.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.