Related question: what are good profiling tools other than watching the
application master while the code runs?
Are there things that can be logged during the run? Say I have two ways of
accomplishing the same thing, and I want to compare their time, memory, and
general resource usage.
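For a quick first pass at the time comparison, one low-tech option is to wrap each variant in a timing helper on the driver. A minimal sketch (the `timed` helper and its label are mine, not a Spark API; memory comparison still needs the UI or a listener):

```scala
// Hypothetical helper: run a block, print elapsed wall-clock time, return the result.
def timed[T](label: String)(body: => T): T = {
  val start = System.nanoTime()
  val result = body
  println(f"$label took ${(System.nanoTime() - start) / 1e6}%.1f ms")
  result
}

// Compare two ways of computing the same value.
val sumA = timed("reduce")((1 to 1000000).reduce(_ + _))
val sumB = timed("sum")((1 to 1000000).sum)
```

This only measures driver-side wall-clock time; for per-stage or per-task numbers you still want the listener approach below.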
I'm not really sure of the best practices on this, but I either consult the
Spark UI at localhost:4040/jobs/ etc., or, better, register a listener:

import org.apache.spark.scheduler.{SparkListener, SparkListenerStageCompleted}

class CustomSparkListener extends SparkListener {
  // Print executor run time for each completed stage.
  override def onStageCompleted(stageCompleted: SparkListenerStageCompleted): Unit =
    println(s"Stage ${stageCompleted.stageInfo.stageId} took " +
      s"${stageCompleted.stageInfo.taskMetrics.executorRunTime} ms of executor run time")
}

val customSparkListener: CustomSparkListener = new CustomSparkListener()
sc.addSparkListener(customSparkListener)
Have you looked at:
https://spark.apache.org/docs/latest/running-on-yarn.html#debugging-your-application
If you use Mesos:
https://spark.apache.org/docs/latest/running-on-mesos.html#troubleshooting-and-debugging
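The YARN page above also covers pulling container logs after a run. Assuming log aggregation is enabled on the cluster, something like this works (the application ID below is a placeholder, not a real one):

```shell
# Fetch all aggregated container logs for a finished application.
yarn logs -applicationId application_1470000000000_0001
```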
On Wed, Aug 3, 2016 at 6:13 PM, glen wrote:
> Any tool like gdb? Which supports breakpoints at a given line or function?