Hi,
You can use YourKit to profile Spark workloads; please see:
https://cwiki.apache.org/confluence/display/SPARK/Profiling+Spark+Applications+Using+YourKit
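As a minimal sketch, the YourKit agent can be attached to the driver and executors through the standard `extraJavaOptions` configs. The install path and the `sampling` startup option below are illustrative assumptions; see the wiki page above for the actual setup on your cluster:

```shell
# Attach the YourKit profiling agent to both driver and executor JVMs.
# /opt/yourkit/... is a hypothetical install location -- the agent
# library must exist at this path on every node.
spark-submit \
  --conf "spark.driver.extraJavaOptions=-agentpath:/opt/yourkit/bin/linux-x86-64/libyjpagent.so=sampling" \
  --conf "spark.executor.extraJavaOptions=-agentpath:/opt/yourkit/bin/linux-x86-64/libyjpagent.so=sampling" \
  your-app.jar
```

Once the agents are running, you can connect the YourKit UI to each JVM to inspect memory allocation and CPU profiles per executor.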
// maropu
On Mon, Apr 25, 2016 at 10:24 AM, Edmon Begoli wrote:
> I am doing experimental research into the memory use and allocation of
> machine learning functions across a number of popular libraries.
> Is there a facility within Spark, and MLlib specifically, to track the
> allocation and use of data frames/memory by MLlib?
> Please