I want to measure how long different transformations in Spark take, such as
map, joinWithCassandraTable and so on. What is the best way to approximate
this?

def time[R](block: => R): R = {
  val t0 = System.nanoTime()
  val result = block
  val t1 = System.nanoTime()
  println("Elapsed time: " + (t1 - t0) + " ns")
  result
}


Could I use something like this? I guess that System.nanoTime will be
executed on the driver before and after the workers execute the maps/joins
and so on. Is that right? Any other ideas?
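One caveat worth keeping in mind: Spark transformations like map are lazy, so wrapping only the transformation in the helper measures DAG construction on the driver, not the actual work on the executors. To measure real execution time you have to wrap an action (count, collect, etc.). Here is a minimal sketch of that, assuming a local SparkSession; the object and app names are just illustrative:

```scala
import org.apache.spark.sql.SparkSession

object TimingSketch {
  def time[R](block: => R): R = {
    val t0 = System.nanoTime()
    val result = block
    val t1 = System.nanoTime()
    println("Elapsed time: " + (t1 - t0) + " ns")
    result
  }

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("timing-sketch")
      .master("local[*]")
      .getOrCreate()
    val sc = spark.sparkContext

    val rdd = sc.parallelize(1 to 1000000)

    // Lazy: this only builds the lineage, so the elapsed time is tiny
    // and tells you nothing about the cost of the map itself.
    val mapped = time { rdd.map(_ * 2) }

    // Wrapping an action forces execution, so this measures the
    // transformation plus the action that triggers it.
    val n = time { mapped.count() }

    spark.stop()
  }
}
```

Note that the timed block still includes job scheduling overhead and the cost of the action, so for finer-grained numbers the per-stage timings in the Spark web UI are usually a better tool.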
