Hi David,
Can you use persist instead, perhaps with another StorageLevel? It
works with the Spark 2.2.0-SNAPSHOT I use, and I don't remember how it
behaved back in 1.6.2.
You could also check the Executors tab and see how many blocks you
have in their BlockManagers.
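As a minimal sketch of the suggestion above (assuming a Spark shell where `sc` is the active SparkContext; the RDD name and StorageLevel choice are illustrative, not prescriptive):

```scala
import org.apache.spark.storage.StorageLevel

// Create and name a small RDD so it is easy to spot in the UI.
val myrdd = sc.parallelize(1 to 100)
myrdd.setName("my_rdd")

// persist with an explicit StorageLevel instead of the cache()
// shorthand (cache() is equivalent to persist(MEMORY_ONLY)).
myrdd.persist(StorageLevel.MEMORY_AND_DISK)

// Persistence is lazy: an action must run before any blocks
// appear in the BlockManagers and on the Storage tab.
myrdd.count()
```

After the action completes, the Executors tab should show non-zero storage memory and RDD blocks for the executors that cached partitions.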
Regards,
Jacek Laskowski
I have tried the following code but didn't see anything on the Storage tab.
val myrdd = sc.parallelize(1 to 100)
myrdd.setName("my_rdd")
myrdd.cache()
myrdd.collect()
The Storage tab is empty, though I can see the stage for collect().
I am using Spark 1.6.2, HDP 2.5, Spark on YARN.
Thanks,
David