+1 for Jacek's suggestion
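
Jacek's listener approach could be sketched roughly like this (a minimal,
untested sketch against the Spark 2.x SparkListener API; the class name
CachedBlockListener is made up here, and it only records RDD ids whose
blocks land in memory or on disk):

    import org.apache.spark.scheduler.{SparkListener, SparkListenerBlockUpdated}
    import org.apache.spark.storage.RDDBlockId
    import scala.collection.mutable

    // Hypothetical listener that tracks which RDDs currently have cached blocks.
    class CachedBlockListener extends SparkListener {
      val cachedRddIds = mutable.Set[Int]()

      override def onBlockUpdated(event: SparkListenerBlockUpdated): Unit = {
        val info = event.blockUpdatedInfo
        info.blockId match {
          // Only RDD blocks that are actually stored somewhere count as cached.
          case RDDBlockId(rddId, _) if info.storageLevel.useMemory || info.storageLevel.useDisk =>
            cachedRddIds += rddId
          case _ => // ignore shuffle/broadcast blocks and evicted blocks
        }
      }
    }

    // Register it on an existing SparkContext:
    //   spark.sparkContext.addSparkListener(new CachedBlockListener)

Note this tells you about cached RDD blocks, not table names; mapping block
ids back to cached tables still needs extra bookkeeping on your side.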

FWIW: another possible *hacky* way is to write a package in the
org.apache.spark.sql namespace so it can access
sparkSession.sharedState.cacheManager, then use reflection to read
the cache manager's private `cachedData` field, which holds the list of
cached relations.

https://github.com/apache/spark/blob/v2.1.0/sql/core/src/main/scala/org/apache/spark/sql/execution/CacheManager.scala#L47

But this relies on Spark internals, so it would be subject to change
between Spark versions.
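
A rough sketch of that idea (untested, written against Spark 2.1.x
internals; the object name CachedRelationInspector is made up, and the
exact type of `cachedData` varies across versions, so the code only renders
each entry's string form):

    // Must live in this package to reach package-private sharedState members.
    package org.apache.spark.sql

    object CachedRelationInspector {
      def listCachedRelations(spark: SparkSession): Seq[String] = {
        val cacheManager = spark.sharedState.cacheManager
        // `cachedData` is private, so read it via Java reflection.
        val field = cacheManager.getClass.getDeclaredField("cachedData")
        field.setAccessible(true)
        field.get(cacheManager) match {
          case entries: Iterable[_] => entries.map(_.toString).toSeq
          case other                => Seq(String.valueOf(other))
        }
      }
    }

If you only need to know whether a specific table is cached, the public
`spark.catalog.isCached("tableName")` API avoids touching internals at all.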

On Fri, Jan 27, 2017 at 7:00 AM, Jacek Laskowski <ja...@japila.pl> wrote:

> Hi,
>
> I think that the only way to get the information about a cached RDD is to
> use SparkListener and intercept respective events about cached blocks on
> BlockManagers.
>
> Jacek
>
> On 25 Jan 2017 5:54 a.m., "kumar r" <kumarc...@gmail.com> wrote:
>
> Hi,
>
> I have cached some table in Spark Thrift Server. I want to get all cached
> table information. I can see it in 4040 web ui port.
>
> Is there any command or other way to get the cached table details
> programmatically?
>
> Thanks,
> Kumar
>
>
>
