The table data is cached in the block managers on the executors. Could you paste
the driver log showing the OOM?
On Thu, Mar 31, 2016 at 1:24 PM, Soam Acharya wrote:
> Hi folks,
>
> I understand that invoking sqlContext.cacheTable("tableName") will load
> the table into a
Hi folks,
I understand that invoking sqlContext.cacheTable("tableName") will load the
table into a compressed in-memory columnar format. When Spark is launched
via spark shell in YARN client mode, is the table loaded into the local
Spark driver process in addition to the executors in the Hadoop cluster?
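
For reference, a minimal spark-shell sketch of the caching lifecycle (assuming Spark 1.x, where `sqlContext` is predefined in the shell; the table name "events" is hypothetical):

```scala
// In spark-shell, sqlContext is already available; "events" is a
// hypothetical table already registered with the SQLContext.
sqlContext.cacheTable("events")      // mark the table for in-memory columnar caching (lazy)

// The cache is typically materialized on the first action that scans the table;
// the cached blocks live in the executors' block managers, not in the driver.
sqlContext.sql("SELECT COUNT(*) FROM events").show()

println(sqlContext.isCached("events"))  // check cache status

sqlContext.uncacheTable("events")    // release the executor block-manager storage
```

The Storage tab of the Spark UI shows which executors hold the cached RDD blocks, which is a quick way to confirm where the data actually lives.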