You could try increasing the driver memory with "--driver-memory". It looks
like the OOM came from the driver side, so the simple solution is to increase
the driver's memory.
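A minimal invocation along those lines might look like this (the class name,
jar path, and 8g value are placeholders, not from the thread):

```shell
# In yarn-cluster mode the driver runs inside the YARN ApplicationMaster
# container, so --driver-memory sizes that container, not a local JVM.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --driver-memory 8g \
  --class com.example.MyJob \
  my-job.jar
```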
On Tue, Jan 19, 2016 at 1:15 PM, Julio Antonio Soto wrote:
> Hi,
>
> I'm having trouble when uploading Spark
Hi,
I'm having trouble when uploading Spark jobs in yarn-cluster mode. While the
job works and completes in yarn-client mode, I hit the following error when
using spark-submit in yarn-cluster (simplified):
16/01/19 21:43:31 INFO hive.metastore: Connected to metastore.
16/01/19 21:43:32 WARN
Hi,
I tried with --driver-memory 16G (more than enough to read a simple Parquet
table), but the problem still persists.
Everything works fine in yarn-client.
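One detail worth checking (a general point about Spark on YARN, not something
stated in the thread): in yarn-cluster mode the driver JVM is launched inside
the ApplicationMaster, so its memory must be set on the spark-submit command
line or in spark-defaults.conf before launch; setting it programmatically in
the application is too late. A sketch of the config-file form, with an example
value:

```
# spark-defaults.conf
# Takes effect in yarn-cluster mode because it is read before the
# driver JVM is launched inside the ApplicationMaster.
spark.driver.memory    16g
```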
--
Julio Antonio Soto de Vicente
> On 19 Jan 2016, at 22:18, Saisai Shao wrote:
>
> You could try