Hi Rodrick,
I tried increasing the memory from 6G to 9G to 12G, but I am still getting
the same error. The DataFrame I am trying to write is around 6-7 GB,
and the Hive table is in Parquet format.
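For reference, the write itself is essentially the following (a sketch only; the table name and DataFrame variable are placeholders):

    // Spark 1.6: insert-overwrite the final DataFrame into the Parquet Hive table
    finalDf.write
      .mode("overwrite")                      // replace existing data instead of appending
      .insertInto("my_db.my_parquet_table")   // placeholder table name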
Thanks,
Bijay
On Mon, Apr 11, 2016 at 4:03 AM, Rodrick Brown wrote:
Try increasing the memory allocated for this job.
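For example, through spark-submit (the class and jar names below are placeholders; tune the sizes to your cluster):

    spark-submit \
      --master yarn \
      --driver-memory 12G \
      --executor-memory 12G \
      --class com.example.MyJob \
      my-job.jar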
Sent from Outlook for iPhone
On Sun, Apr 10, 2016 at 9:12 PM -0700, "Bijay Kumar Pathak" wrote:
Hi,
I am running Spark 1.6 on EMR. I have a workflow which does the following
things:
1. Read the two flat files, create DataFrames, and join them.
2. Read a particular partition from the Hive table and join it with the
DataFrame from step 1.
3. Finally, insert overwrite into the Hive table (a rough sketch follows below).
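In code, the workflow is roughly this (a sketch only: the S3 paths, join key, table name, and partition value are placeholders, and it assumes the flat files are CSV read through the spark-csv package):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    val sc = new SparkContext(new SparkConf().setAppName("emr-workflow"))
    val sqlContext = new HiveContext(sc)

    // 1. Read the two flat files and join them
    val df1 = sqlContext.read
      .format("com.databricks.spark.csv")
      .option("header", "true")
      .load("s3://my-bucket/file1.csv")
    val df2 = sqlContext.read
      .format("com.databricks.spark.csv")
      .option("header", "true")
      .load("s3://my-bucket/file2.csv")
    val joined = df1.join(df2, "id")

    // 2. Read one partition from the Hive table and join it with step 1
    val partitionDf = sqlContext.sql(
      "SELECT * FROM my_db.my_table WHERE ds = '2016-04-10'")
    val finalDf = joined.join(partitionDf, "id")

    // 3. Insert overwrite into the Hive table
    finalDf.write.mode("overwrite").insertInto("my_db.my_table")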