It may also cause a problem when running in yarn-client mode: if --driver-memory is large, YARN has to allocate that much memory to the ApplicationMaster (AM) container, even though the AM doesn't really need it there, since the driver JVM runs on the client machine rather than inside the AM.
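For what it's worth, here is a rough sketch of how the two sizes could be decoupled on releases that support a separate AM memory setting. The spark.yarn.am.memory / spark.yarn.am.memoryOverhead properties are not in 1.1.0, so their availability here is an assumption, and com.example.MyApp / my-app.jar are placeholders:

  # Sketch only: keep the yarn-client AM container small while the driver JVM
  # on the client machine gets the large heap.
  spark-submit \
    --master yarn-client \
    --driver-memory 8g \
    --conf spark.yarn.am.memory=512m \
    --conf spark.yarn.am.memoryOverhead=384 \
    --class com.example.MyApp \
    my-app.jar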
Boduo
From: Andrew Or <and...@databricks.com>
Date: Wednesday, October 8, 2014 3:25 PM
To: Greg <greg.h...@rackspace.com>
Cc: user@spark.apache.org
Subject: Re: Spark on YARN driver memory allocation bug?
Hi Greg,
It does seem like a bug. What is the particular exception message that you
see?
Andrew
2014-10-08 12:12 GMT-07:00 Greg Hill greg.h...@rackspace.com:
So, I think this is a bug, but I wanted to get some feedback before I
reported it as such. On Spark on YARN 1.1.0, if you specify