Re: Spark on YARN driver memory allocation bug?

2014-10-17 Thread Boduo Li
It may also cause a problem when running in yarn-client mode. If --driver-memory is large, YARN has to allocate a lot of memory to the AM container, but the AM doesn't really need that memory. Boduo
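
For illustration only, a rough sketch of the contrast Boduo is describing. The application name and jar below are hypothetical, and spark.yarn.am.memory is an assumption taken from later Spark releases (it is not available in 1.1), so treat this as a sketch rather than the thread's actual fix:

    # yarn-cluster mode: the driver runs inside the AM, so the AM container
    # request legitimately has to cover --driver-memory (plus overhead).
    spark-submit --master yarn-cluster --driver-memory 8g --class MyApp app.jar

    # yarn-client mode: the driver runs in the local client JVM, so a large
    # --driver-memory should not inflate the AM container. Newer releases let
    # the client-mode AM be sized separately (assumed config shown):
    spark-submit --master yarn-client --driver-memory 8g \
      --conf spark.yarn.am.memory=512m --class MyApp app.jar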

Re: Spark on YARN driver memory allocation bug?

2014-10-09 Thread Greg Hill
From: Andrew Or and...@databricks.com Date: Wednesday, October 8, 2014 3:25 PM To: Greg greg.h...@rackspace.com Cc: user@spark.apache.org Subject: Re: Spark on YARN driver

Re: Spark on YARN driver memory allocation bug?

2014-10-09 Thread Sandy Ryza
Cc: user@spark.apache.org Subject: Re: Spark on YARN driver memory allocation bug? Hi Greg, It does seem like a bug. What is the particular exception message that you see? Andrew 2014-10-08 12:12 GMT-07:00 Greg Hill greg.h...@rackspace.com: So, I think this is a bug, but I wanted

Re: Spark on YARN driver memory allocation bug?

2014-10-08 Thread Andrew Or
Hi Greg, It does seem like a bug. What is the particular exception message that you see? Andrew 2014-10-08 12:12 GMT-07:00 Greg Hill greg.h...@rackspace.com: So, I think this is a bug, but I wanted to get some feedback before I reported it as such. On Spark on YARN 1.1.0, if you specify
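
Greg's report is cut off above, so the following is only a sketch of how driver-memory sizing generally interacts with YARN in cluster mode, not necessarily the exact failure he hit. The overhead config name and values are assumptions (spark.yarn.driver.memoryOverhead, in MB, as documented for Spark 1.x); yarn.scheduler.maximum-allocation-mb is standard YARN configuration:

    # Assumed sizing in yarn-cluster mode: the AM/driver container request is
    # roughly --driver-memory plus spark.yarn.driver.memoryOverhead, and that
    # total must fit under YARN's per-container cap.
    spark-submit --master yarn-cluster \
      --driver-memory 4g \
      --conf spark.yarn.driver.memoryOverhead=512 \
      --class MyApp app.jar
    # YARN side (yarn-site.xml): yarn.scheduler.maximum-allocation-mb needs to
    # be at least 4096 + 512 MB, or the container request cannot be satisfied.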