If you are launching your application with spark-submit, you can manually edit the spark-class file to raise the default memory baseline to 1g. It's easy to figure out how once you open the file. This worked for me, even if it's not a final solution, of course.
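As a rough sketch of what that edit looks like: in the Spark 1.0-era launch script the default JVM heap is set by a fallback value of 512m. The exact variable name and location can differ between Spark releases, so treat this as an illustration rather than the definitive line to change:

```shell
# In bin/spark-class (Spark ~1.0), the default heap used when SPARK_MEM
# is not set looks something like this (name may vary by version):
DEFAULT_MEM=${SPARK_MEM:-512m}

# Raising the fallback to 1g increases the baseline available to the
# driver's MemoryStore:
DEFAULT_MEM=${SPARK_MEM:-1g}
```

Alternatively, since the MemoryStore message appears on the driver, passing `--driver-memory 1g` to spark-submit (supported in Spark 1.0+) achieves a similar effect without editing the script.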
Gianluca

On 12 Jun 2014, at 15:16, ericjohnston1989 <ericjohnston1...@gmail.com> wrote:

> Hey everyone,
>
> I'm having some trouble increasing the default storage size for a broadcast
> variable. It looks like it defaults to a little less than 512MB every time,
> and I can't figure out which configuration to change to increase this.
>
> INFO storage.MemoryStore: Block broadcast_0 stored as values to memory
> (estimated size 426.5 MB, free 64.2 MB)
>
> (I'm seeing this in the terminal on my driver computer)
>
> I can change "spark.executor.memory", and that seems to increase the amount
> of RAM available on my nodes, but it doesn't seem to adjust this storage
> size for my broadcast variables. Any ideas?
>
> Thanks,
>
> Eric
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Increase-storage-MemoryStore-size-tp7516.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.