It should work when the property is set BEFORE creating the
StreamingContext. Or, if you explicitly create a SparkContext and then
create a StreamingContext from that SparkContext, the property must be
set BEFORE the SparkContext is created. With 0.9, you can also use the
SparkConf object to set the configuration and then create a
SparkContext or StreamingContext from that configuration (see the
various constructors of these contexts).
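For example, a minimal sketch assuming Spark 0.9's SparkConf-based
StreamingContext constructor (the master URL, app name, and socket
address below are placeholders; on 0.8.x the equivalent is to call
System.setProperty("spark.executor.memory", "5g") before constructing
the context):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Build the configuration BEFORE any context exists.
val conf = new SparkConf()
  .setMaster("spark://host:7077")
  .setAppName("MemoryExample")
  .set("spark.executor.memory", "5g")

// Create the StreamingContext directly from the SparkConf (0.9+).
val ssc = new StreamingContext(conf, Seconds(1))

// Example computation; at least one output operation must be defined
// before start().
ssc.socketTextStream("localhost", 9999).print()

ssc.start()
ssc.awaitTermination()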

TD

On Tue, Feb 11, 2014 at 6:21 AM, Kal El <pinu.datri...@yahoo.com> wrote:

> I am trying to set the available memory on a standalone Spark Streaming
> job and have had no success.
>
> I have tried these options on spark-env.sh
> export SPARK_DAEMON_MEMORY=15g
> export SPARK_WORKER_MEMORY=15g
> export SPARK_DAEMON_JAVA_OPTS="-Xms20g -Xmx20g"
> export SPARK_JAVA_OPTS="-Xms20g -Xmx20g"
>
> I also tried this inside the Scala file:
> System.setProperty("spark.executor.memory", "5g"). This option worked on a
> normal Spark job running on a cluster; however, it does not work with
> Spark Streaming.
>
> Has anyone managed to run a Spark Streaming job with a custom Java
> memory setting?
> If so, how did you do it ?
>
> Thanks
>
