hey patrick,
i have a SparkConf i can add them to. i was looking for a way to do this
where they are not hardwired within the scala code, which is what
SPARK_JAVA_OPTS used to allow.
i guess if i just set -Dspark.akka.frameSize=10000 on my java app launch,
then it will get picked up by the SparkConf too, right?
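something like this (a rough sketch, assuming the default SparkConf
constructor, which i believe loads any system property starting with
"spark."; the app name and master below are just placeholders):

import org.apache.spark.{SparkConf, SparkContext}

object MyApp {
  def main(args: Array[String]): Unit = {
    // launched with: java -Dspark.akka.frameSize=10000 ... MyApp
    // new SparkConf() (loadDefaults = true) should pick up any spark.*
    // system properties set on the driver JVM
    val conf = new SparkConf()
      .setAppName("my-app")   // placeholder
      .setMaster("local[*]")  // placeholder

    // sanity check: should print 10000 if the -D flag was picked up
    println(conf.get("spark.akka.frameSize"))

    val sc = new SparkContext(conf)
    // ... job logic ...
    sc.stop()
  }
}

and since the conf travels with the SparkContext to the executors, i'd
hope i don't need to repeat the setting in spark.executor.extraJavaOptions.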


On Wed, May 14, 2014 at 2:54 PM, Patrick Wendell <pwend...@gmail.com> wrote:

> Just wondering - how are you launching your application? If you want
> to set values like this, the right way is to add them to the SparkConf
> when you create a SparkContext:
>
> val conf = new SparkConf()
>   .set("spark.akka.frameSize", "10000")
>   .setAppName(...)
>   .setMaster(...)
> val sc = new SparkContext(conf)
>
> - Patrick
>
> On Wed, May 14, 2014 at 9:09 AM, Koert Kuipers <ko...@tresata.com> wrote:
> > i have some settings that i think are relevant for my application. they
> > are spark.akka settings, so i assume they are relevant for both executors
> > and my driver program.
> >
> > i used to do:
> > SPARK_JAVA_OPTS="-Dspark.akka.frameSize=10000"
> >
> > now this is deprecated. the alternatives mentioned are:
> > * some spark-submit settings, which are not relevant to me since i do not
> > use spark-submit (i launch spark jobs from an existing application)
> > * spark.executor.extraJavaOptions to set -X options. i am not sure what
> > -X options are, but it doesn't sound like what i need, since it's only
> > for executors
> > * SPARK_DAEMON_OPTS to set java options for standalone daemons (i.e.
> > master, worker). that sounds like i should not use it since i am trying
> > to change settings for an app, not a daemon.
> >
> > am i missing the correct setting to use?
> > should i do -Dspark.akka.frameSize=10000 on my application launch
> > directly, and then also set spark.executor.extraJavaOptions? so basically
> > repeat it?
>
