A recent pull request added a classmethod to PySpark's SparkContext that
lets you set Java system properties from Python:

https://github.com/apache/incubator-spark/pull/97
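
For example, assuming the new classmethod is SparkContext.setSystemProperty,
usage might look roughly like the sketch below (the property name is only an
illustration, and the call has to happen before the SparkContext is created):

    from pyspark import SparkContext

    # Set a JVM system property from Python before the context exists.
    # "spark.executor.memory" is just an example property name here.
    SparkContext.setSystemProperty("spark.executor.memory", "2g")

    sc = SparkContext("local", "example-app")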


On Wed, Nov 20, 2013 at 10:34 AM, Patrick Wendell <pwend...@gmail.com> wrote:

> You can add java options in SPARK_JAVA_OPTS inside of conf/spark-env.sh
>
>
> http://spark.incubator.apache.org/docs/latest/python-programming-guide.html#installing-and-configuring-pyspark
>
> - Patrick
>
> On Wed, Nov 20, 2013 at 8:52 AM, Michal Romaniuk
> <michal.romaniu...@imperial.ac.uk> wrote:
> > The info about configuration options is available at the link below, but
> > this seems to only work with Java. How can those options be set from
> > Python?
> >
> >
> > http://spark.incubator.apache.org/docs/latest/configuration.html#system-properties
> >
> > Thanks,
> > Michal
>
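
For reference, a minimal sketch of the SPARK_JAVA_OPTS approach Patrick
describes above, in conf/spark-env.sh (the property name is only an example;
substitute whichever system properties you actually need):

    # conf/spark-env.sh
    # Pass Java system properties to the Spark JVMs via -D flags.
    SPARK_JAVA_OPTS="-Dspark.executor.memory=2g"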
