As I said, I already did the change on a branch.

Let me submit the PR.

Regards
JB

On 05/18/2016 03:40 PM, Ismaël Mejía wrote:
I can take care of that, and of the other 'dataflow' mentions in the
runner, if you agree.

On Wed, May 18, 2016 at 2:30 PM, Jean-Baptiste Onofré <[email protected]> wrote:

    We should change the property name by the way (dataflow => beam).

    I think I already did it on a local branch.

    Regards
    JB

    On 05/18/2016 12:33 PM, Amit Sela wrote:

        You can pass this system property:
        "dataflow.spark.test.reuseSparkContext=true"
        and the runner will reuse the context, see:
        https://github.com/apache/incubator-beam/blob/d627266d8d39ff0ec94dc9f3f84893c1026abde7/runners/spark/src/main/java/org/apache/beam/runners/spark/translation/SparkContextFactory.java#L35
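
        For example (a minimal sketch; only the property name comes from the
        message above, the test class below is made up), it can be set in the
        test JVM before the first pipeline runs:

            import org.junit.BeforeClass;

            public class MySparkRunnerTest {

                @BeforeClass
                public static void enableContextReuse() {
                    // Makes the Spark runner's SparkContextFactory keep a single
                    // SparkContext and reuse it across pipelines in this JVM.
                    System.setProperty("dataflow.spark.test.reuseSparkContext", "true");
                }

                // ... tests that each run a Beam pipeline on the Spark runner ...
            }

        If the tests run in a forked JVM (e.g. under Maven Surefire), the same
        property can instead be passed to that fork as a -D JVM argument.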

        On Wed, May 18, 2016 at 1:28 PM Ismaël Mejía <[email protected]> wrote:

             Hello,

             I am trying to run a set of tests that use the Spark runner. I
             build the Pipeline in a setUp method and then reuse it in
             different tests; however, when it is invoked for the second time
             it throws an exception:

             java.lang.RuntimeException: org.apache.spark.SparkException:
             Only one SparkContext may be running in this JVM (see SPARK-2243).
             To ignore this error, set spark.driver.allowMultipleContexts =
             true. The currently running SparkContext was created at:

             Do you know how I can pass such a variable to the runner, or
             whether I can work around this issue in another way?
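
             Concretely, the layout is roughly like this (a hedged sketch; the
             class, the test names and the createSparkPipeline() helper are
             made up for illustration):

                 import org.apache.beam.sdk.Pipeline;
                 import org.junit.Before;
                 import org.junit.Test;

                 public class SparkRunnerReuseTest {

                     private Pipeline pipeline;

                     @Before
                     public void setUp() {
                         // Hypothetical helper that builds a Pipeline configured
                         // to run on the Spark runner.
                         pipeline = createSparkPipeline();
                     }

                     @Test
                     public void firstTest() {
                         pipeline.run();   // works, a SparkContext is created
                     }

                     @Test
                     public void secondTest() {
                         pipeline.run();   // fails with the SparkException quoted above
                     }
                 }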

             -Ismael


    --
    Jean-Baptiste Onofré
    [email protected]
    http://blog.nanthrax.net
    Talend - http://www.talend.com



--
Jean-Baptiste Onofré
[email protected]
http://blog.nanthrax.net
Talend - http://www.talend.com
