Right, but the Spark dependency is declared with the provided scope, which means you have to supply the Spark jars yourself. Can you check where the Spark jar comes from with mvn dependency:tree? It's possible that you have the Spark jar files twice.
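For illustration, a minimal sketch of the fix JB describes. The Spark runner declares Spark as provided, so the application's own pom.xml must bring it in. The artifact suffix and version below are assumptions for a 2016-era Spark 1.6.x / Scala 2.10 build; match them to the Spark version the runner was compiled against:

```xml
<!-- Illustrative only: supply Spark explicitly, since the runner
     marks it as provided. Adjust the Scala suffix (_2.10) and the
     version to match your environment. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.6.1</version>
</dependency>
```

Running mvn dependency:tree afterwards shows where each Spark jar enters the classpath, which helps spot duplicates.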

Can you share your pom.xml?

Regards
JB

On 06/03/2016 12:36 AM, Pawel Szczur wrote:
In the example project Ismael linked, Beam is a dependency:
https://github.com/iemejia/beam-playground/blob/master/pom.xml#L35

2016-06-03 0:23 GMT+02:00 Amit Sela <[email protected]>:

    You could start with the examples in the README
    <https://github.com/apache/incubator-beam/tree/master/runners/spark>
    In addition if you want to run a pipeline in the IDE you should
    either do it from src/test/... or create a new project and add Beam
    as a dependency.
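A minimal sketch of the dependency Amit mentions, for a standalone project. The groupId, artifactId, and snapshot version below are assumptions based on the 2016 incubator naming and may not match the actual coordinates; check the runner's README for the real ones:

```xml
<!-- Hypothetical coordinates for the incubating Spark runner;
     verify the artifactId and version before using. -->
<dependency>
  <groupId>org.apache.beam</groupId>
  <artifactId>beam-runners-spark</artifactId>
  <version>0.1.0-incubating-SNAPSHOT</version>
</dependency>
```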

    On Fri, Jun 3, 2016 at 1:18 AM Pawel Szczur <[email protected]> wrote:

        I don't understand.

        I've tried to run it from the IDE. Could you explain how you run it?

        2016-06-03 0:08 GMT+02:00 Amit Sela <[email protected]>:

            Are you testing under the test package?


            On Fri, Jun 3, 2016, 01:06 Pawel Szczur <[email protected]> wrote:

                I've just tested it; it throws an exception with the
                nightly Beam:

                Exception in thread "main" java.lang.NoClassDefFoundError:
                org/apache/spark/api/java/JavaSparkContext


                2016-06-01 14:23 GMT+02:00 Ismaël Mejía <[email protected]>:

                    You can find the pom.xml file for my beam-playground
                    project, which runs on the Spark runner. Note that it
                    uses the daily snapshots, but I will change it to the
                    released jars once they are on Maven Central:

                    https://github.com/iemejia/beam-playground/blob/master/pom.xml

                    Regards,
                    Ismaël

                    On Wed, Jun 1, 2016 at 3:01 AM, Pawel Szczur <[email protected]> wrote:

                        Hi,

                        Beam can be easily configured to run against the
                        Direct, Dataflow and Flink runners, but I couldn't
                        get Spark to work.

                        Here's a repo I've prepared for a bug
                        reproduction; it may serve as a starting point:
                        https://github.com/orian/cogroup-wrong-grouping

                        Could someone modify it or share a working
                        example (outside of the Beam repo)?

                        Cheers, Pawel






--
Jean-Baptiste Onofré
[email protected]
http://blog.nanthrax.net
Talend - http://www.talend.com
