Sean,

Thanks a lot for the important information, especially userClassPathFirst.

Cheers,
Emre

On Wed, Dec 24, 2014 at 3:38 PM, Sean Owen <so...@cloudera.com> wrote:

> That could well be it -- oops, I forgot to run with the YARN profile
> and so didn't see the YARN dependencies. Try the userClassPathFirst
> option to try to make your app's copy take precedence.
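
(For readers of the archive: in Spark 1.1/1.2 this knob is the experimental
`spark.files.userClassPathFirst` setting, which affects executors only; a
minimal sketch of passing it via spark-submit, with a hypothetical main class
and jar name:)

```shell
# Sketch: ask executors to prefer the user's jars over Spark's own copies.
# spark.files.userClassPathFirst is experimental in Spark 1.x and applies to
# executors only; com.example.MyApp and the jar name are placeholders.
spark-submit \
  --class com.example.MyApp \
  --conf spark.files.userClassPathFirst=true \
  my-app-assembly.jar
```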
>
> The second error looks like a JVM-level failure, but it comes from having
> too little memory available for the unit tests.
>
>
> http://spark.apache.org/docs/latest/building-spark.html#setting-up-mavens-memory-usage
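
(The gist of the linked doc, as a sketch: raise Maven's heap, and PermGen on
pre-Java-8 JVMs, via MAVEN_OPTS before running the tests; the exact sizes
below are illustrative, not prescribed by this thread.)

```shell
# Give Maven more heap and PermGen before running tests (values illustrative;
# -XX:MaxPermSize matters only on Java 7 and earlier).
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512m"
echo "MAVEN_OPTS set to: $MAVEN_OPTS"
```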
>
> On Wed, Dec 24, 2014 at 1:56 PM, Emre Sevinc <emre.sev...@gmail.com>
> wrote:
> > It seems that YARN depends on an older version of Jersey, namely 1.9:
> >
> >   https://github.com/apache/spark/blob/master/yarn/pom.xml
> >
> > When I modified my dependencies to include only:
> >
> >     <dependency>
> >       <groupId>com.sun.jersey</groupId>
> >       <artifactId>jersey-core</artifactId>
> >       <version>1.9.1</version>
> >     </dependency>
> >
> > And then modified the code to use the old Jersey API:
> >
> >     Client c = Client.create();
> >     WebResource r = c.resource("http://localhost:2222/rest")
> >                      .path("annotate")
> >                      .queryParam("text",
> >                          UrlEscapers.urlFragmentEscaper().escape(spotlightSubmission))
> >                      .queryParam("confidence", "0.3");
> >
> >     logger.warn("!!! DEBUG !!! target: {}", r.getURI());
> >
> >     String response = r.accept(MediaType.APPLICATION_JSON_TYPE)
> >                        //.header("")
> >                        .get(String.class);
> >
> >     logger.warn("!!! DEBUG !!! Spotlight response: {}", response);
> >
> > It seems to work when I use spark-submit to submit the application that
> > includes this code.
> >
> > The funny thing is, now my relevant unit test fails to run, complaining
> > that there is not enough memory:
> >
> > Java HotSpot(TM) 64-Bit Server VM warning: INFO:
> > os::commit_memory(0x00000000c4900000, 25165824, 0) failed;
> > error='Cannot allocate memory' (errno=12)
> > #
> > # There is insufficient memory for the Java Runtime Environment to continue.
> > # Native memory allocation (mmap) failed to map 25165824 bytes for
> > # committing reserved memory.
> >
> > --
> > Emre
> >
> >
> > On Wed, Dec 24, 2014 at 1:46 PM, Sean Owen <so...@cloudera.com> wrote:
> >>
> >> Your guess is right, that there are two incompatible versions of
> >> Jersey (or really, JAX-RS) in your runtime. Spark doesn't use Jersey,
> >> but its transitive dependencies may, or your transitive dependencies
> >> may.
> >>
> >> I don't see Jersey in Spark's dependency tree except from HBase tests,
> >> which in turn only appear in examples, so that's unlikely to be it.
> >> I'd take a look with 'mvn dependency:tree' on your own code first.
> >> Maybe you are including JavaEE 6 for example?
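
(A sketch of the suggested check; the -Dincludes filter narrows the output to
the suspect group IDs and is optional — a plain `mvn dependency:tree` works
too.)

```shell
# Show where Jersey / JAX-RS artifacts enter your build; run in the project
# root. The -Dincludes patterns filter the tree by groupId.
mvn dependency:tree -Dincludes='com.sun.jersey,org.glassfish.jersey.*,javax.ws.rs'
```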
> >>
> >> On Wed, Dec 24, 2014 at 12:02 PM, Emre Sevinc <emre.sev...@gmail.com>
> >> wrote:
> >> > Hello,
> >> >
> >> > I have a piece of code that runs inside Spark Streaming and tries to get
> >> > some data from a RESTful web service (that runs locally on my machine).
> >> > The code snippet in question is:
> >> >
> >> >      Client client = ClientBuilder.newClient();
> >> >      WebTarget target = client.target("http://localhost:2222/rest");
> >> >      target = target.path("annotate")
> >> >                     .queryParam("text",
> >> >                         UrlEscapers.urlFragmentEscaper().escape(spotlightSubmission))
> >> >                     .queryParam("confidence", "0.3");
> >> >
> >> >       logger.warn("!!! DEBUG !!! target: {}", target.getUri().toString());
> >> >
> >> >       String response = target.request()
> >> >                               .accept(MediaType.APPLICATION_JSON_TYPE)
> >> >                               .get(String.class);
> >> >
> >> >       logger.warn("!!! DEBUG !!! Spotlight response: {}", response);
> >> >
> >> > When run inside a unit test as follows:
> >> >
> >> >      mvn clean test -Dtest=SpotlightTest#testCountWords
> >> >
> >> > it contacts the RESTful web service and retrieves some data as expected.
> >> > But when the same code is run as part of the application that is
> >> > submitted to Spark, using the spark-submit script, I receive the
> >> > following error:
> >> >
> >> >       java.lang.NoSuchMethodError:
> >> >           javax.ws.rs.core.MultivaluedMap.addAll(Ljava/lang/Object;[Ljava/lang/Object;)V
> >> >
> >> > I'm using Spark 1.1.0, and for consuming the web service I'm using
> >> > Jersey in my project's pom.xml:
> >> >
> >> >     <dependency>
> >> >       <groupId>org.glassfish.jersey.containers</groupId>
> >> >       <artifactId>jersey-container-servlet-core</artifactId>
> >> >       <version>2.14</version>
> >> >     </dependency>
> >> >
> >> > So I suspect that when the application is submitted to Spark, somehow
> >> > there's a different JAR in the environment that uses a different
> >> > version of Jersey / javax.ws.rs.*
> >> >
> >> > Does anybody know which version of Jersey / javax.ws.rs.* is used in
> >> > the Spark environment, or how to solve this conflict?
> >> >
> >> >
> >> > --
> >> > Emre Sevinç
> >> > https://be.linkedin.com/in/emresevinc/
> >> >
> >
> >
> >
> >
> > --
> > Emre Sevinc
>



-- 
Emre Sevinc
