Hi!

I'm using Java 7, and I found the problem: I wasn't calling start() and awaitTermination()
on the streaming context. Now it works, BUT
spark-submit never returns (it runs in the foreground and receives the Kafka
streams)... what am I missing?
(I want to send the job to a standalone cluster worker process.)
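For context, in client mode spark-submit runs the driver in the foreground, and a streaming driver blocks in awaitTermination() for as long as the job runs, so spark-submit never returning is expected there. A minimal driver sketch (the object name, app name, and batch interval are placeholders, not from this thread):

```scala
// Minimal Spark Streaming driver sketch (Spark 1.x API).
// start() begins processing; awaitTermination() blocks the driver thread,
// which is why spark-submit stays in the foreground in client mode.
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object KafkaJob { // hypothetical object name
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("kafka-job")
    val ssc  = new StreamingContext(conf, Seconds(10))
    // ... create the Kafka DStream and transformations here ...
    ssc.start()
    ssc.awaitTermination()
  }
}
```

To get the shell back immediately and have the driver run inside the standalone cluster, submit with cluster deploy mode, e.g. `spark-submit --master spark://<master>:7077 --deploy-mode cluster --class KafkaJob my.jar` (master URL and class name are illustrative).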

b0c1

----------------------------------------------------------------------------------------------------------------------------------
Skype: boci13, Hangout: boci.b...@gmail.com


On Sat, Jul 19, 2014 at 3:32 PM, Sean Owen <so...@cloudera.com> wrote:

> Are you building / running with Java 6? I imagine your .jar file has
> more than 65536 files, and Java 6 has various issues with jars this
> large. If possible, use Java 7 everywhere.
>
> https://issues.apache.org/jira/browse/SPARK-1520
>
> On Sat, Jul 19, 2014 at 2:30 PM, boci <boci.b...@gmail.com> wrote:
> > Hi Guys,
> >
> > I'm trying to create a Spark uber jar with sbt but I'm having a lot of problems... I
> > want to use the following:
> > - Spark streaming
> > - Kafka
> > - Elasticsearch
> > - HBase
> >
> > the current jar size is about 60 MB and it's not working.
> > - When I deploy with spark-submit: it runs and exits without any
> error
> > - When I try to start in local[*] mode, it says:
> >  Exception in thread "main" java.lang.NoClassDefFoundError:
> > org/apache/spark/Logging
> > => even though I start it with java -cp /.../spark-assembly-1.0.1-hadoop2.2.0.jar
> -jar
> > my.jar
> >
> > Any idea how I can solve this? (Which libs need to be set to "provided" and which
> > are required at runtime... later I want to run this jar on a YARN cluster.)
> >
> > b0c1
> >
> ----------------------------------------------------------------------------------------------------------------------------------
> > Skype: boci13, Hangout: boci.b...@gmail.com
>
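On the uber-jar question quoted above: with sbt-assembly the usual approach is to mark the Spark artifacts as "provided" (spark-submit supplies them on the classpath at runtime) and bundle only the external clients (Kafka, Elasticsearch, HBase). A build.sbt sketch, where the version numbers and the Elasticsearch artifact are illustrative, not taken from the thread:

```scala
// build.sbt fragment (sbt-assembly plugin assumed).
// Spark core/streaming are "provided": available at compile time,
// but excluded from the assembly because spark-submit supplies them.
libraryDependencies ++= Seq(
  "org.apache.spark"  %% "spark-core"            % "1.0.1" % "provided",
  "org.apache.spark"  %% "spark-streaming"       % "1.0.1" % "provided",
  "org.apache.spark"  %% "spark-streaming-kafka" % "1.0.1",  // bundled
  "org.elasticsearch"  % "elasticsearch"         % "1.3.2"   // bundled; version illustrative
)
```

Note that the NoClassDefFoundError for org/apache/spark/Logging under plain `java` is expected with "provided" dependencies, and `java -jar my.jar` ignores any `-cp` flag anyway (the classpath is taken solely from the jar's manifest); running through spark-submit, including for local[*] testing, avoids both issues.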
