Chester - I'm happy to rebuild Spark, but then how can I deploy it to EC2?

On 7/7/14, Chester Chen <ches...@alpinenow.com> wrote:
> Have you tried to change the Spark SBT scripts? You can change the
> dependency scope to "provided".  This is similar to the compile scope,
> except that the JDK or the container needs to provide the dependency at
> runtime.
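
For reference, a "provided"-scoped dependency in an sbt build looks
roughly like the sketch below; the Guava coordinates are only an
illustration, not the actual entries in Spark's build:

    // Compiled against at build time, but left out of the packaged
    // assembly; the runtime environment must supply it instead.
    libraryDependencies += "com.google.guava" % "guava" % "14.0.1" % "provided"
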
>
> This assumes that Spark will work with the new versions of the common
> libraries.
>
> Of course, this is not a general solution even if it works (and it may
> not work).
>
> Chester
>
> On Mon, Jul 7, 2014 at 10:31 AM, Robert James <srobertja...@gmail.com>
> wrote:
>
>> spark-submit includes a spark-assembly uber jar, which contains older
>> versions of many common libraries.  These conflict with some of the
>> dependencies we need.  I have been racking my brain trying to find a
>> solution (including experimenting with ProGuard), but haven't found
>> one: when we use spark-submit, we get NoSuchMethodErrors, even though
>> the code compiles fine, because the runtime classes are different from
>> the compile-time classes!
>>
>> Can someone recommend a solution? We are using Scala, sbt, and
>> sbt-assembly, but are happy to use another tool (please provide
>> instructions on how to do so).
>>
>
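
One way to attack the conflict without rebuilding Spark at all is to
shade (rename) the clashing packages inside your application's own
assembly jar.  The sketch below assumes sbt-assembly 0.14.x (which added
shade rules) and uses Guava purely as an example of a conflicting
library:

    // project/plugins.sbt:
    //   addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")

    // build.sbt: shade our copy of Guava so it cannot clash with the
    // older Guava bundled inside the spark-assembly jar.
    libraryDependencies += "com.google.guava" % "guava" % "17.0"

    assemblyShadeRules in assembly := Seq(
      // Rewrite com.google.common.* into a private namespace; references
      // in our own bytecode are rewritten to match.
      ShadeRule.rename("com.google.common.**" -> "myshaded.guava.@1").inAll
    )

After shading, the assembly carries myshaded.guava.* classes, so
whichever Guava version spark-assembly puts on the classpath no longer
matters to our code.  (Spark around this time also had an experimental
spark.files.userClassPathFirst setting that tells executors to prefer
classes from the user's jar, though it does not cover the driver.)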
