Will sbt-pack and the Maven solution work for the Scala REPL? 

I need the REPL because it saves a lot of time when I'm playing with large data 
sets: I load them once, cache them, and then try things out interactively 
before putting them into a standalone driver. 
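
For context, this is roughly the workflow I mean in spark-shell (a minimal sketch; the 
path and field layout are made up, and it assumes the plain RDD API from Spark 0.9/1.0): 

    // sc is the SparkContext the REPL provides
    val events = sc.textFile("hdfs:///data/events/*.log")   // hypothetical data set
    val parsed = events.map(_.split('\t')).cache()          // load and parse once, keep in memory
    parsed.count()                                           // first action materializes the cache
    parsed.filter(fields => fields(0) == "click").count()    // later queries reuse the cached data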

I have sbt working for my own driver program on Spark 0.9. 



> On May 11, 2014, at 3:49 PM, Stephen Boesch <java...@gmail.com> wrote:
> 
> Just discovered sbt-pack: it addresses (quite well) the last item, 
> identifying and packaging the external jars.
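> 
> For anyone trying the same thing, a minimal sketch of how sbt-pack gets wired in 
> (the plugin version and the packSettings/packMain setting names are from memory, 
> so treat them as approximate):
> 
>     // project/plugins.sbt
>     addSbtPlugin("org.xerial.sbt" % "sbt-pack" % "0.6.1")   // version is a guess
> 
>     // build.sbt
>     packSettings                                            // required by older sbt-pack releases
> 
>     packMain := Map("my-driver" -> "com.example.MyDriver")  // hypothetical main class
> 
> Running sbt pack then collects the project jar plus the managed dependency jars 
> under target/pack/lib, which gives a concrete list to feed to the REPL's classpath.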
> 
> 
> 2014-05-11 12:34 GMT-07:00 Stephen Boesch <java...@gmail.com>:
>> Hi Sonal,
>>     Yes, I am working towards that same idea.  How did you go about creating 
>> the non-Spark-jar dependencies?  The way I am doing it is a separate 
>> straw-man project that does not include Spark but does include the external 
>> third-party jars. I then run sbt compile:managedClasspath and reverse 
>> engineer the lib jars from it.  That is obviously not ideal.
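>> 
>> To make that concrete, the straw-man build is essentially just the third-party 
>> coordinates (the dependencies below are placeholders for whatever the real 
>> project pulls in):
>> 
>>     // build.sbt of the straw-man project: external deps only, no Spark
>>     name := "deps-strawman"
>> 
>>     scalaVersion := "2.10.4"
>> 
>>     libraryDependencies ++= Seq(
>>       "org.apache.kafka" %% "kafka"       % "0.8.1.1",   // placeholder
>>       "org.eclipse.paho" %  "mqtt-client" % "0.4.0"      // placeholder
>>     )
>> 
>> Running sbt "show compile:managedClasspath" against that build prints the 
>> resolved jar paths, which I then copy out by hand.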
>> 
>> The Maven "run" will be useful for other projects built by Maven; I will 
>> keep it in my notes.
>> 
>> As for sbt run-example, it requires additional libraries to be added for my 
>> external dependencies.  I tried several things, including ADD_JARS, 
>> --driver-class-path, and combinations of extraClassPath, but I have deferred 
>> that ad-hoc approach in favor of finding a systematic one.
>> 
>> 
>> 
>> 
>> 2014-05-08 5:26 GMT-07:00 Sonal Goyal <sonalgoy...@gmail.com>:
>> 
>>> I am creating a jar with only my dependencies and running spark-submit through 
>>> my project's mvn build. I have configured the mvn exec goal to point to the 
>>> location of the script. Here is how I have set it up for my app. The mainClass is my 
>>> driver program, and I am able to pass my custom args too. Hope this helps.
>>> 
>>> <plugin>
>>>   <groupId>org.codehaus.mojo</groupId>
>>>   <artifactId>exec-maven-plugin</artifactId>
>>>   <executions>
>>>     <execution>
>>>       <goals>
>>>         <goal>exec</goal>
>>>       </goals>
>>>     </execution>
>>>   </executions>
>>>   <configuration>
>>>     <executable>/home/sgoyal/spark/bin/spark-submit</executable>
>>>     <arguments>
>>>       <argument>${jars}</argument>
>>>       <argument>--class</argument>
>>>       <argument>${mainClass}</argument>
>>>       <argument>--arg</argument>
>>>       <argument>${spark.master}</argument>
>>>       <argument>--arg</argument>
>>>       <argument>${my app arg 1}</argument>
>>>       <argument>--arg</argument>
>>>       <argument>${my arg 2}</argument>
>>>     </arguments>
>>>   </configuration>
>>> </plugin>
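>>> 
>>> (The ${jars}, ${mainClass}, ${spark.master} and app-arg values above are ordinary 
>>> Maven properties that I define in the pom or pass with -D; with the exec goal 
>>> configured like this, the whole thing is launched with mvn exec:exec.)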
>>> 
>>> 
>>> Best Regards,
>>> Sonal
>>> Nube Technologies 
>>> 
>>>> On Wed, May 7, 2014 at 6:57 AM, Tathagata Das 
>>>> <tathagata.das1...@gmail.com> wrote:
>>>> Doesn't the run-example script work for you? Also, are you on the latest 
>>>> commit of branch-1.0?
>>>> 
>>>> TD
>>>> 
>>>> 
>>>>> On Mon, May 5, 2014 at 7:51 PM, Soumya Simanta <soumya.sima...@gmail.com> 
>>>>> wrote:
>>>>> 
>>>>> 
>>>>> Yes, I'm struggling with a similar problem where my classes are not found 
>>>>> on the worker nodes. I'm using 1.0.0-SNAPSHOT.  I would really appreciate it 
>>>>> if someone could provide some documentation on the usage of spark-submit.
>>>>> 
>>>>> Thanks
>>>>> 
>>>>> > On May 5, 2014, at 10:24 PM, Stephen Boesch <java...@gmail.com> wrote:
>>>>> >
>>>>> >
>>>>> > I have a Spark Streaming application that also uses the external streaming 
>>>>> > modules (e.g. Kafka, MQTT, ...).  It is not clear how to properly invoke 
>>>>> > the spark-submit script: what --driver-class-path and/or 
>>>>> > -Dspark.executor.extraClassPath parameters are required?
>>>>> >
>>>>> >  For reference, the following error is proving difficult to resolve:
>>>>> >
>>>>> > java.lang.ClassNotFoundException: 
>>>>> > org.apache.spark.streaming.examples.StreamingExamples
>>>>> >
> 
