Hi, Laurent --

That's the way we package our Spark jobs (i.e., with Maven).  You'll need
something like this:

https://gist.github.com/prb/d776a47bd164f704eecb

That builds two separate JARs: a driver JAR (which you can run with java
-jar ...) and a worker JAR.
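
The part that matters for your specific error is how the uber-JAR gets
assembled: the jar-with-dependencies assembly keeps only one reference.conf
out of all the copies on the classpath, so Akka's own reference.conf (which
is what defines akka.version) gets clobbered by another dependency's copy.
The usual fix is to build the JAR with the Maven Shade plugin and append
the reference.conf files together instead. A minimal sketch (the plugin
version and surrounding build section here are placeholders; see the gist
for the actual config):

  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>2.2</version>
    <executions>
      <execution>
        <phase>package</phase>
        <goals>
          <goal>shade</goal>
        </goals>
        <configuration>
          <transformers>
            <!-- Concatenate all reference.conf files on the classpath
                 instead of keeping only the first one; without this,
                 Akka's akka.version setting is lost and you get exactly
                 the ConfigException$Missing shown below. -->
            <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
              <resource>reference.conf</resource>
            </transformer>
          </transformers>
        </configuration>
      </execution>
    </executions>
  </plugin>

With that in place, mvn package produces a shaded JAR whose merged
reference.conf still contains akka.version, and the driver starts normally.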

Cheers.
-- Paul

—
p...@mult.ifario.us | Multifarious, Inc. | http://mult.ifario.us/


On Mon, May 12, 2014 at 8:41 AM, Laurent Thoulon <laurent.thou...@ldmobile.net> wrote:

> Hi,
>
> I'm quite new to Spark (and Scala), but has anyone ever successfully
> compiled and run a Spark job using Java and Maven?
> Packaging seems to go fine, but when I try to execute the job using
>
> mvn package
> java -Xmx4g -cp target/jobs-1.4.0.0-jar-with-dependencies.jar my.jobs.spark.TestJob
>
> I get the following error:
> Exception in thread "main" com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.version'
>         at com.typesafe.config.impl.SimpleConfig.findKey(SimpleConfig.java:115)
>         at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:136)
>         at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:142)
>         at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:150)
>         at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:155)
>         at com.typesafe.config.impl.SimpleConfig.getString(SimpleConfig.java:197)
>         at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:136)
>         at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:470)
>         at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
>         at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
>         at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:96)
>         at org.apache.spark.SparkEnv$.create(SparkEnv.scala:126)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:139)
>         at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:47)
>         at my.jobs.spark.TestJob.run(TestJob.java:56)
>
>
> Here's the code up to line 56:
>
>         SparkConf conf = new SparkConf()
>             .setMaster("local[" + cpus + "]")
>             .setAppName(this.getClass().getSimpleName())
>             .setSparkHome("/data/spark")
>             .setJars(JavaSparkContext.jarOfClass(this.getClass()))
>             .set("spark.default.parallelism", String.valueOf(cpus * 2))
>             .set("spark.executor.memory", "4g")
>             .set("spark.storage.memoryFraction", "0.6")
>             .set("spark.shuffle.memoryFraction", "0.3");
>         JavaSparkContext sc = new JavaSparkContext(conf);
>
> Thanks
> Regards,
> Laurent
>
