I had a similar objective of using Maven as our build tool and ran into the
same issue.
The problem is that your config file is actually not found: your fat jar
assembly does not contain the reference.conf resource.

I added the following to the <resources> section of my pom to make it work:
<resource>
  <directory>src/main/resources</directory>
  <includes>
    <include>*.conf</include>
  </includes>
  <targetPath>${project.build.directory}/classes</targetPath>
</resource>

I think Paul's gist achieves a similar effect by specifying a proper
appending transformer in the shade plugin configuration.
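
For completeness, here is a minimal sketch of that shade-plugin approach
(my own sketch, not Paul's actual gist; it assumes the maven-shade-plugin
already builds your fat jar). The AppendingTransformer concatenates the
reference.conf files from the individual Akka jars instead of letting one
overwrite the others:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- Concatenate every reference.conf on the classpath
               rather than keeping only the first one found -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
            <resource>reference.conf</resource>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>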

cheers
François

On Tue, May 13, 2014 at 4:09 AM, Laurent Thoulon <
laurent.thou...@ldmobile.net> wrote:

> (I never actually received my previous mail, so I'm resending it. Sorry
> if it creates a duplicate.)
>
>
> Hi,
>
> I'm quite new to Spark (and Scala), but has anyone ever successfully
> compiled and run a Spark job using Java and Maven?
> Packaging seems to go fine, but when I try to execute the job using
>
> mvn package
> java -Xmx4g -cp target/jobs-1.4.0.0-jar-with-dependencies.jar my.jobs.spark.TestJob
>
> I get the following error:
> Exception in thread "main" com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.version'
>         at com.typesafe.config.impl.SimpleConfig.findKey(SimpleConfig.java:115)
>         at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:136)
>         at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:142)
>         at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:150)
>         at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:155)
>         at com.typesafe.config.impl.SimpleConfig.getString(SimpleConfig.java:197)
>         at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:136)
>         at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:470)
>         at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
>         at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
>         at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:96)
>         at org.apache.spark.SparkEnv$.create(SparkEnv.scala:126)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:139)
>         at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:47)
>         at my.jobs.spark.TestJob.run(TestJob.java:56)
>
>
> Here's the code up to line 56:
>
>         SparkConf conf = new SparkConf()
>             .setMaster("local[" + cpus + "]")
>             .setAppName(this.getClass().getSimpleName())
>             .setSparkHome("/data/spark")
>             .setJars(JavaSparkContext.jarOfClass(this.getClass()))
>             .set("spark.default.parallelism", String.valueOf(cpus * 2))
>             .set("spark.executor.memory", "4g")
>             .set("spark.storage.memoryFraction", "0.6")
>             .set("spark.shuffle.memoryFraction", "0.3");
>         JavaSparkContext sc = new JavaSparkContext(conf);
>
> Thanks
> Regards,
> Laurent
>
>


-- 
François /fly Le Lay
Data Infra Chapter Lead NYC
+1 (646)-656-0075
