Hey all,

tl;dr: I built Spark with Java 1.8 even though my JAVA_HOME pointed to 1.7.
It then failed at runtime with binary incompatibilities.

I couldn’t find any mention of this in the docs, so it may be a known
thing, but it’s definitely too easy to do the wrong thing.

The problem is that Maven uses the Zinc incremental compiler, which is
a long-running server. If the first build (the one that spawns the Zinc
server) is started with Java 8 on the path, Maven will keep compiling
against Java 8 even after JAVA_HOME is changed and the project rebuilt.

I filed scala-maven-plugin#173
<https://github.com/davidB/scala-maven-plugin/issues/173> but there has
been no comment so far.

Steps to reproduce:

   - make sure Zinc is not running yet
   - build with JAVA_HOME pointing to 1.8
   - point JAVA_HOME to 1.7
   - clean build
   - run Spark and watch it fail with a NoSuchMethodError in ConcurrentHashMap.
   More details here
   <https://gist.github.com/AlainODea/1375759b8720a3f9f094>
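For the curious, the NoSuchMethodError most likely comes from a covariant return-type change in Java 8: ConcurrentHashMap.keySet() there returns the new nested type ConcurrentHashMap.KeySetView instead of plain Set. When the caller's static type is ConcurrentHashMap, a Java 8 compiler bakes the narrower descriptor into the bytecode, and that method simply doesn't exist on a Java 7 runtime. A minimal illustration (my own sketch, not taken from the Spark sources):

```java
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

public class KeySetDemo {
    public static void main(String[] args) {
        // The static type here is ConcurrentHashMap, not Map or Set.
        ConcurrentHashMap<String, Integer> map = new ConcurrentHashMap<>();
        map.put("a", 1);

        // Compiled with javac 8, this call site is recorded as
        //   keySet()Ljava/util/concurrent/ConcurrentHashMap$KeySetView;
        // Run that bytecode on a Java 7 JVM and you get
        //   NoSuchMethodError: java.util.concurrent.ConcurrentHashMap.keySet()
        // because Java 7's keySet() only exists with the Set return type.
        Set<String> keys = map.keySet();
        System.out.println(keys.contains("a")); // prints "true"
    }
}
```

Compiling with `javac -source 1.7 -target 1.7` does not help, since the call is still resolved against the Java 8 bootclasspath; that is why the build has to actually run under a Java 7 JDK (or use `-bootclasspath`).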

Workaround:

   - build/zinc/bin/zinc -shutdown
   - rebuild

iulian
--
Iulian Dragos
Reactive Apps on the JVM
www.typesafe.com