It's not downgraded; it's your /etc/alternatives setup that's causing this.
You can update all of those entries by running the following commands (as
root):
update-alternatives --install "/usr/bin/java" "java" "/usr/java/latest/bin/java" 1
update-alternatives --install "/usr/bin/javah" "javah" "/usr/java/latest/bin/javah" 1
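If you want to cover the rest of the JDK tools in one go, here is a sketch;
the /usr/java/latest prefix is the same assumption as above, so adjust it to
wherever your JDK 7 actually lives:

    # register each tool from the target JDK; --set then switches the
    # alternative into manual mode, pointing at the given path
    for tool in java javac javah jar javadoc; do
        update-alternatives --install "/usr/bin/$tool" "$tool" "/usr/java/latest/bin/$tool" 1
        update-alternatives --set "$tool" "/usr/java/latest/bin/$tool"
    done

Afterwards java -version and javac -version should report the same release.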
I don't see any version flag for /usr/bin/jar, but I think I see the
problem now: the OpenJDK version is 7, yet javac -version reports
1.6.0_34, so Spark was compiled with Java 6 even though the system runs
JRE 1.7.
Thanks for the sanity check! Now I just need to find out why javac is
downgraded on the system.
So you mean the script checks for this error and takes it as a sign that
the build used Java 6.
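I haven't quoted the script's exact code, but the guard I mean amounts to
something like this; ASSEMBLY_JAR here stands in for the path the script
computes itself:

    # List a path that doesn't exist in the archive. A zip64-capable jar
    # simply finds nothing, but a Java 6 jar fails while parsing the
    # central directory, printing "invalid CEN header (bad signature)".
    ASSEMBLY_JAR=assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop1.0.4.jar
    jar_check=$(jar -tf "$ASSEMBLY_JAR" nonexistent/class/path 2>&1)
    if [[ "$jar_check" == *"invalid CEN header"* ]]; then
        echo "Reading the Spark assembly jar failed; jar likely comes from an older JDK" >&2
        exit 1
    fi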
Your command seems to confirm that reading the assembly jar does fail
on your system, though. What version does the jar command show? Are you
sure you don't have JRE 7 but JDK 6 installed?
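If it helps, something like this prints which install each command actually
resolves to (the readlink targets will differ per distro):

    # follow the /etc/alternatives symlink chain to the real binaries
    for tool in java javac jar; do
        printf '%-6s -> %s\n' "$tool" "$(readlink -f "$(which "$tool")")"
    done
    java -version     # runtime version
    javac -version    # compiler version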
./bin/compute-classpath.sh fails with this error:

$> jar -tf assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop1.0.4.jar nonexistent/class/path
java.util.zip.ZipException: invalid CEN header (bad signature)
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipF
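For what it's worth, "invalid CEN header (bad signature)" is typically what
a Java 6 zip reader prints when it hits a zip64 archive (for instance one
with more than 65535 entries), and a Spark assembly jar is easily big enough
to cross that line. Info-ZIP's unzip understands zip64, so it can confirm
the archive itself is intact:

    $> unzip -l assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop1.0.4.jar | tail -2
    # the summary line shows the total entry count; above 65535 the jar
    # must use zip64, which Java 6's ZipFile cannot parse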