Re: [ERROR] bin/compute-classpath.sh: fails with false positive test for java 1.7 vs 1.6

2015-02-24 Thread shane knapp
it's not downgraded, it's your /etc/alternatives setup that's causing this.

you can update all of those entries by executing the following commands (as
root):

update-alternatives --install "/usr/bin/java" "java"
"/usr/java/latest/bin/java" 1
update-alternatives --install "/usr/bin/javah" "javah"
"/usr/java/latest/bin/javah" 1
update-alternatives --install "/usr/bin/javac" "javac"
"/usr/java/latest/bin/javac" 1
update-alternatives --install "/usr/bin/jar" "jar"
"/usr/java/latest/bin/jar" 1

(i have the latest jdk installed in /usr/java/ with a /usr/java/latest/
symlink pointing to said jdk's dir)
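
(a rough follow-up sketch, assuming the same /usr/java/latest layout: select the
new alternatives explicitly, then confirm what actually resolves on the PATH)

update-alternatives --set java /usr/java/latest/bin/java
update-alternatives --set javac /usr/java/latest/bin/javac
update-alternatives --set jar /usr/java/latest/bin/jar

java -version && javac -version
readlink -f "$(which jar)"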

On Tue, Feb 24, 2015 at 3:32 PM, Mike Hynes <91m...@gmail.com> wrote:
>
> I don't see any version flag for /usr/bin/jar, but I think I see the
> problem now; the openjdk version is 7, but javac -version gives
> 1.6.0_34; so spark was compiled with java 6 despite the system using
> jre 1.7.
> Thanks for the sanity check! Now I just need to find out why javac is
> downgraded on the system..
>
> On 2/24/15, Sean Owen  wrote:
> > So you mean that the script is checking for this error, and takes it
> > as a sign that you compiled with java 6.
> >
> > Your command seems to confirm that reading the assembly jar does fail
> > on your system though. What version does the jar command show? are you
> > sure you don't have JRE 7 but JDK 6 installed?
> >
> > On Tue, Feb 24, 2015 at 11:02 PM, Mike Hynes <91m...@gmail.com> wrote:
> >> ./bin/compute-classpath.sh fails with error:
> >>
> >> $> jar -tf
> >> assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop1.0.4.jar
> >> nonexistent/class/path
> >> java.util.zip.ZipException: invalid CEN header (bad signature)
> >> at java.util.zip.ZipFile.open(Native Method)
> >> at java.util.zip.ZipFile.<init>(ZipFile.java:132)
> >> at java.util.zip.ZipFile.<init>(ZipFile.java:93)
> >> at sun.tools.jar.Main.list(Main.java:997)
> >> at sun.tools.jar.Main.run(Main.java:242)
> >> at sun.tools.jar.Main.main(Main.java:1167)
> >>
> >> However, I both compiled the distribution and am running spark with Java
> >> 1.7;
> >> $ java -version
> >> java version "1.7.0_75"
> >> OpenJDK Runtime Environment (IcedTea 2.5.4)
> >> (7u75-2.5.4-1~trusty1)
> >> OpenJDK 64-Bit Server VM (build 24.75-b04, mixed mode)
> >> on a system running Ubuntu:
> >> $ uname -srpov
> >> Linux 3.13.0-44-generic #73-Ubuntu SMP Tue Dec 16 00:22:43 UTC 2014
> >> x86_64 GNU/Linux
> >> $ uname -srpo
> >> Linux 3.13.0-44-generic x86_64 GNU/Linux
> >>
> >> This problem was reproduced on Arch Linux:
> >>
> >> $ uname -srpo
> >> Linux 3.18.5-1-ARCH x86_64 GNU/Linux
> >> with
> >> $ java -version
> >> java version "1.7.0_75"
> >> OpenJDK Runtime Environment (IcedTea 2.5.4) (Arch Linux build
> >> 7.u75_2.5.4-1-x86_64)
> >> OpenJDK 64-Bit Server VM (build 24.75-b04, mixed mode)
> >>
> >> In both of these cases, the problem is not the java versioning;
> >> neither system even has a java 6 installation. This seems like a false
> >> positive to me in compute-classpath.sh.
> >>
> >> When I comment out the relevant lines in compute-classpath.sh, the
> >> scripts start-{master,slaves,...}.sh all run fine, and I have no
> >> problem launching applications.
> >>
> >> Could someone please offer some insight into this issue?
> >>
> >> Thanks,
> >> Mike
> >>
> >
>
>
> --
> Thanks,
> Mike
>


Re: [ERROR] bin/compute-classpath.sh: fails with false positive test for java 1.7 vs 1.6

2015-02-24 Thread Mike Hynes
I don't see any version flag for /usr/bin/jar, but I think I see the
problem now; the openjdk version is 7, but javac -version gives
1.6.0_34; so spark was compiled with java 6 despite the system using
jre 1.7.
Thanks for the sanity check! Now I just need to find out why javac is
downgraded on the system...
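
(A rough way to trace where the old javac is coming from; the dpkg query only
applies on Debian/Ubuntu and is illustrative:)

$ readlink -f "$(which javac)"
$ update-alternatives --display javac
$ dpkg -S "$(readlink -f "$(which javac)")"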

On 2/24/15, Sean Owen  wrote:
> So you mean that the script is checking for this error, and takes it
> as a sign that you compiled with java 6.
>
> Your command seems to confirm that reading the assembly jar does fail
> on your system though. What version does the jar command show? are you
> sure you don't have JRE 7 but JDK 6 installed?
>
> On Tue, Feb 24, 2015 at 11:02 PM, Mike Hynes <91m...@gmail.com> wrote:
>> ./bin/compute-classpath.sh fails with error:
>>
>> $> jar -tf
>> assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop1.0.4.jar
>> nonexistent/class/path
>> java.util.zip.ZipException: invalid CEN header (bad signature)
>> at java.util.zip.ZipFile.open(Native Method)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:132)
>> at java.util.zip.ZipFile.<init>(ZipFile.java:93)
>> at sun.tools.jar.Main.list(Main.java:997)
>> at sun.tools.jar.Main.run(Main.java:242)
>> at sun.tools.jar.Main.main(Main.java:1167)
>>
>> However, I both compiled the distribution and am running spark with Java
>> 1.7;
>> $ java -version
>> java version "1.7.0_75"
>> OpenJDK Runtime Environment (IcedTea 2.5.4)
>> (7u75-2.5.4-1~trusty1)
>> OpenJDK 64-Bit Server VM (build 24.75-b04, mixed mode)
>> on a system running Ubuntu:
>> $ uname -srpov
>> Linux 3.13.0-44-generic #73-Ubuntu SMP Tue Dec 16 00:22:43 UTC 2014
>> x86_64 GNU/Linux
>> $ uname -srpo
>> Linux 3.13.0-44-generic x86_64 GNU/Linux
>>
>> This problem was reproduced on Arch Linux:
>>
>> $ uname -srpo
>> Linux 3.18.5-1-ARCH x86_64 GNU/Linux
>> with
>> $ java -version
>> java version "1.7.0_75"
>> OpenJDK Runtime Environment (IcedTea 2.5.4) (Arch Linux build
>> 7.u75_2.5.4-1-x86_64)
>> OpenJDK 64-Bit Server VM (build 24.75-b04, mixed mode)
>>
>> In both of these cases, the problem is not the java versioning;
>> neither system even has a java 6 installation. This seems like a false
>> positive to me in compute-classpath.sh.
>>
>> When I comment out the relevant lines in compute-classpath.sh, the
>> scripts start-{master,slaves,...}.sh all run fine, and I have no
>> problem launching applications.
>>
>> Could someone please offer some insight into this issue?
>>
>> Thanks,
>> Mike
>>
>


-- 
Thanks,
Mike




Re: [ERROR] bin/compute-classpath.sh: fails with false positive test for java 1.7 vs 1.6

2015-02-24 Thread Sean Owen
So you mean that the script is checking for this error, and takes it
as a sign that you compiled with java 6.

Your command seems to confirm that reading the assembly jar does fail
on your system though. What version does the jar command show? Are you
sure you don't have JRE 7 but JDK 6 installed?
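
(One way to separate a genuinely unreadable jar from a JDK 6 jar tool on the
PATH is to invoke a JDK 7 jar binary directly; the path below is just the usual
Ubuntu location and may differ on your system:)

$ /usr/lib/jvm/java-7-openjdk-amd64/bin/jar -tf \
    assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop1.0.4.jar \
    nonexistent/class/path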

On Tue, Feb 24, 2015 at 11:02 PM, Mike Hynes <91m...@gmail.com> wrote:
> ./bin/compute-classpath.sh fails with error:
>
> $> jar -tf 
> assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop1.0.4.jar
> nonexistent/class/path
> java.util.zip.ZipException: invalid CEN header (bad signature)
> at java.util.zip.ZipFile.open(Native Method)
> at java.util.zip.ZipFile.<init>(ZipFile.java:132)
> at java.util.zip.ZipFile.<init>(ZipFile.java:93)
> at sun.tools.jar.Main.list(Main.java:997)
> at sun.tools.jar.Main.run(Main.java:242)
> at sun.tools.jar.Main.main(Main.java:1167)
>
> However, I both compiled the distribution and am running spark with Java 1.7;
> $ java -version
> java version "1.7.0_75"
> OpenJDK Runtime Environment (IcedTea 2.5.4) (7u75-2.5.4-1~trusty1)
> OpenJDK 64-Bit Server VM (build 24.75-b04, mixed mode)
> on a system running Ubuntu:
> $ uname -srpov
> Linux 3.13.0-44-generic #73-Ubuntu SMP Tue Dec 16 00:22:43 UTC 2014
> x86_64 GNU/Linux
> $ uname -srpo
> Linux 3.13.0-44-generic x86_64 GNU/Linux
>
> This problem was reproduced on Arch Linux:
>
> $ uname -srpo
> Linux 3.18.5-1-ARCH x86_64 GNU/Linux
> with
> $ java -version
> java version "1.7.0_75"
> OpenJDK Runtime Environment (IcedTea 2.5.4) (Arch Linux build
> 7.u75_2.5.4-1-x86_64)
> OpenJDK 64-Bit Server VM (build 24.75-b04, mixed mode)
>
> In both of these cases, the problem is not the java versioning;
> neither system even has a java 6 installation. This seems like a false
> positive to me in compute-classpath.sh.
>
> When I comment out the relevant lines in compute-classpath.sh, the
> scripts start-{master,slaves,...}.sh all run fine, and I have no
> problem launching applications.
>
> Could someone please offer some insight into this issue?
>
> Thanks,
> Mike
>




[ERROR] bin/compute-classpath.sh: fails with false positive test for java 1.7 vs 1.6

2015-02-24 Thread Mike Hynes
./bin/compute-classpath.sh fails with error:

$> jar -tf 
assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop1.0.4.jar
nonexistent/class/path
java.util.zip.ZipException: invalid CEN header (bad signature)
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipFile.<init>(ZipFile.java:132)
at java.util.zip.ZipFile.<init>(ZipFile.java:93)
at sun.tools.jar.Main.list(Main.java:997)
at sun.tools.jar.Main.run(Main.java:242)
at sun.tools.jar.Main.main(Main.java:1167)

However, I both compiled the distribution and am running spark with Java 1.7;
$ java -version
java version "1.7.0_75"
OpenJDK Runtime Environment (IcedTea 2.5.4) (7u75-2.5.4-1~trusty1)
OpenJDK 64-Bit Server VM (build 24.75-b04, mixed mode)
on a system running Ubuntu:
$ uname -srpov
Linux 3.13.0-44-generic #73-Ubuntu SMP Tue Dec 16 00:22:43 UTC 2014
x86_64 GNU/Linux
$ uname -srpo
Linux 3.13.0-44-generic x86_64 GNU/Linux

This problem was reproduced on Arch Linux:

$ uname -srpo
Linux 3.18.5-1-ARCH x86_64 GNU/Linux
with
$ java -version
java version "1.7.0_75"
OpenJDK Runtime Environment (IcedTea 2.5.4) (Arch Linux build
7.u75_2.5.4-1-x86_64)
OpenJDK 64-Bit Server VM (build 24.75-b04, mixed mode)

In both of these cases, the problem is not the java versioning;
neither system even has a java 6 installation. This seems like a false
positive to me in compute-classpath.sh.
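
(For reference, a quick sketch for checking which jar binary is first on the PATH
and whether JAVA_HOME points at a different JDK; the commands are illustrative
and assume a POSIX shell:)

$ readlink -f "$(which jar)"
$ echo "${JAVA_HOME:-unset}"
$ javac -version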

When I comment out the relevant lines in compute-classpath.sh, the
scripts start-{master,slaves,...}.sh all run fine, and I have no
problem launching applications.

Could someone please offer some insight into this issue?

Thanks,
Mike
