I am not familiar with the CDH distribution; we built Spark ourselves.
The error means that code generated with Protocol Buffers 2.5.0 is being run
against a protocol-buffers 2.4.1 (or earlier) jar.
So there is a protocol-buffers 2.4.1 jar somewhere, either in the jar you
built or in the cluster runtime.
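One way to track down which jar is actually supplying the protobuf classes at runtime is to ask the classloader where a class came from. This is a minimal sketch; `jarOf` is a hypothetical helper name, and on an executor you would pass a protobuf class name such as "com.google.protobuf.Message" instead of the stand-in used below.

```scala
object FindJar {
  // Returns the location (usually a jar URL) a class was loaded from,
  // or None if the class is missing or came from the bootstrap classloader.
  def jarOf(className: String): Option[String] =
    try {
      val cls = Class.forName(className)
      // getCodeSource can be null for bootstrap-loaded classes
      Option(cls.getProtectionDomain.getCodeSource).map(_.getLocation.toString)
    } catch {
      case _: ClassNotFoundException => None
    }

  def main(args: Array[String]): Unit = {
    // Demonstrated here with a class that is always on the classpath;
    // substitute the protobuf class you are debugging.
    println(jarOf("scala.Option").getOrElse("not found"))
  }
}
```

Printing this from a small job on the cluster would show whether the 2.4.1 jar is coming from your build or from the cluster runtime.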
Hm... off the cuff I wonder if this is because somehow the build
process ran Maven with Java 6 but forked the Java/Scala compilers and
those used JDK 7. Or some later repackaging process ran on the
artifacts and used Java 6. I do see Build-Jdk: 1.6.0_45 in the
manifest, but I don't think 1.4.x can
On Tue, Aug 25, 2015 at 1:50 PM, Utkarsh Sengar utkarsh2...@gmail.com wrote:
So do I need to manually copy these 2 jars on my spark executors?
Yes. I can think of a way to work around that if you're using YARN,
but not with other cluster managers.
On Tue, Aug 25, 2015 at 10:51 AM, Marcelo
OK, I figured out the horrid look also: the href of all of the styles is
prefixed with the proxy data. So, ultimately, if I can fix the proxy
issues with the links, then I can fix the look also.
On Tue, Aug 25, 2015 at 5:17 PM, Justin Pihony justin.pih...@gmail.com
wrote:
SUCCESS! I set
Looks like I'm stuck then; I am using Mesos.
Adding these 2 jars to all executors might be a problem for me, so I will
probably try to remove the dependency on the otj-logging lib and just
use log4j.
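If the otj-logging dependency is coming in transitively, one option is to exclude it at build time rather than removing it from code. This is a build.sbt sketch only; the group/artifact coordinates shown are assumptions and should be checked against what dependency resolution actually reports.

```scala
// Hypothetical build.sbt fragment: "com.example" % "my-app-dep" stands in
// for whichever dependency is pulling in otj-logging; verify the real
// otj-logging coordinates with `sbt dependencyTree` or the Maven equivalent.
libraryDependencies += ("com.example" % "my-app-dep" % "1.0.0")
  .exclude("com.opentable.components", "otj-logging")
```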
On Tue, Aug 25, 2015 at 2:15 PM, Marcelo Vanzin van...@cloudera.com wrote:
On Tue, Aug 25, 2015
My local build using rc-4 and Java 7 does actually also produce different
binaries (for one file only) than the 1.4.0 release artifact available on
Central. These binaries decompile to identical instructions, but this
may be due to different versions of javac (within the 7 family) producing
slightly different, though equivalent, bytecode.
SUCCESS! I set SPARK_DNS_HOME=ec2_publicdns, which makes it possible to
access the Spark UI directly. The application proxy was still getting in
the way with how it creates the URL, so I manually filled in the
/stage?id=#attempt=# and that worked. I'm still having trouble with the
css as the
Hi all,
I'm still not clear on the best (or, indeed, ANY) way to add/subtract
two org.apache.spark.mllib.Vector objects in Scala.
Ok, I understand there was a conscious Spark decision not to support linear
algebra operations in Scala and leave it to the user to choose a linear
algebra library.
From what I have understood, you probably need to convert your vector to a
Breeze vector and do your operations there. Check
stackoverflow.com/questions/28232829/addition-of-two-rddmllib-linalg-vectors
On Aug 25, 2015 7:06 PM, Kristina Rogale Plazonic kpl...@gmail.com
wrote:
Hi all,
I'm still not clear
Thanks Ted Yu.
Following are the error messages:
1. The exception shown on the UI is:
Exception in thread Thread-113
Exception in thread Thread-126
Exception in thread Thread-64
Exception in thread Thread-90
Exception in thread Thread-117
Exception in thread Thread-80
Exception in