Didn't really edit the configs much, but here's what the spark-env.sh
is:
#!/usr/bin/env bash
##
# Generated by Cloudera Manager and should not be modified directly
##
export SPARK_HOME=/opt/cloudera/parcels/CDH-5.2.0-1.cdh5.2.0.p0.36/lib/spark
export
Question ... when you say different versions, do you mean different versions of
the dependency files? What are the dependency files for Spark?
On Tue Nov 18 2014 at 5:27:18 PM Anson Abraham anson.abra...@gmail.com
wrote:
When the CDH cluster was running, I did not set up the Spark role. When I did for
the first
Hi Anson,
We've seen this error when incompatible classes are used in the driver
and executors (e.g., same class name, but the classes are different
and thus the serialized data is different). This can happen, for
example, if you're including some 3rd-party libraries in your app's
jar, or changing
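The mismatch Marcelo describes comes down to how Java serialization works: every serializable class is stamped with a serialVersionUID, and if the driver and an executor load different versions of a class with the same name, deserialization on the executor side fails (typically with java.io.InvalidClassException). A minimal standalone sketch of the mechanism; the `Record` class here is hypothetical, just a stand-in for any class shipped in an application jar:

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream,
  ObjectInputStream, ObjectOutputStream, ObjectStreamClass}

// Hypothetical stand-in for any class shipped in the application jar.
// Scala case classes are Serializable by default.
case class Record(id: Int, name: String)

// Java serialization stamps each class with a serialVersionUID.
// If driver and executor jars contain *different* versions of Record,
// the UIDs (or field layouts) usually differ, and the executor's
// deserialization fails with java.io.InvalidClassException.
val uid = ObjectStreamClass.lookup(classOf[Record]).getSerialVersionUID
println(s"serialVersionUID of Record: $uid")

// The round trip below succeeds only because a single JVM, and thus a
// single class definition, is used for both writing and reading.
val bytes = new ByteArrayOutputStream()
new ObjectOutputStream(bytes).writeObject(Record(1, "ok"))
val restored = new ObjectInputStream(
  new ByteArrayInputStream(bytes.toByteArray)).readObject()
println(s"restored: $restored")
```

In the cluster case there is no single JVM: the driver writes with its copy of the class and each executor reads with its own, which is why mismatched jars surface as serialization errors rather than something more obvious.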
Yeah, but in this case I'm not building any files, just deployed the config
files in CDH 5.2 and initiated a spark-shell to just read and output a file.
On Wed Nov 19 2014 at 4:52:51 PM Marcelo Vanzin van...@cloudera.com wrote:
Hi Anson,
We've seen this error when incompatible classes are used
On Wed, Nov 19, 2014 at 2:13 PM, Anson Abraham anson.abra...@gmail.com wrote:
Yeah, but in this case I'm not building any files, just deployed the config
files in CDH 5.2 and initiated a spark-shell to just read and output a file.
In that case it is a little bit weird. Just to be sure, you are
Yeah, CDH distribution (1.1).
On Wed Nov 19 2014 at 5:29:39 PM Marcelo Vanzin van...@cloudera.com wrote:
On Wed, Nov 19, 2014 at 2:13 PM, Anson Abraham anson.abra...@gmail.com
wrote:
Yeah, but in this case I'm not building any files, just deployed the
config
files in CDH 5.2 and initiated a
Sorry, meant CDH 5.2 w/ Spark 1.1.
On Wed, Nov 19, 2014, 17:41 Anson Abraham anson.abra...@gmail.com wrote:
Yeah, CDH distribution (1.1).
On Wed Nov 19 2014 at 5:29:39 PM Marcelo Vanzin van...@cloudera.com
wrote:
On Wed, Nov 19, 2014 at 2:13 PM, Anson Abraham anson.abra...@gmail.com
wrote:
As Marcelo mentioned, the issue occurs mostly when incompatible classes are
used by the executors or the driver. Check whether the output comes through in
spark-shell. If it does, then most probably there is some issue with your
configuration files. It would be helpful if you can paste
the
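One quick way to test the incompatible-versions theory is to ask each executor which Spark version it loaded and compare that with the driver. This is a sketch, not standalone code: it assumes a live spark-shell session where `sc` is the provided SparkContext, and that `org.apache.spark.SPARK_VERSION` is available in your Spark build (it should be in 1.1, but verify against your version):

```scala
// Run inside spark-shell; `sc` is the shell's SparkContext.
val driverVersion = sc.version

// Each task reports the Spark version found on its executor's classpath;
// distinct() collapses duplicates, so more than one entry in the result
// means the executors are not all running the same build as each other.
val executorVersions = sc.parallelize(1 to 100, 10)
  .map(_ => org.apache.spark.SPARK_VERSION)
  .distinct()
  .collect()

println(s"driver: $driverVersion, executors: ${executorVersions.mkString(", ")}")
```

If the executor versions disagree with the driver's, that points at stale parcels or leftover jars on some nodes rather than at your job code.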
I'm essentially loading a file and saving output to another location:
val source = sc.textFile("/tmp/testfile.txt")
source.saveAsTextFile("/tmp/testsparkoutput")
When I do so, I'm hitting this error:
14/11/18 21:15:08 INFO DAGScheduler: Failed to run saveAsTextFile at &lt;console&gt;:15
It can be a serialization issue.
That happens when there are different versions installed on the same system.
What do you mean by "the first time you installed and tested it out"?
On Wed, Nov 19, 2014 at 3:29 AM, Anson Abraham anson.abra...@gmail.com
wrote:
I'm essentially loading a file and saving