Hello,
I was unable to run the following commands from the Spark shell with CDH
5.0 and Spark 0.9.0; see below.
Once I removed the property

<property>
  <name>io.compression.codec.lzo.class</name>
  <value>com.hadoop.compression.lzo.LzoCodec</value>
  <final>true</final>
</property>

from the core-site.xml on the
Hello,
I currently deployed Spark 0.9.1 using a new way of starting it up:
exec start-stop-daemon --start --pidfile /var/run/spark.pid
--make-pidfile --chuid ${SPARK_USER}:${SPARK_GROUP} --chdir ${SPARK_HOME}
--exec /usr/bin/java -- -cp ${CLASSPATH}
I am a dork, please disregard this issue. I did not have the slaves
correctly configured; the error message is very misleading.
On Tue, Apr 15, 2014 at 11:21 AM, Paul Schooss paulmscho...@gmail.comwrote:
Has anyone got this working? I have enabled the properties for it in the
metrics.conf file and ensured that it is placed under Spark's home
directory. Any ideas why I don't see the Spark MBeans?
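For reference, a minimal sketch of the metrics configuration that should expose Spark's metric registries over JMX. This assumes Spark 0.9.x, which conventionally reads the file as metrics.properties from its conf directory (or from the path given by spark.metrics.conf); the sink class name is taken from the Spark metrics system and should be verified against your deployed version:

# metrics.properties -- assumed location: $SPARK_HOME/conf/
# Enable the JMX sink for all instances (master, worker, driver, executor).
*.sink.jmx.class=org.apache.spark.metrics.sink.JmxSink

With this in place you would then attach a JMX client such as jconsole to the Spark JVM and look for the metrics beans; if they still do not appear, check that the JVM was started with the conf directory on its classpath so the file is actually picked up.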