This could be caused by many things, including a wrong configuration. It's
hard to tell with just the information you provided.
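
One guess, purely speculative: the truncated "class name" in your error
(s.rolling.maxRetainedFiles=3) looks like the tail end of the Spark property
spark.executor.logs.rolling.maxRetainedFiles, which suggests a -D option that
got split on whitespace somewhere in your launch configuration, so the JVM
treated the fragment after the split as the main class. A sketch of what that
might look like in conf/spark-env.sh (the broken line is hypothetical, just to
illustrate the failure mode):

```shell
# conf/spark-env.sh (sketch)

# If the option string gets mangled or split — e.g. a stray character or an
# unquoted space before/inside the -D flag — the launcher can end up passing
# the trailing fragment "...s.rolling.maxRetainedFiles=3" to the JVM as if it
# were the main class, producing exactly the error you saw.

# Safer: quote the whole options string so it stays a single shell word.
SPARK_WORKER_OPTS="-Dspark.executor.logs.rolling.maxRetainedFiles=3"
```

Worth checking spark-env.sh and any wrapper scripts for how the worker's JVM
options are quoted.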

Is there any reason why you want to use your own Spark instead of the
one shipped with CDH? CDH 5.3 has Spark 1.2, so unless you really need
to run Spark 1.1, you're better off with the CDH version.

On Wed, Jan 7, 2015 at 4:45 PM, freedafeng <freedaf...@yahoo.com> wrote:
> Hi,
>
> I installed the CDH 5.3.0 core + HBase on a new EC2 cluster. Then I manually
> installed Spark 1.1 on it, but when I started the slaves, I got an error as
> follows:
>
> ./bin/spark-class org.apache.spark.deploy.worker.Worker spark://master:7077
> Error: Could not find or load main class s.rolling.maxRetainedFiles=3
>
> Spark was compiled against Hadoop 2.5 + HBase 0.98.6, as in CDH 5.3.0.
> Is the error caused by some mysterious conflict somewhere? Or should I use
> the Spark in CDH 5.3.0 to be safe?
>
> Thanks!
>
>
>
>
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/spark-1-1-got-error-when-working-with-cdh5-3-0-standalone-mode-tp21022.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>



-- 
Marcelo
