You can set "livy.spark.master" to "local" and "livy.spark.deploy-mode"
to "client" to start Spark in local mode; in that case YARN is not
required.
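For reference, a minimal sketch of the relevant livy.conf entries for local mode (exact file location and any other settings depend on your Livy install):

```properties
# livy.conf — run the Spark driver locally, no YARN needed
livy.spark.master = local
livy.spark.deploy-mode = client
```

Note that local mode is incompatible with "livy.spark.deploy-mode = cluster", which is what you have set now.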

Otherwise, if you plan to run on YARN, you have to install Hadoop and
configure HADOOP_CONF_DIR in livy-env.sh.
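Something along these lines in livy-env.sh (the paths below are placeholders; adjust them to your installation):

```shell
# livy-env.sh — a minimal sketch for running against YARN
export SPARK_HOME=/path/to/spark
export HADOOP_CONF_DIR=/etc/hadoop/conf   # directory containing core-site.xml, yarn-site.xml
```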

On Thu, Oct 26, 2017 at 9:40 PM, Stefan Miklosovic <mikloso...@gmail.com>
wrote:

> Hi,
>
> I am running Livy server in connection with Spark without Hadoop. I am
> setting only SPARK_HOME and I am getting this in Livy UI logs after
> job submission.
>
> I am using pretty much standard configuration but
> livy.spark.deploy-mode = cluster
>
> Do I need to run with Hadoop installation as well and specify
> HADOOP_CONF_DIR?
>
> Isn't it possible to run Livy with "plain" Spark without YARN?
>
> stderr:
> java.lang.ClassNotFoundException:
> at java.lang.Class.forName0(Native Method)
> at java.lang.Class.forName(Class.java:348)
> at org.apache.spark.util.Utils$.classForName(Utils.scala:230)
> at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:712)
> at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
> Thanks!
>
> --
> Stefan Miklosovic
>