Hi Jitendra,

What version of Ambari and HDP are you running?
You just need to install the Oozie server on any one host, and then pick the hosts for the clients.
In HDP 2.3, it's possible to have multiple Oozie servers for High Availability.

HDP binaries are in /usr/hdp/current/spark-server/bin.
Note that /usr/hdp/current/spark-server is a symlink to /usr/hdp/2.#.#.#-####/spark.
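
To answer your second question below: once the clients are installed, you can launch jobs with the usual command-line tools from those bin directories. A rough sketch (the example jars below are the ones HDP usually ships, but the exact paths may differ on your cluster, and depending on the version the Spark symlink may be spark-client rather than spark-server):

  # Find the Spark symlinks present on your hosts
  ls -l /usr/hdp/current/ | grep spark

  # Spark: submit the bundled SparkPi example to YARN
  /usr/hdp/current/spark-client/bin/spark-submit \
    --class org.apache.spark.examples.SparkPi \
    --master yarn-client \
    /usr/hdp/current/spark-client/lib/spark-examples*.jar 10

  # MapReduce: run the bundled wordcount example
  # (replace <you> and the input/output paths with your own HDFS paths)
  hadoop jar /usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar \
    wordcount /user/<you>/input /user/<you>/output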

Thanks,
Alejandro

From: Jeetendra G <jeetendr...@housing.com>
Reply-To: "user@ambari.apache.org" <user@ambari.apache.org>
Date: Thursday, August 27, 2015 at 4:06 AM
To: "user@ambari.apache.org" <user@ambari.apache.org>
Subject: Running spark and map reduce jobs

Hi All,
I have installed Ambari, and with Ambari I have installed Hadoop, Spark, Hive, and Oozie.
While installing Oozie, it asked me where in my cluster Oozie is needed, i.e., on how many nodes.
I don't really understand why it asks which nodes Oozie should be installed on; shouldn't it just install on any one node?


Also, how can I run my MapReduce and Spark jobs?

Where does Ambari install the binaries of the installed packages? In which /bin directory?


Regards
Jeetendra
