There is nothing special that you need to do if you are already running
secure MapReduce jobs. The client needs to run in a Kerberos-authenticated
context. After that, if you are using the built-in library of
inputs/outputs etc., they should take care of all the access credentials
for you when using the 0.5 API.



If you are using the 0.4 API to write your job then you may need to use
additional APIs for passing credentials to the application. Look at how
credentials are handled in
https://github.com/apache/tez/blob/branch-0.4.0-incubating/tez-mapreduce-examples/src/main/java/org/apache/tez/mapreduce/examples/FilterLinesByWord.java
and also at public synchronized DAG addURIsForCredentials(Collection<URI> uris).



The second method is a shortcut if you are using HDFS files for input. It
obtains credentials for you from a collection of HDFS input URIs.
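A minimal sketch of that shortcut (the NameNode address and paths here are
just placeholders):

  import java.net.URI;
  import java.util.Arrays;
  import java.util.Collection;

  import org.apache.tez.dag.api.DAG;

  public class DagCredentialsSketch {
    public static void addInputCredentials(DAG dag) {
      // Placeholder HDFS locations -- use your job's real input URIs.
      Collection<URI> inputUris = Arrays.asList(
          URI.create("hdfs://namenode:8020/data/in1"),
          URI.create("hdfs://namenode:8020/data/in2"));

      // Tez obtains the credentials for these HDFS URIs and ships them
      // with the DAG when it is submitted.
      dag.addURIsForCredentials(inputUris);
    }
  }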



Bikas



*From:* Subroto Sanyal [mailto:[email protected]]
*Sent:* Tuesday, August 19, 2014 3:30 AM
*To:* [email protected]
*Subject:* Tez with secured hadoop



hi



Tez has worked on secure Hadoop clusters since tez-0.3.

Is there any documentation available about configuring TezClient to make it
work?



-- 
Cheers,
*Subroto Sanyal*

