Hi,

Please let me know if you are interested in this role.

*If interested, please send your updated resume along with the information below.*

*Full Name:*

*Email:*

*Phone:*

*Location:*

*Visa:*

*Rate:*

*Face to Face:*



*Role: Hadoop Administrator*

*Location: Cincinnati, OH*

*Duration: 4-6 months*

*MOI: T + S*

*Client: 84.51*

*GC-EAD, GC or USC*



*Hadoop Admin Responsibilities:*

§  Responsible for implementation and ongoing administration of Hadoop
infrastructure.

§  Aligning with the systems engineering team to propose and deploy new
hardware and software environments required for Hadoop and to expand
existing environments.

§  Working with data delivery teams to set up new Hadoop users. This includes setting up Linux users, creating Kerberos principals, and testing HDFS, Hive, Pig, and MapReduce access for the new users (a minimal access-check sketch follows this list).

§  Cluster maintenance, as well as creation and removal of nodes, using tools such as Ganglia, Nagios, Cloudera Manager Enterprise, Dell OpenManage, and others.

§  Performance tuning of Hadoop clusters and Hadoop MapReduce routines.

§  Screen Hadoop cluster job performance and capacity planning.

§  Monitor Hadoop cluster connectivity and security.

§  Manage and review Hadoop log files.

§  File system management and monitoring.

§  HDFS support and maintenance.

§  Diligently teaming with the infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability.

§  Collaborating with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.

§  Point of contact for vendor escalation.
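
For context only (this is not part of the client's job description): a minimal sketch of how a newly provisioned user's Kerberized HDFS access might be verified with the standard Hadoop Java API. The principal name, realm, keytab path, and home directory below are illustrative assumptions, not values from this posting.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.UserGroupInformation;

    public class NewUserHdfsCheck {
        public static void main(String[] args) throws Exception {
            // Picks up core-site.xml / hdfs-site.xml from the classpath.
            Configuration conf = new Configuration();
            conf.set("hadoop.security.authentication", "kerberos"); // Kerberized cluster
            UserGroupInformation.setConfiguration(conf);
            // Assumed principal and keytab path, placeholders for illustration only.
            UserGroupInformation.loginUserFromKeytab(
                    "newuser@EXAMPLE.COM", "/etc/security/keytabs/newuser.keytab");
            try (FileSystem fs = FileSystem.get(conf)) {
                Path home = new Path("/user/newuser"); // assumed home directory
                System.out.println("Home directory exists: " + fs.exists(home));
            }
        }
    }

In practice the same check is often done from the shell with kinit followed by an hdfs dfs -ls of the user's home directory.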



*DBA Responsibilities Performed by the Hadoop Administrator:*

§  Data modelling, design & implementation based on recognized standards.

§  Software installation and configuration.

§  Database backup and recovery.

§  Database connectivity and security.

§  Performance monitoring and tuning.

§  Disk space management.

§  Software patches and upgrades.

§  Automation of manual tasks.



*Skills Required:*

§  Experience with *Kerberos security* is a must.

§  *Cloudera – BDA cloud compliance knowledge is required*

§  *Impala, Hive, Oozie* are a plus.

§  General operational expertise, such as good troubleshooting skills and an understanding of *system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networks*.

§  *Hadoop skills like HBase, Hive, Pig, Mahout, etc.*

§  The most essential requirements: the candidate should be able to deploy a Hadoop cluster, add and remove nodes, keep track of jobs, monitor critical parts of the cluster, configure NameNode high availability, schedule and configure it, and take backups (a brief capacity-check sketch follows this list).

§  Good knowledge of *Linux*, as Hadoop runs on Linux.

§  Familiarity with open source configuration management and deployment
tools such as Puppet or Chef and Linux scripting.

§  Knowledge of troubleshooting core Java applications is a plus.
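
As a small illustration of the cluster-monitoring and capacity-planning skills listed above (again, a sketch under assumptions rather than the client's actual tooling): aggregate HDFS capacity can be read through the Hadoop Java API, assuming the cluster's client configuration is on the classpath.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.FsStatus;

    public class ClusterCapacityCheck {
        public static void main(String[] args) throws Exception {
            // fs.defaultFS is taken from the cluster's core-site.xml on the classpath.
            Configuration conf = new Configuration();
            try (FileSystem fs = FileSystem.get(conf)) {
                // Aggregate capacity, used, and remaining bytes for the filesystem.
                FsStatus status = fs.getStatus();
                double usedPct = 100.0 * status.getUsed() / status.getCapacity();
                System.out.printf("Capacity: %d bytes, Used: %d bytes (%.1f%%), Remaining: %d bytes%n",
                        status.getCapacity(), status.getUsed(), usedPct, status.getRemaining());
            }
        }
    }

The same figures are normally also visible in Cloudera Manager or via hdfs dfsadmin -report; the Java version is shown only to match the skills named above.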









*--------*

*Thanks,*

*Bhanu Pratap Singh*

*Source InfoTech Inc.*

W: *609-917-3310* | F: 732-909-2282 | Email: *bh...@sourceinfotech.com* | Website: *www.sourceinfotech.com*
