!! *Hadoop Administrator - Need Local Candidates Only* !!

Position :       Hadoop Administrator
Location :       Detroit, MI
Duration :       12 Months


*Need Local Candidates Only - Face-to-Face Interview*


*Required Skills and Experience:*

   - 7+ years of experience as an IT professional
   - 5+ years as a Linux administrator
   - 3+ years as a Hadoop administrator
   - Prior experience in a complex, highly integrated services environment
   - Advanced-level working knowledge of Red Hat Linux
   - Setting the stage for MapReduce - computing daemons and dissecting a
   MapReduce job
   - Planning for backup, recovery, and security - coping with inevitable
   hardware failures and securing your Hadoop cluster
   - Defining Hadoop cluster requirements - planning, designing, and
   architecting a scalable cluster and selecting appropriate hardware
   - Determining the correct hardware and infrastructure for a multi-node
   cluster
   - Preparing HDFS - setting basic configuration parameters and configuring
   block allocation, redundancy, and replication
   - Pre- and post-installation of Hadoop software and a good understanding
   of Hadoop ecosystem dependencies
   - Deploying MapReduce - installing and setting up the MapReduce
   environment and delivering redundant load balancing via rack awareness
   - Creating a fault-tolerant file system, isolating single points of
   failure, and maintaining high availability
   - Implementing data ingress and egress - facilitating generic
   input/output, moving bulk data into and out of Hadoop, and transmitting
   HDFS data over HTTP with WebHDFS
   - Expertise in setting up, configuring, and managing security for Hadoop
   clusters using Kerberos
   - Maintaining a cluster - employing the standard built-in tools and
   managing and debugging processes using JVM metrics
   - Ongoing support for various Hadoop environments - DEMO, TEST, UAT,
   and PROD
   - Configuring the scheduler to provide service-level agreements for
   multiple users of a cluster
   - Databases used by Hadoop - backup, security, performance monitoring,
   and tuning
   - Ability to deploy a Hadoop cluster, add and remove nodes, keep track
   of jobs, and monitor critical parts of the cluster
   - Familiarity with Cloudera, Kafka, Spark, data lakes, and Flume is
   desired
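For candidates brushing up on the HDFS items above: block replication and
block-size settings are typically managed in hdfs-site.xml. A minimal sketch
follows - the property names are standard Hadoop configuration keys, while
the values and the storage path are illustrative assumptions only:

```xml
<!-- hdfs-site.xml: illustrative values, adjust for your cluster -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>3</value> <!-- replication factor for new blocks -->
  </property>
  <property>
    <name>dfs.blocksize</name>
    <value>134217728</value> <!-- 128 MB block size -->
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>/data/hadoop/namenode</value> <!-- hypothetical local path -->
  </property>
</configuration>
```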





Regards,
Chris Roe - Resource Development Manager
Desk No : 415-251-3968 | Email : ch...@itbtalent.com

-- 
You received this message because you are subscribed to the Google Groups 
"oraapps" group.