*Hadoop Administrator*

*Location - Charlotte, NC *

*Duration - 8 months *

*Rate - $50/hr on C2C, all inclusive*

*Only GC, EAD GC and USC*



Description -

We are looking for an energetic, high-performing System Engineer to join the
big data team. This position is accountable for delivering the
infrastructure solutions for assigned big data applications throughout the
complete use case lifecycle. Responsibilities include leading the design and
engineering of infrastructure solutions, owning their implementation and
production rollout, and training the production staff for steady-state
support. The infrastructure solutions delivered must be resilient, scalable,
secure, and high-performing, and must meet all functional and
non-functional requirements. You will work hand in hand with our principal
Hadoop architect and top-notch Hadoop developers, data scientists, and data
engineers to implement new solutions such as Kafka and Storm to realize our
big data vision.



Minimum Qualifications:

• 5+ years of solution architecture and/or hands-on OS system
administration experience

• Strong ability to drive complex technical solutions deployed at an
enterprise level; ability to drive big data technology adoption and changes
through education and partnership with stakeholders

• Ability to negotiate, resolve, and prioritize complex issues and provide
explanations and information to others on difficult issues

• Ability to estimate and organize own work to meet or negotiate deadlines;
lead and facilitate the creation of estimates

• Self-starter who can work with minimal guidance while maintaining strong
communication

• Demonstrated experience in the architecture, engineering, and
implementation of enterprise-grade production configurations and setups

• Knowledge of the Hadoop ecosystem

• In-depth knowledge of Java



Preferred Qualifications:

• Expert-level knowledge of and experience with Kafka, Storm, and Flume

• Extensive knowledge of Hadoop architecture and HDFS

• Hands-on experience with MapReduce, Hive, Pig, Java, HBase, Solr, and the
following Hadoop ecosystem products: Sqoop, Oozie, and/or Spark

• Hands-on delivery experience with popular Hadoop distribution platforms
such as Cloudera, Hortonworks, or MapR

• Hands-on experience in the architectural design and solution
implementation of large-scale big data use cases

• Understanding of industry patterns for big data solutions

• Strong awareness and understanding of emerging trends and technologies in
Hadoop and the finance industry

• Demonstrated experience working with vendors and user communities to
research and test new technologies that enhance the technical capabilities
of the existing Hadoop cluster

• Demonstrated experience working with Hadoop architects and big data users
to implement new Hadoop ecosystem technologies in support of a multi-tenant
cluster

• Understanding of NoSQL technologies

• Shell Scripting, Python, Java and/or C/C++ programming experience

• Bachelor’s degree in Computer Science or a related technical field

• Certifications in system administration or Hadoop



*Thanks and Regards*

*Shweta Ojha*

Sr. Technical Recruiter | Sage Group Consulting

*Direct: *732-784-6492

*Email: *so...@sagetl.com

*Gtalk: *sojha1290*|Ymail: *ojha1290
