*We have an urgent opening for a Sr. Hadoop Admin; the job description is below. Please go through it and let me know if you are comfortable with it, and send me your consultant's updated resume ASAP.*
*Sr. Hadoop Design, Implementation and Support Admin (some architectural experience needed) | Bowie, MD*
*Long-term Contract*
*In case of H1B, a visa copy is needed*

We have immediate openings for technical professionals in the Big Data arena with our direct client in the DC/Maryland area. We develop cutting-edge software solutions that are helping to revolutionize the informatics industry. We are seeking technical and business professionals with advanced leadership skills to join our tight-knit team at our headquarters in Maryland. This is an opportunity to work with fellow best-in-class IT professionals to deploy new business solutions utilizing the latest Big Data technologies, including a wide array of open-source tools. This position requires extensive experience on the Hadoop platform using Sqoop, Pig, Hive, and Flume to design, build, and support highly scalable data processing pipelines.

Hadoop Administrator Responsibilities:
- Work with Data Architects to plan and deploy new Hadoop environments and expand existing Hadoop clusters.
- Design Big Data solutions capable of supporting and processing large sets of structured, semi-structured, and unstructured data.
- Provide administration, management, and support for large-scale Big Data platforms on the Hadoop ecosystem.
- Provide Hadoop cluster capacity planning, maintenance, performance tuning, and troubleshooting.
- Install, configure, support, and manage Hadoop clusters using Apache and Cloudera (CDH3, CDH4) distributions with YARN.
- Install and configure Hadoop ecosystem components such as Sqoop, Pig, Hive, Flume, and HBase, and Hadoop daemons on the cluster.
- Monitor and follow proper backup and recovery strategies for high availability.
- Configure various property files such as core-site.xml, hdfs-site.xml, and mapred-site.xml.
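For context, the property files named above share a common XML layout. As a minimal illustrative sketch (the path under `conf/` and the values below are examples, not the client's actual settings), an hdfs-site.xml might look like:

```shell
# Hypothetical sketch: write a minimal hdfs-site.xml into a local conf/
# directory. Property names are standard Hadoop keys; values are examples.
mkdir -p conf
cat > conf/hdfs-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
  <!-- Block replication factor; 3 is the common production default -->
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
  <!-- Local directory where the NameNode stores its metadata -->
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:///var/lib/hadoop/name</value>
  </property>
</configuration>
EOF
grep -c '<property>' conf/hdfs-site.xml   # → 2
```

In a real deployment these files live under the Hadoop configuration directory (e.g. /etc/hadoop/conf on CDH) and changes generally require restarting the affected daemons to take effect.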
- Monitor multiple Hadoop cluster environments using Ganglia and Nagios, and monitor workload, job performance, and capacity using Cloudera Manager.
- Define and schedule all Hadoop/Hive/Sqoop/HBase jobs.
- Import and export data from web servers into HDFS using various tools.

Required Skills:
- Extensive experience in Business Intelligence, data warehousing, analytics, and Big Data.
- Experience with hardware architectural guidance, planning and estimating cluster capacity, and creating roadmaps for Hadoop cluster deployment.
- Expertise in the design, installation, configuration, and administration of Hadoop ecosystem components such as Sqoop, Pig, Hive, Flume, and HBase, and Hadoop daemons on the cluster.
- Working knowledge of capacity planning, performance tuning, and optimizing the Hadoop environment.
- Experience in HDFS data storage and support for running MapReduce jobs.
- Experience in commissioning, decommissioning, balancing, and managing nodes on Hadoop clusters.
- Experience with Hadoop cluster capacity planning, maintenance, performance tuning, and troubleshooting.
- Good understanding of partitioning concepts and the different file formats supported in Hive and Pig.
- Experience in importing and exporting data using Sqoop between HDFS and relational database systems/mainframes.
- Hands-on experience with data analytics tools such as Splunk, Cognos, and Tableau.

*Garima Gupta | Technical Recruiter | Apetan Consulting LLC*
*Tel: 201-620-9700 ext. 133* | gar...@apetan.com | garimaapetan...@gmail.com

--
You received this message because you are subscribed to the Google Groups "Citrix and Sap problems" group.
To unsubscribe from this group and stop receiving emails from it, send an email to citrix-and-sap-problems+unsubscr...@googlegroups.com.
To post to this group, send email to citrix-and-sap-problems@googlegroups.com.
Visit this group at https://groups.google.com/group/citrix-and-sap-problems.
For more options, visit https://groups.google.com/d/optout.