Hi, please check and let me know.
Kindly mail me at: abhis...@apetan.com

*Hadoop Developer*
*Raleigh, NC*
*(GC or USC or GC EAD)*
*Mode: Phone and Skype*
*Duration: 6+*

One last shot for you to fill my reqs - *I need 2 candidates with:*
- *No less than 10 years of experience in IT*
- *8+ years of hands-on Java development*
- *3+ years of Hadoop development*
- *Must have current experience with HBase, Spark and Python*

Candidates need to have 8+ years of strong Java experience and experience leading teams; prior work in a Solution Architect role is desirable. Strong expertise with Hadoop is required.

Job Responsibilities:
- Establish the framework for big data, including infrastructure setup, HDFS cluster design, organizing structured and unstructured data within HDFS, and writing scripts to read/write data from email servers and relational databases into HDFS.
- Review and analyze complex process, system and/or data requirements and specifications.
- Serve as the technical subject matter expert for systems.
- Serve as the primary designer for complex component designs for systems.
- Build, test, deploy, and document complex software components for systems.
- Install and configure Hadoop products.
- Build distributed, scalable, and reliable data pipelines that ingest and process data at scale and in real time.
- Collaborate with other teams to design and develop data tools that support both operations and product use cases.
- Source huge volumes of data from diversified data platforms into the Hadoop platform.
- Perform offline analysis of large data sets using components from the Hadoop ecosystem.
- Evaluate big data technologies and prototype solutions to improve our data processing architecture.
- A Java foundation and background is important.

Candidate Profile:
- 8+ years of hands-on programming experience, with 3+ years on the Hadoop platform
- Experience in building 'big data' processing frameworks
- Proficiency with Java and one of the scripting languages like Python / Scala, etc.
- Experience working with Hadoop, Hive, Pig
- Flair for data, schemas, and data models, and for bringing efficiency to the big data life cycle
- Ability to acquire, compute, store, and provision various types of datasets on the Hadoop platform
- Understanding of various visualization platforms (Tableau, QlikView, others)
- Strong object-oriented design and analysis skills
- Excellent technical and organizational skills
- Excellent written and verbal communication skills

Top skill sets / technologies:
- Java / Python / Scala
- Sqoop / Flume / Kafka / Pig / Hive / (Talend or Pentaho or Informatica or similar ETL) / HBase / NoSQL / MapReduce / Spark
- Data integration / data management / data visualization experience

Skills: Strong core Java skills, MapReduce programming, HiveQL, HBase, Cassandra, Pig, REST services programming; familiarity with Cloudera and Hortonworks. Should have worked on production-quality projects.

Thanks,
Abhishek
abhis...@apetan.com
Apetan Consulting LLC