*Job Title: Hadoop Developer*
*Location: Buffalo or Schenectady, NY*
*Duration: 6-12 months+ (with extensions)*

*Job Description*
• Responsible for designing the data intake into Hadoop and making it available to the business in a queryable format
• Responsible for implementation and ongoing technical support of the Hadoop ecosystem (including access, incident, and problem management); provide technical leadership and collaborate with developers and architects on implementations on the Hadoop platform
• Design and configuration of the Hadoop platform and various associated components (including 3rd-party tools) for data ingestion, transformation, migration, processing, and reporting
• Work with the infrastructure, network, database, application, and business intelligence teams to achieve high data quality, performance, availability, and security of the platform
• Mentor other Hadoop developers and administrators
• Document the design and implementation processes for ongoing support
• Assist in optimizing and integrating new infrastructure via continuous integration methodologies
• Set up and maintain CI/CD application server environments and pipelines with tools and technologies such as Docker, Jenkins, and Kubernetes
• Follow company policies, procedures, controls, and processes for the job
• In addition to the above key responsibilities, you may be required to undertake other duties from time to time as the Company may reasonably require

*Required Skills:*
• A minimum of a bachelor's degree in computer science or equivalent
• 6+ years of experience working on a Hadoop distribution such as Cloudera/Hortonworks, with hands-on coding in Apache Hadoop, Spark, Kafka, Hive, Pig, and Drill
• Experience with data lineage and data tagging following a data-driven security model
• Experience with NiFi, Spark Streaming, Elasticsearch, TensorFlow, and PyTorch
• Strong knowledge of relational databases (Oracle, SQL Server, Postgres) and expert-level SQL
• Experience with languages such as Python, Go, and Java is required
• Experience with Agile, DevOps, and GitOps automation
• Proficiency with cloud computing virtualization technologies, storage architecture, and AWS/Azure technologies
• Knowledge of working with various Hadoop connectors
• Healthcare knowledge is an advantage
• Must have experience with source control tools
• Must have strong problem-solving and analytical skills
• Must have the ability to identify complex problems, review related information, develop and evaluate options, and implement solutions
• Kubernetes/containerization experience

*Regards,*
*Anudeep | Sr. Manager - Recruitment*
*770-838-3849 | Email: anud...@cysphere.net*
*CYBER SPHERE LLC*
*An E-Verify Company*
*Website: www.cysphere.net*
--
You received this message because you are subscribed to the Google Groups "CorptoCorp" group. To unsubscribe from this group and stop receiving emails from it, send an email to corptocorp+unsubscr...@googlegroups.com. To view this discussion on the web visit https://groups.google.com/d/msgid/corptocorp/CAJOb5BZAqSKg59BbwmOchU6%2ByQ94Swq%3DjvBr1Xh-W5BWDve%3Drw%40mail.gmail.com.