Greetings from Computech Corporation,

Position: Sr. ETL Consultant with Hadoop Experience
Location: Irvine, CA
Duration: 6 months, through the end of the year, with potential to extend

Client: Teradata Corporation


Please send me candidates with experience in both ETL and Hadoop.


Job Description:

The individual must understand the Hadoop ecosystem, the complex object
design, and the underlying data model of the system. The individual must
be comfortable developing data-centric applications using Hadoop tools,
Netezza, Informatica, Informatica BDE (Big Data Edition), Hive,
MapReduce, and Spark; must be able to develop ETL packages; and is
expected to develop queries and stored procedures involving complex
database structures and the Hadoop Distributed File System (HDFS). The
individual must have excellent communication skills, work well in a team
environment, enjoy solving complex problems, and be able to work in a
fast-paced environment.

Responsibilities:
• Define technical scope and objectives through research and participation
in requirements-gathering and definition of processes
• Gather and process raw, structured, semi-structured, and unstructured
data at scale, including writing scripts, developing programmatic
interfaces against web APIs, processing web logs, handling real-time
feeds, etc.
• Design, review, implement and optimize data transformation processes in
the Hadoop (primary) and Informatica ecosystems
• Test and prototype new data integration tools, techniques and
methodologies
• Adhere to all applicable development policies, procedures and standards
• Participate in functional test planning and testing for the assigned
application integrations, functional areas and projects.
• Work with the team in an Agile/SCRUM environment to ensure a quality
product is delivered
• Respond rapidly and work cross-functionally to deliver appropriate
resolution of technical, procedural, and operational issues.

Qualifications
Required Skills and Experience:
• A BS degree in Computer Science, a related technical field, or
equivalent work experience; Master's degree preferred.
Minimum of three years’ experience with the following:
• Experience architecting and integrating the Hadoop platform with
traditional RDBMS data warehouses.
• Experience with major Hadoop distributions like Cloudera (preferred),
HortonWorks, MapR, BigInsights, or Amazon EMR is essential.
• Experience with ETL tools such as Informatica
• Experience developing within the Hadoop platform including Java
MapReduce, Hive, Pig, and Pig UDF development.
• Excellent oral and written communication skills
• Excellent customer service skills
• Excellent analytical and problem-solving skills
• Working knowledge of Linux O/S environments

Preferred Skills & Experience:
• Experience with logical, 3NF, or dimensional data models.
• Experience with NoSQL databases such as HBase, Cassandra, Redis, and MongoDB.
• Experience with Hadoop ecosystem technologies such as Flume, Kafka, and
Spark.
• Experience with Netezza and Oracle.
• Experience with Informatica Big Data Edition.
• Certifications from Cloudera, HortonWorks and/or MapR.
• Knowledge of Java SE, Java EE, JMS, XML, XSL, Web Services and other
application integration related technologies
• Familiarity with Business Intelligence tools and platforms such as
Datameer, Platfora, Tableau, and MicroStrategy a plus.
• Experience in working in an Agile/SCRUM model.

Thanks and Regards,

Anand V
Computech Corporation
100 W Kirby St
Detroit, MI 48202
Phone: (313) 202-0929, Ext. 51168
Email: anand.v...@computechcorp.com
