*Job Title:* Data Warehouse Engineer

*Location:* Irvine, CA

*Duration:* 6+ months





*Responsibilities:*

• Define technical scope and objectives through research and participation in requirements gathering and process definition

• Gather and process raw, structured, semi-structured, and unstructured data at scale, including writing scripts, developing programmatic interfaces against web APIs, scraping web pages, and processing Twitter feeds

• Design, review, implement, and optimize data transformation processes in the Hadoop (primary) and Informatica ecosystems

• Test and prototype new data integration tools, techniques, and methodologies

• Adhere to all applicable development policies, procedures, and standards

• Participate in functional test planning and testing for the assigned application integrations, functional areas, and projects

• Work with the team in an Agile/Scrum environment to ensure a quality product is delivered

• Respond rapidly and work cross-functionally to resolve technical, procedural, and operational issues



*Qualifications*

*Required Skills and Experience:*

• A BS degree in Computer Science or a related technical field, or equivalent work experience; a Master's degree is preferred



*Minimum of one year of experience with the following:*



• Experience architecting and integrating the Hadoop platform with traditional RDBMS data warehouses

• Experience with major Hadoop distributions such as Cloudera (preferred), Hortonworks, MapR, BigInsights, or Amazon EMR is essential

• Experience developing on the Hadoop platform, including Java MapReduce, Hive, Pig, and Pig UDF development

• Excellent oral and written communication skills

• Excellent customer service skills

• Excellent analytical and problem-solving skills

• Working knowledge of the Linux OS environment



*Preferred Skills & Experience:*

• Experience with logical, 3NF, or dimensional data models

• Experience with data quality tools such as First Logic

• Experience with NoSQL databases such as HBase, Cassandra, Redis, and MongoDB

• Experience with Hadoop ecosystem technologies such as Flume

• Experience with Netezza and Oracle

• Experience with Informatica PowerCenter and Informatica Big Data Edition

• Certifications from Cloudera, Hortonworks, and/or MapR

• Knowledge of Java SE, Java EE, JMS, XML, XSL, web services, and other application integration technologies

• Familiarity with business intelligence tools and platforms such as Datameer, Platfora, Tableau, and MicroStrategy is a plus

• Experience working in an Agile/Scrum model





*Warm Regards,*

*Rohan Kumar*

Pyramid Consulting, Inc.

*O:* 770-255-3278

E: rohan.ku...@pyramidci.com
