Bronx, NY

Center Light Health Systems

6-Month Contract-to-Hire (C2H)



*Will consider hiring off a Skype interview. A face-to-face (F2F)
interview may not be required for a great candidate.*

*MUST HAVE: Talend, ETL, SSIS, and Data Warehouse experience.*



This Senior ETL Developer (Talend) will work as a member of one of the
EDW sprint delivery teams. This individual is responsible for working
closely with team members to analyze data feeds received from business
partners and data vendors, helping with data flow architecture and other
technical solution design, and developing and enhancing ETL packages in
Talend, SSIS, and SQL stored procedures to integrate data into the systems
that best support customers and analytical activities.



Job Responsibilities:

•Lead technical discussions of ETL workflow design, review, performance
tuning, and troubleshooting, and provide solutions to the Scrum team lead
and product owner.

•Analyze data feed requirements received from the Business/IT units, help
translate business requirements into technical design specifications, and
prepare high-level documentation for the database and ETL workflow design.

•Develop SQL queries or use available data profiling applications to
perform data analysis based on the business requirements, and create data
profiling reports for data discovery.

 •Develop ETL sessions and packages to extract, transform and load data
from source to target through multiple stages defined by the system
architectures and mapping documents.

•Apply pattern checks and business logic, and integrate with other ETL
workflows to handle data anomalies in incoming data files and ensure
data quality and accuracy in the target applications.

•Implement referential integrity in the ETL sessions according to the
logical model and data mapping. Ensure that data validation processes,
error handling, and logging are part of any ETL implementation.
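The validation and error-handling pattern asked for above can be sketched
in a few lines. This is a hypothetical illustration only (the `load_facts`
helper and the field names are invented, not from the posting): bad rows
are routed to a reject log with a reason, rather than failing the batch.

```python
# Hypothetical sketch: a load step that enforces referential integrity
# against a dimension lookup and routes rejects to an error log.

def load_facts(rows, valid_member_ids):
    """Split incoming fact rows into loadable rows and rejects."""
    loaded, rejects = [], []
    for row in rows:
        if row.get("member_id") in valid_member_ids:
            loaded.append(row)
        else:
            # Error handling/logging: keep the bad row and the reason
            rejects.append({"row": row, "error": "unknown member_id"})
    return loaded, rejects

rows = [{"member_id": 1, "amount": 10.0},
        {"member_id": 99, "amount": 5.0}]
loaded, rejects = load_facts(rows, valid_member_ids={1, 2, 3})
```

In a real Talend job the same split would typically be a tMap lookup with
a reject output flow feeding a log table.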

•Perform analysis of existing data flow processes and related stored
procedures with emphasis on improving performance.

•Support the creation of test plans, conduct unit tests of ETL mappings,
stored procedures, and data validation scripts, and assist with regression
testing to ensure data integrity across the data application systems.

•Create deployment packages, specify job dependencies, participate in
deployment review meetings, and assist with code deployment in both QA and
production environments. Provide post-release production support as needed.

•Develop SQL scripts to research data and create ad-hoc reports for
business partners, and provide technical support for internal business
users.



Technical Experience and Qualifications

•Expert in Extracting, Transforming, and Loading (ETL) data using Talend;
experience with multiple GUI-based data integration applications such as
Informatica, DataStage, and SSIS is also valued.

•Experience creating mappings and workflows to extract and load data
from relational databases, flat file sources, healthcare industry
standard file formats, and legacy systems.

•Experience in development, implementation, administration and support of
ETL processes for large-scale data warehouses.

•Knowledge of multi-dimensional models; experience implementing slowly
changing dimensions (SCD) and change data capture (CDC) procedures.
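As a reminder of what a Type 2 SCD load does, here is a minimal,
hypothetical sketch (the row layout and `scd2_apply` helper are invented
for illustration): when a tracked attribute changes, the current
dimension row is end-dated and a new versioned row is appended, so
history is preserved.

```python
from datetime import date

# Hypothetical Type 2 slowly changing dimension update: close the
# current row and insert a new versioned row when an attribute changes.

def scd2_apply(dim, key, new_attrs, today):
    """dim: list of rows with a natural key, attrs, and effective dates."""
    for row in dim:
        if row["key"] == key and row["end_date"] is None:
            if row["attrs"] == new_attrs:
                return  # no change, nothing to do
            row["end_date"] = today  # close the old version
            break
    dim.append({"key": key, "attrs": new_attrs,
                "start_date": today, "end_date": None})

dim = [{"key": "M1", "attrs": {"plan": "gold"},
        "start_date": date(2013, 1, 1), "end_date": None}]
scd2_apply(dim, "M1", {"plan": "silver"}, date(2014, 1, 1))
```

A Type 1 change would instead overwrite `attrs` in place, keeping no
history.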

•Experience using lookups and handling multiple database connections;
able to merge multiple source files/types into a single target location.
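Merging heterogeneous source files into one target layout, as asked for
above, can be sketched like this (purely illustrative; the file contents
and field names are invented):

```python
# Hypothetical sketch: merge two source "files" of different formats
# into one common target layout, as a Talend tMap/tUnite flow would.
import csv
import io

claims_csv = "claim_id,member,amount\nC1,Alice,100\n"
payments_psv = "P1|Bob|50\n"  # pipe-delimited legacy extract

target = []
for row in csv.DictReader(io.StringIO(claims_csv)):
    target.append({"id": row["claim_id"], "name": row["member"],
                   "amount": float(row["amount"]), "source": "claims"})
for line in payments_psv.strip().splitlines():
    pid, name, amount = line.split("|")
    target.append({"id": pid, "name": name,
                   "amount": float(amount), "source": "payments"})
```

The point is that each source gets its own parse step, but all rows land
in one conformed target structure tagged with their origin.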

•Experience developing SQL and stored procedures; able to write DDL for
tables, views, and indexes, and use triggers and functions to manipulate
data.

•Experience in data validation, data cleansing, and data mining;
experience building matching processes and large product reference tables.

•5 years of experience with SQL Server or other relational databases,
including writing SQL, building tables, and developing stored procedures.

•3+ years of development experience using SSIS, Informatica, DataStage,
or other data integration software. Hands-on experience with Talend, SQL
Server, and related BI applications; AWS or cloud-based systems a plus.

 • Agile experience is highly preferred.



I have 2 Talend Developers there now, and I have copied part of one of
their resumes below for your reference. I hope it helps.


*Sr. Talend/ETL Developer         12/12 - 03/14*
*Responsibilities*:
· As a member of the ETL team, was involved in gathering information,
determining the overall ETL architecture, researching the affected data
structures, determining data quality, establishing metrics, and developing
a full-lifecycle ETL plan.
· Developed complex *Talend ETL jobs* to migrate the data from flat files
to database.
· Implemented *custom error handling* in *Talend* jobs and also worked on
different methods of logging.
· *Created ETL/Talend jobs*, both design and code, to process data into
target databases.
· *Created Talend jobs* to load data into various Oracle tables. Utilized
Oracle stored procedures and wrote *Java code* to capture *globalMap
variables* and use them in the job.
· *Created Talend jobs* to copy files from one server to another,
utilizing *Talend FTP components*.
· Experienced in PERL, Mod Perl, PERL regex, and object-oriented PERL.
· Loaded data from SQL server tables to Mainframe using Power Exchange.
· Extracted data from Mainframe databases using Power Exchange and loaded
into SQL Server Database tables.
· Created *Implicit, local and global Context variables* in the job.
· Responsible for creating *fact, lookup*, *dimension*, and staging
tables, and other database objects like views, stored procedures,
functions, indexes, and constraints.
· Followed the organization's defined naming conventions for naming the
flat file structures, *Talend jobs*, and the daily batches for executing
the *Talend* jobs.
· Wrote complex SQL queries to pull data from various sources and
integrated them with Talend.
· Worked on *Talend Administration Console (TAC)* for scheduling jobs and
adding users.
· Worked on *Context variables* and defined contexts for database
connections, file paths for easily migrating to different environments in a
project.
· Implemented error handling in *Talend* to validate data integrity and
data completeness for the data from the flat files.
· Created Unix scripts and ran them using *tSSH and tSystem* to read
data from flat files and archive the flat files on the specified
server.
· Tuned sources, targets and jobs to improve the performance.
· Monitored and troubleshot batches and jobs for weekly and monthly
extracts from various data sources across all platforms to the target
database.
· Provided the Production Support by running the jobs and fixing the bugs.
*Environment:* Talend Integration Suite 4.x/5.x, Oracle 10g/11g, Flat
Files, PL/SQL, UNIX, PERL, Mod Perl, Windows XP, and SVN


*Sr. ETL Developer         10/11 - 11/12*
*Responsibilities*:
· Involved in gathering business requirements, interacting with business
users, and translating the requirements into ETL high-level and low-level
design.
· Documented both high-level and low-level design documents; involved in
the *ETL design* and development of the *Data Model*.
· Worked on importing and cleansing data from various sources like *Teradata,
Oracle, flat files,* and SQL Server 2005 with high-volume data.
· Developed complex ETL mappings and worked on the transformations like *Source
qualifier, Joiner, Expression, Sorter, Aggregator, Sequence generator,
Normalizer, Connected Lookup, Unconnected Lookup, Update Strategy and
Stored Procedure transformation*.
· Implemented Slowly Changing Dimension Type 1 and Type 2 for inserting and
updating Target tables for maintaining the history.
· Worked on *BTEQ* scripts, *MLOAD, FASTLOAD, TPUMP, TPT* utilities of
Teradata.
· Worked on *performance tuning* of Teradata database & *Informatica*
*mappings*.
· Involved in various phases of the software development life cycle right
from *Requirements gathering, Analysis, Design, Development, and Testing to
Production*.
· Worked on loading the data from different sources like *Oracle, DB2,
EBCDIC* files (Created *Copy book* layouts for the source files), *ASCII
delimited* flat files to Oracle targets and flat files.
· Implemented change data capture (CDC) using *Informatica PowerExchange* to
load data from the Clarity DB to the Teradata warehouse.
· Extracted data from various source systems like *Oracle and flat files* as
per the requirements and loaded it into *Teradata* using *FASTLOAD, TPUMP,
and MLOAD*.
· Wrote complex SQL queries on *Teradata* and used them in lookup SQL
overrides and Source Qualifier overrides.
· Involved in migration of data from Oracle to *Teradata*.
· Experience in working with *Mapping variables, Mapping parameters,
Workflow variables, implementing SQL scripts and Shell scripts in
Post-Session, Pre-Session* commands in sessions.
· Experience in writing SQL*Loader scripts for preparing the test data in
Development, TEST environment and while fixing production bugs.
· Experience in using the *debugger* to identify the processing
bottlenecks, and performance tuning of Informatica to increase the
performance of the workflows.
· Experience in creating ETL deployment groups and ETL packages for
promotion to higher environments.
· Performed and documented the *unit testing* for validation of the
mappings against the mapping specifications documents.
· Performed *production support* activities in the Data Warehouse
(Informatica), including monitoring and resolving production issues,
providing bug fixes, and supporting end users.
· Experience in writing and implementing *FTP*, *archive* and *purge* Scripts
in UNIX.
*Environment:* Informatica Power Center 9.5/9.1, Oracle 11g, DB2, TOAD 9.0,
UNIX, Teradata R13, PERL, and Mod Perl


*Talend Developer         07/10 - 09/11*
*Responsibilities:*
· Coordinated with Business Users to understand business needs and
implement the same into a functional Data warehouse design.
· Converted *functional specifications* into *technical specifications*.
· Developed complex jobs to load data from multiple source systems like
Oracle 10g, COBOL files, flat files, XML files to data mart in Oracle
database.
· Responsible for developing, support and maintenance for the ETL (Extract,
Transform and Load) processes using *Talend Integration Suite*.
· Involved in Unix Shell Scripts for automation of ETL process.
· Created *Talend Development Standards*. This document describes the
general guidelines for Talend developers, the naming conventions to be
used in the transformations, and the development and production
environment structures.
· Used Talend components such as *tMap, tFileExist, tFileCompare,
tELTAggregate, tOracleInput, tOracleOutput*, etc.
· Participated in weekly end-user meetings to discuss data quality and
performance issues, ways to improve data accuracy, new requirements, etc.
· Involved in migrating objects from DEV to QA and testing them and then
promoting to Production.
*Environment:* Talend Integration Suite, Oracle 10g, PL/SQL, TOAD, Flat
Files, Windows XP, UNIX.



 *Warm Regards,*

*Randhir Kumar*

*IDC Technologies*

*1851 McCarthy Blvd. Suite 116, Milpitas, CA 95035*

*Email:* randhir.ku...@idctechnologies.com

*Phone:* 408-459-1535
*Web:* www.idctechnologies.com

-- 
You received this message because you are subscribed to the Google Groups "SAP 
or Oracle Financials" group.