Dear friends,

Please send me only suitable candidates' profiles at *[email protected]*



*Job Title: INFORMATICA ARCHITECT / LEAD DEVELOPER WITH SQL (SSRS, SSAS, SSIS)*



*Duration:  6+ Months*



*Location:  San Diego, CA*



*Required Skills:*

10+ years of professional experience in Data warehouse Architecture and Design
using INFORMATICA Power Center 8.x/7.x, Ab Initio GDE (1.13), Power Exchange
Navigator, Dimensional modeling (Star schema and Snowflake schema),
Teradata, Netezza, DB2, Oracle 10g/9i, MS SQL Server, Informix, PL/SQL, XML,
and Project Management.



Architected the Data Integration solutions for Legacy systems and Data
conversion projects using Informatica ‘Data Stencils’.

Recommended efficient Indexing strategy on large volume data warehouse
tables and improved the ETL and BI Reports throughput effectively.

Involved in creating Logical Data model and Physical data model, ETL
Migration strategy, Data Lineage and defining the data granularity and
surrogate keys creation.

Have good hands-on experience in creating technical design documents, ETL
architecture, ETL process design, ETL logic conversion, deployment, and
production data volume estimation.

Have good experience in WebServices, Technical Metadata consolidation,
Defining Project Scope, Milestone based Deliverable Strategies, Database
size estimation and Data Archival Methods.

Have good hands-on experience in designing and developing ETL Processes
using Informatica mappings, Sessions, Event based Tasks, Workflows,
Parameters and Scheduling scripts for ETL processes.
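
The parameter-file mechanism mentioned above can be illustrated with a short
fragment. This is only a sketch of an Informatica PowerCenter parameter file;
the folder, workflow, session, and parameter names are all made up for
illustration (mapping parameters use the `$$` prefix, built-in session
variables the `$` prefix):

```
[DWH_FOLDER.WF:wf_load_sales.ST:s_m_load_sales]
$$SRC_FILE_DIR=/data/inbound/sales
$$LOAD_DATE=2010-01-31
$PMSessionLogFile=s_m_load_sales.log
```

A scheduling script typically swaps such files per environment (DEV/UAT/PROD)
so the same workflow runs unchanged against different sources and targets.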

Have good experience in Change Data Capture (CDC) processes and in
implementing Type 1 and Type 2 Slowly Changing Dimension (SCD) loads.
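
The Type 2 SCD pattern referred to above can be sketched in a few lines. This
is a minimal illustration using SQLite; the `customer_dim` table, its columns,
and the sample values are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE customer_dim (
        sk INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
        customer_id TEXT,                      -- natural key
        city TEXT,
        is_current INTEGER DEFAULT 1
    )
""")
cur.execute("INSERT INTO customer_dim (customer_id, city) VALUES ('C1', 'San Diego')")

def scd2_apply(customer_id, new_city):
    """Close out the current row and insert a new version (Type 2)."""
    cur.execute(
        "UPDATE customer_dim SET is_current = 0 "
        "WHERE customer_id = ? AND is_current = 1 AND city <> ?",
        (customer_id, new_city),
    )
    if cur.rowcount:  # only version the row when something actually changed
        cur.execute(
            "INSERT INTO customer_dim (customer_id, city) VALUES (?, ?)",
            (customer_id, new_city),
        )
    conn.commit()

scd2_apply("C1", "Frisco")  # change detected -> old row closed, new row added
rows = cur.execute(
    "SELECT city, is_current FROM customer_dim ORDER BY sk"
).fetchall()
print(rows)  # [('San Diego', 0), ('Frisco', 1)]
```

A Type 1 load would simply overwrite `city` in place instead of versioning;
the `is_current` flag (or effective-date columns) is what preserves history
in Type 2.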

Designed Data load solutions using External loaders, defining the
‘Distribution Key’ on partitioned Databases.

Designed Data Quality routines using precise SQL (analytic functions,
aggregates, in-line views), PL/SQL stored procedures, packages, and ad-hoc
summary reports using OLAP functions and Oracle analytical functions.
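
One common data-quality routine of the kind described above is flagging
duplicate natural keys with an analytic (window) function. A minimal sketch,
using SQLite for portability (window functions need SQLite 3.25+, bundled
with most Python 3.7+ builds); the staging table and its rows are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_orders (order_id TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO stg_orders VALUES (?, ?)",
    [("O1", 100.0), ("O1", 100.0), ("O2", 250.0)],
)
# Rank rows within each natural key; rn = 1 is kept, rn > 1 is a duplicate.
dupes = cur.execute("""
    SELECT order_id FROM (
        SELECT order_id,
               ROW_NUMBER() OVER (PARTITION BY order_id ORDER BY rowid) AS rn
        FROM stg_orders
    ) WHERE rn > 1
""").fetchall()
print(dupes)  # [('O1',)]
```

In Oracle the same shape works with `ROW_NUMBER() OVER (...)` in an in-line
view, which is why analytic functions pair naturally with DQ assessments.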

Have good experience in UNIX Shell scripting to provide effective solutions,
creating Parameter files, Process Log files, Automated Archival of log
files, Scheduling ETL jobs, SQL embedded shell scripts for Data warehouse
environments.
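
The automated log-archival idea above can be sketched briefly. Shell (`find
... -mtime +7 | xargs gzip`) is the classic tool; the same logic is shown
here in Python for a self-contained example. The directory layout and 7-day
retention are assumptions:

```python
import gzip
import shutil
import time
from pathlib import Path

def archive_old_logs(log_dir, days=7):
    """Gzip ETL log files older than `days` into an archive/ subdirectory.

    Sketch of an automated-archival job; the *.log glob and the retention
    window are illustrative, not a fixed convention.
    """
    cutoff = time.time() - days * 86400
    archive = Path(log_dir) / "archive"
    archive.mkdir(exist_ok=True)
    archived = []
    for log in Path(log_dir).glob("*.log"):
        if log.stat().st_mtime < cutoff:
            target = archive / (log.name + ".gz")
            with open(log, "rb") as src, gzip.open(target, "wb") as dst:
                shutil.copyfileobj(src, dst)
            log.unlink()  # remove the original once compressed
            archived.append(target.name)
    return sorted(archived)
```

Scheduled nightly (cron or the ETL scheduler), this keeps the log directory
bounded while preserving history for audits.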

Have good experience in IBM Websphere MQ Series and good knowledge in
Message queue configuration, Reply to Server Queues, Message Types and
exchanging Messages.

Performed Advanced level Data warehouse testing which verifies and validates
the ETL Mappings, Stored Procedures, and OLAP Reports using SAMPLING
techniques.

Designed and developed ‘eDATAGEN’ tool in DB2 SQL for voluminous
production-like data generation for Load Testing phase.

Created Technical Design Documents, High-level ETL Design Document, Data
Model Diagram, Data warehouse Test Strategy documents.

Have good knowledge in Data warehouse concepts, Teradata BTEQ and TPump
utilities, SQL*Loader, LOAD-DB2, RDBMS objects, Dimensional Modeling,
estimating data granularity, and DWH methodologies by Ralph Kimball and
W. H. Inmon; domain experience in HealthCare, Retail, Finance, Banking,
Insurance, and Telecom.

Good team player with strong inter-personal, analytical, and problem-solving
skills; experienced in team building and coaching, client management, and
communication; proven leadership qualities, having mentored a team of
professionals towards project goals.


*Responsibilities:*



- Analyze and identify the Business & Technical requirements for the MIDE
Platform.

- Create the Data mapping specification from Wachovia systems and Wells
Fargo Home Mortgage systems for Data Analysts, Data Profilers, and Quality
Team members.

- Create ETL Mapping specification documents to load source data into the
Staging and Integrated layers for the ETL Team.

- Architect and design the ETL process to extract data from disparate
sources and load it into DWH tables in the Teradata complex.

- Prepare the Gantt chart for the Project Plan; identify tasks, estimate
timelines, milestones, sequential/parallel activities, Subject Matter
Experts, and resource identification & allocation.

- Identify the Data granularity and design the Data model for the staging,
integrated, and semantic layers of MIDE.

- Involved in Data modeling design using the TERADATA FS-LDM and provided
Data granularity to accommodate time-phased CDC requirements.

- Designed Data Quality assessment for source systems in Mainframe
(Easytrieve), Oracle, DB2, and SQL Server.

- Designed the logical schema and physical schema creation per subject area
in TERADATA.

- Designed partition methods, Primary Index key identification, indexing
strategy for Fact tables, disk usage statistics, and automated email
notification to the Support Team.

- Designed Mapping Templates using ‘Stencils’ and owned critical data
movement mappings; designed Mapplets and advanced Mappings using Source
Qualifier, Expression, Filter, Lookup, Update Strategy, Sorter, Joiner,
Normalizer, Router, and Data Masking transformations.

- Responsible for ETL and DB design walkthroughs, implementation guidelines,
code release, and delegating administration tasks.

- Design/develop/monitor/schedule Workflows, Sessions, event-based tasks,
parameter files, and pre-session/post-session tasks for the
Development/UAT/SIT environments.

- Determine the ETL and DB access privileges for Team members; Power
Exchange definition migrations from the SIT to PROD environment.

- Create the ETL migration strategy document, ETL Metadata design, and data
cleansing artifacts using profiling results.

- Developed advanced BTEQ scripts to create the Staging, UAT, and SIT
schemas, and MLoad and FLoad scripts to load the Integrated Layer.

- Created Autosys JIL and Unix shell scripts to FTP source files, validate
source files, automate archival of log files, and create ETL event
start/stop files.

- Performance tuning of ETL processes, Teradata tables, and query
optimization.

- Code review, Change Request validation and Change Request ticket creation,
and performance optimization of ad-hoc query packages for OLAP reports.

- Recruit Team members; project orientation, tool upgrades, evaluation,
licensing recommendation, offshore Team management, task allocation,
deliverables sign-off, and timesheet approval.

- Serve as the liaison between IT and the business sponsors, managing
relationships with the multi-vendor technical teams.

- Project Management; provided a team of professionals with technical
leadership on Data Integration and industry best practices, and project
auditing to achieve high performance.
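
One of the responsibilities above, validating inbound source files before the
ETL load, can be sketched as follows. The file layout (pipe-delimited data
rows followed by a `TRAILER|<count>` control record) is an assumption for
illustration, not a stated convention of this project:

```python
def validate_source_file(path):
    """Check a delimited source file against its trailer control count.

    Assumed layout (illustrative): data rows, then a final line of the
    form 'TRAILER|<row_count>'. Returns True when the counts match.
    """
    with open(path) as f:
        lines = [ln.rstrip("\n") for ln in f if ln.strip()]
    if not lines or not lines[-1].startswith("TRAILER|"):
        return False  # missing or malformed control record
    expected = int(lines[-1].split("|")[1])
    return len(lines) - 1 == expected  # data rows must equal trailer count
```

Run against each FTPed file, this kind of check lets the scheduler create or
withhold the ETL event start file depending on the result.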

-- 
Thanks & Regards,


Kalyan
KSource Inc.
9555, Lebanon Road, Unit # 103
Frisco, Texas - 75035
Phone: 248-458-1322 X 209 | Fax: 248 498 6173
mail: [email protected]
www.ksourceinc.com
IM:  [email protected]
GTalk: [email protected]

-- 
You received this message because you are subscribed to the Google Groups 
"US_IT.Groups" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to 
[email protected].
For more options, visit this group at 
http://groups.google.com/group/us_itgroups?hl=en.
