Send resume to mah...@acmht.com
Start Date: Immediate.

• 10+ years of professional experience in data warehouse architecture and design using Informatica PowerCenter 8.x/7.x, Ab Initio GDE (1.13), PowerExchange Navigator, dimensional modeling (Star and Snowflake schemas), Teradata, Netezza, DB2, Oracle 10g/9i, MS SQL Server, Informix, PL/SQL, XML, and project management.
• Architected Data Integration solutions for legacy-system and data-conversion projects using Informatica 'Data Stencils'.
• Recommended an efficient indexing strategy for large-volume data warehouse tables, effectively improving ETL and BI report throughput.
• Involved in creating logical and physical data models, ETL migration strategy, and data lineage; defined data granularity and surrogate key creation.
• Good hands-on experience creating technical design documents, ETL architecture, ETL process design, ETL logic conversion, deployment, and production data volume estimation.
• Good experience in Web Services, technical metadata consolidation, defining project scope, milestone-based deliverable strategies, database size estimation, and data archival methods.
• Good hands-on experience designing and developing ETL processes using Informatica mappings, sessions, event-based tasks, workflows, parameters, and scheduling scripts for ETL processes.
• Good experience in Change Data Capture processes and implementing Type 1 and Type 2 SCDs.
• Designed data load solutions using external loaders and defined the 'Distribution Key' on partitioned databases.
• Designed data quality routines using precise SQL (analytic functions, aggregates, in-line views), PL/SQL stored procedures and packages, and ad-hoc summary reports using OLAP and Oracle analytic functions.
• Good experience in UNIX shell scripting to provide effective solutions: parameter files, process log files, automated archival of log files, scheduling of ETL jobs, and SQL-embedded shell scripts for data warehouse environments.
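As a flavor of the parameter-file and log-archival shell work claimed above, here is a minimal sketch; all paths, file names, and parameter names are hypothetical, and the database calls are omitted:

```shell
#!/bin/sh
# Hypothetical ETL wrapper: read a parameter file, write a process log,
# and archive old logs. Illustrative only -- not the candidate's script.

ETL_HOME="${ETL_HOME:-/tmp/etl_demo}"      # assumed base directory
PARAM_FILE="$ETL_HOME/load_sales.param"    # hypothetical parameter file
LOG_DIR="$ETL_HOME/logs"
ARCHIVE_DIR="$LOG_DIR/archive"

mkdir -p "$LOG_DIR" "$ARCHIVE_DIR"

# A parameter file keeps environment-specific values out of the ETL logic.
cat > "$PARAM_FILE" <<'EOF'
SRC_DB=SALES_STG
TGT_DB=SALES_DW
LOAD_DATE=2009-06-30
EOF

# Source the parameters into the shell environment.
. "$PARAM_FILE"

# One timestamped process log per run.
LOG_FILE="$LOG_DIR/load_sales_$(date +%Y%m%d%H%M%S).log"
echo "Starting load: $SRC_DB -> $TGT_DB for $LOAD_DATE" > "$LOG_FILE"

# Automated archival: sweep process logs older than 7 days aside.
find "$LOG_DIR" -maxdepth 1 -name '*.log' -mtime +7 \
     -exec mv {} "$ARCHIVE_DIR/" \;

echo "Load wrapper finished; log at $LOG_FILE"
```

In practice the body of such a wrapper would invoke the scheduler-launched ETL job or an SQL-embedded `sqlplus`/`bteq` here-document between the logging and archival steps.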
• Good experience in IBM WebSphere MQ Series and good knowledge of message queue configuration, reply-to server queues, message types, and exchanging messages.
• Performed advanced data warehouse testing that verifies and validates ETL mappings, stored procedures, and OLAP reports using sampling techniques.
• Designed and developed the 'eDATAGEN' tool in DB2 SQL to generate voluminous production-like data for the load-testing phase.
• Created technical design documents, high-level ETL design documents, data model diagrams, and data warehouse test strategy documents.
• Good knowledge of data warehouse concepts, Teradata BTEQ and TPump utilities, SQL*Loader, the DB2 LOAD utility, RDBMS objects, dimensional modeling, estimating data granularity, and the DWH methodologies of Ralph Kimball and W. H. Inmon; domain experience in healthcare, retail, finance, banking, insurance, and telecom.
• Good team player with strong interpersonal, analytical, and problem-solving skills; experienced in team building and coaching, client management, and communication; proven leadership, having mentored a team of professionals toward project goals.

Responsibilities:
Analyze and identify the business and technical requirements for the MIDE platform.
Create data mapping specifications from Wachovia systems and Wells Fargo Home Mortgage systems for data analysts, data profilers, and quality team members.
Create ETL mapping specification documents for the ETL team to load source data into the staging and integrated layers.
Architect and design the ETL process to extract data from disparate sources and load it into DWH tables in the Teradata complex.
Prepare the Gantt chart for the project plan: identify tasks, estimate timelines, milestones, sequential/parallel activities, and subject matter experts; handle resource identification and allocation.
Identify the data granularity and design the data model for the staging, integrated, and semantic layers of MIDE.
Involved in data modeling design using the Teradata FS-LDM and provided data granularity to accommodate time-phased CDC requirements.
Designed data quality assessments for source systems on the mainframe (Easytrieve), Oracle, DB2, and SQL Server.
Designed the logical schema and physical schema creation per subject area in Teradata.
Designed partition methods, primary index key identification, indexing strategy for fact tables, disk usage statistics, and automated email notification to the support team.
Designed mapping templates using 'Stencils' and owned critical data movement mappings; designed mapplets and advanced mappings using Source Qualifier, Expression, Filter, Lookup, Update Strategy, Sorter, Joiner, Normalizer, Router, and Data Masking transformations.
Responsible for ETL and DB design walkthroughs, implementation guidelines, code releases, and delegating administration tasks.
Design, develop, monitor, and schedule workflows, sessions, event-based tasks, parameter files, and pre-/post-session tasks for the development, UAT, and SIT environments.
Determine ETL and DB access privileges for team members; migrate PowerExchange definitions from the SIT to the PROD environment.
Create the ETL migration strategy document, ETL metadata design, and data cleansing artifacts using profiling results.
Developed advanced BTEQ scripts to create the staging, UAT, and SIT schemas, and MultiLoad and FastLoad scripts to load the integrated layer.
Created Autosys JIL and UNIX shell scripts to FTP source files, validate source files, automate archival of log files, and create ETL event start/stop files.
Performance tuning of ETL processes and Teradata tables; query optimization.
Code review, change request validation, change request ticket creation, and performance optimization of ad-hoc query packages for OLAP reports.
Recruit team members; handle project orientation, tool upgrades, evaluation, and licensing recommendations; manage the offshore team, task allocation, deliverables sign-off, and timesheet approval.
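The file-validation and event-file duties described above (validate inbound feeds, drop start/stop trigger files for a scheduler such as Autosys to watch) can be sketched roughly as follows; every path, file name, and feed layout here is hypothetical:

```shell
#!/bin/sh
# Hypothetical sketch of a source-file validation wrapper with
# scheduler event files. Illustrative only -- names are invented.

INBOX="${INBOX:-/tmp/mide_demo/inbox}"     # assumed landing directory
EVENTS="${EVENTS:-/tmp/mide_demo/events}"  # assumed event-file directory
SRC_FILE="$INBOX/loans_20090630.dat"       # hypothetical source feed

mkdir -p "$INBOX" "$EVENTS"
printf 'ACCT001|100.00\nACCT002|250.50\n' > "$SRC_FILE"   # stand-in feed

touch "$EVENTS/etl_start.trg"              # signal: run has begun

# Basic validation: the feed must exist, be non-empty, and every
# record must carry the expected pipe delimiter.
if [ ! -s "$SRC_FILE" ]; then
    echo "FAIL: missing or empty source file $SRC_FILE" >&2
    exit 1
fi
bad=$(grep -cv '|' "$SRC_FILE")
if [ "$bad" -ne 0 ]; then
    echo "FAIL: $bad malformed record(s) in $SRC_FILE" >&2
    exit 1
fi

rows=$(wc -l < "$SRC_FILE")
echo "OK: $SRC_FILE validated ($rows records)"
touch "$EVENTS/etl_stop.trg"               # signal: run finished cleanly
```

A real deployment would receive the feed over FTP rather than generating it, and the scheduler's file-watcher job would key off the `.trg` files to sequence downstream loads.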
Serve as the liaison between IT and the business sponsors, managing relationships with the multi-vendor technical teams.
Provide project management and technical leadership to a team of professionals on data integration and industry best practices, auditing projects to achieve high performance.
-- You received this message because you are subscribed to the Google Groups "American Vendor--IT Consulting" group.