Hi,
Please share your daily requirements.

*PROFESSIONAL SUMMARY:*
· 7+ years of experience in Teradata, Business Intelligence, Data Warehousing, and Software Development Life Cycle (SDLC) concepts, with emphasis on ETL and life-cycle development.
· Extensive experience with Teradata architecture and data model concepts; worked on Teradata V2R6/V12/V13/V14 versions.
· 6 years of experience in implementation, development, and deployment of the ETL tool Informatica 9.x.
· 3 years of experience in development of ETL applications, involving extraction, transformation, and loading using Ab Initio.
· Worked with data modelers, ETL staff, BI developers, and business system analysts in the business/functional requirements review and solution design process.
· Expert in the design, development, and implementation of processes for data warehousing and data integration projects using Informatica PowerCenter and PowerExchange.
· Excellent implementation of data warehouses/data marts using data warehouse concepts and principles such as Change Data Capture, Slowly Changing Dimensions, normalization/denormalization (star and snowflake schema modeling), and working with fact and dimension tables.
· Extensive experience using the Teradata utility pack, including BTEQ (query/report writing) and Teradata SQL Assistant (querying).
· Hands-on experience using Teradata Metadata Services.
· Well experienced in writing scripts using Teradata parallel load utilities such as Teradata TPump (continuous load), Teradata FastLoad (data loading), Teradata MultiLoad (multiple-table loading), Teradata FastExport (data extraction), the Teradata Replication solution (real-time data synchronization), and Teradata Parallel Transporter.
· Very strong in Teradata SQL query-writing techniques.
· Experience using OLAP tools such as Business Objects 6.0/XI R2 (Reports, Designer, Web Intelligence, InfoView, and Supervisor).
· Used SAS SQL for retrieving data from single/multiple tables, creating and updating tables and views, and programming with the SQL procedure.
· Worked with SAS ETL Studio to specify metadata for data sources and data targets, and to create jobs that specify how data is extracted, transformed, and loaded from sources to targets.
· Very good at writing UNIX shell scripts.
· Expert in business modeling techniques using process flow modeling and data flow modeling.
· Strong MVS knowledge and IBM mainframe expertise.

*Technical Skills:*
*Databases:* Teradata V12/V13/V15, MySQL, MS Access, SQL Server, Oracle
*Data Modeling/Tools:* Erwin
*Languages:* SQL, UNIX Shell Script
*Operating Systems:* UNIX/Linux, Windows
*Methodologies:* Data Warehousing Design, Data Modeling, Logical and Physical Database Design

Thanks & Regards,
Vardhan
Gemini Consulting & Services
Phone: 314-720-1800 ext 301 | Cell: 314-272-0792 | Fax: 314-400-7696
Email: gg...@gemini-us.com | www.gemini-us.com
Gtalk: Govardhan001
3636 S Geyer Road, Suite 270, Sunset Hills, MO 63127

"Infinite faith and strength are the only conditions of success."

- CONFIDENTIALITY NOTICE -
This e-mail and the documents accompanying this transmission are confidential and may be a communication privileged or protected by law. It is meant only for the intended recipient. If you received this e-mail in error, any review, use, dissemination, distribution, or copying of the e-mail is strictly prohibited. Please notify the sender immediately of the error by return e-mail and delete the message from your system. Thank you in advance for your cooperation.