Hi, this is *Abdul Raheem* from *ExaTech Inc*. I have an immediate opening, detailed below. Please review it and let me know your interest.
*Skills* (for each, please indicate years of experience and a self-rating out of 10):
· Total IT:
· Data Engineer:
· GCP:
· Teradata:
· Data warehouse:
· SQL:

*Please share the following details if you’re comfortable with the position:*
Passport Number:
Contact Number:
Current Location with Zip Code:
Visa Status:
LinkedIn ID:
Education Details (graduation stream, university, and completion year):
Currently on a project?:
Total IT Experience:
Total US Experience:
Reason for Change:

*Role: GCP Teradata Engineer*
Location: Dallas, TX or Hartford, CT (Hybrid; locals preferred)
Duration: 12+ months
Interview: Video

*Job Description:*
We are seeking a skilled Data Engineer to support a high-impact enterprise data migration initiative: migrating data warehouse assets and ETL pipelines from Teradata to Google Cloud Platform (GCP). The role involves hands-on development, testing, and optimization of data pipelines and warehouse structures in GCP, ensuring minimal disruption and maximum performance.

*Key Responsibilities:*
· Lead and execute the migration of data and ETL workflows from Teradata to GCP-based services such as BigQuery, Cloud Storage, Dataflow, Dataproc, and Composer (Airflow).
· Analyze and map existing Teradata workloads to appropriate GCP equivalents.
· Rewrite SQL logic, scripts, and procedures in GCP-compliant formats (e.g., standard SQL for BigQuery).
· Collaborate with data architects and business stakeholders to define migration strategies, validate data quality, and ensure compliance.
· Develop automated workflows for data movement and transformation using GCP-native tools and/or custom scripts (Python/Java).
· Optimize data storage, query performance, and costs in the cloud environment.
· Implement monitoring, logging, and alerting for all migration pipelines and production workloads.

*Required Skills:*
· 12+ years of experience in Data Engineering, with at least 2 years in GCP.
· Strong hands-on experience with Teradata data warehousing, BTEQ, and complex SQL.
· Solid knowledge of GCP services: BigQuery, Dataflow, Cloud Storage, Pub/Sub, Composer, and Dataproc.
· Experience building ETL/ELT pipelines with tools such as Informatica, Apache Beam, or custom scripting (Python/Java).
· Proven ability to refactor and translate legacy logic from Teradata to GCP.
· Familiarity with CI/CD, Git, and DevOps practices in cloud data environments.
· Strong analytical, troubleshooting, and communication skills.

*Preferred Qualifications:*
· GCP certification (e.g., Professional Data Engineer).
· Exposure to Apache Kafka, Cloud Functions, or AI/ML pipelines on GCP.
· Experience in the healthcare, retail, or finance domains.
· Knowledge of data governance, security, and compliance in cloud ecosystems.

Thanks and regards,
*Abdul Raheem, Sr. Talent Acquisition Lead*
*Email: [email protected]*
Skype & Hangout: [email protected]
*4555 Lake Forest Drive, Suite 650 | Cincinnati, OH 45242*
An E-Verified Company
USA - Canada - India

--
You received this message because you are subscribed to "rtc-linux". Membership options at http://groups.google.com/group/rtc-linux . Please read http://groups.google.com/group/rtc-linux/web/checklist before submitting a driver.
