
GCP Data Engineer // Dallas, TX or Hartford, CT

Job title            : GCP Data Engineer

Location          : Dallas, TX or Hartford, CT (remote possible for strong candidates)

Experience     : 10+ years

 

PASSPORT NUMBER MANDATORY

 

The goal is to migrate data warehouse assets and ETL pipelines from Teradata to Google Cloud Platform (GCP).

The role involves hands-on development, testing, and optimization of data pipelines and warehouse structures in GCP, ensuring minimal disruption and maximum performance.

 

Key Responsibilities:

 

•  Lead and execute migration of data and ETL workflows from Teradata to GCP-based services such as BigQuery, Cloud Storage, Dataflow, Dataproc, and Composer (Airflow).

•  Analyze and map existing Teradata workloads to appropriate GCP equivalents.

•  Rewrite SQL logic, scripts, and procedures in GCP-compliant formats (e.g., standard SQL for BigQuery).

•  Collaborate with data architects and business stakeholders to define migration strategies, validate data quality, and ensure compliance.

•  Develop automated workflows for data movement and transformation using GCP-native tools and/or custom scripts (Python/Java).

•  Optimize data storage, query performance, and costs in the cloud environment.

•  Implement monitoring, logging, and alerting for all migration pipelines and production workloads.
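To give a flavor of the SQL rewriting the role involves, here is a minimal stdlib-only sketch. The function name and the idiom table are illustrative, not a real translator; production migrations typically rely on a full SQL parser or Google's batch SQL translation tooling rather than regex rewrites.

```python
import re

# Illustrative mapping of a few common Teradata idioms to BigQuery
# standard SQL. A real migration needs a proper parser; this only
# sketches the kind of dialect translation involved.
TERADATA_TO_BIGQUERY = [
    (r"\bSEL\b", "SELECT"),                  # Teradata keyword shorthand
    (r"\bDEL\b", "DELETE"),                  # Teradata keyword shorthand
    (r"ADD_MONTHS\(([^,]+),\s*(-?\d+)\)",    # ADD_MONTHS -> DATE_ADD
     r"DATE_ADD(\1, INTERVAL \2 MONTH)"),
]

def translate_sql(teradata_sql: str) -> str:
    """First-pass rewrite of Teradata SQL into BigQuery standard SQL."""
    out = teradata_sql
    for pattern, replacement in TERADATA_TO_BIGQUERY:
        out = re.sub(pattern, replacement, out, flags=re.IGNORECASE)
    return out
```

For example, `translate_sql("SEL * FROM orders")` yields `"SELECT * FROM orders"`, and Teradata's `ADD_MONTHS` call is rewritten to BigQuery's `DATE_ADD` with a month interval.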

 

Required Skills:


•  6+ years of experience in Data Engineering, with at least 2 years in GCP.

•  Strong hands-on experience in Teradata data warehousing, BTEQ, and complex SQL.

•  Solid knowledge of GCP services: BigQuery, Dataflow, Cloud Storage, Pub/Sub, Composer, and Dataproc.

•  Experience with ETL/ELT pipelines using tools like Informatica, Apache Beam, or custom scripting (Python/Java).

•  Proven ability to refactor and translate legacy logic from Teradata to GCP.

•  Familiarity with CI/CD, Git, and DevOps practices in cloud data environments.

•  Strong analytical, troubleshooting, and communication skills.
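As one example of the "custom scripting (Python/Java)" mentioned above, here is a small stdlib-only sketch (the function name is illustrative) that converts a CSV extract into newline-delimited JSON, the JSON file format BigQuery load jobs accept:

```python
import csv
import io
import json

def csv_to_ndjson(csv_text: str) -> str:
    """Convert a CSV string (with header row) to newline-delimited
    JSON, suitable as input for a BigQuery JSON load job."""
    reader = csv.DictReader(io.StringIO(csv_text))
    # Each CSV row becomes one JSON object on its own line.
    return "\n".join(json.dumps(row) for row in reader)
```

In practice such a step would run inside a Composer (Airflow) task or a Dataflow pipeline, writing the output to Cloud Storage before loading it into BigQuery.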

 

Preferred Qualifications:

 

•  GCP certification (e.g., Professional Data Engineer).

•  Exposure to Apache Kafka, Cloud Functions, or AI/ML pipelines on GCP.

•  Experience working in the healthcare, retail, or finance domains.

•  Knowledge of data governance, security, and compliance in cloud ecosystems.

 

Thanks & Regards,

Priya, Sr. Resource Management Executive

priya@mygallega.com


A:  4080 McGinnis Ferry Road, Suite # 1302
      Alpharetta, GA-30005.
