Title: GCP Data Engineer
Location: Houston, Texas (must be willing to work onsite 4 days a week in downtown Houston)
Type of Hire: Contract
Work Authorization: Any
GCP Data Engineer:
The client’s Data Analytics team operates under a product platform operating model, functioning as a product team composed of multiple sub-teams, four of which are data engineering teams. These are organized by functional area within the client’s organization, including Gas and Electric Operations, Customer Finance, Marketing, Supply Chain, and CRM.
• We work in an agile environment following two-week sprint cycles, where each sprint focuses on specific tasks and user stories that are delivered either to the business or to other internal product teams.
• While most of the work centers on data engineering, we also handle pipeline and application development, including establishing frameworks that support AI/ML models.
• Our Business Analytics teams are the primary consumers of the data pipelines we build.
• They leverage these pipelines to create Power BI and Tableau reports and custom dashboards that serve various business functions.
• The pipelines are developed by integrating multiple systems into BigQuery, using API streaming technology and various GCP data processing services.
• Our data warehouse runs on BigQuery, and most of our services are designed to support it — meaning the majority of pipelines are built around BigQuery.
• In addition, we maintain an Azure framework for certain application-specific solutions.
• While Azure does not serve as an enterprise data warehouse, there is extensive integration between Azure and GCP, requiring data movement and synchronization across both platforms.
• We are also transitioning our infrastructure management to Terraform with Git integration, moving away from our previous approach of deploying objects through Azure DevOps.
Thanks & Regards