- Location: Remote
- Client: Cognizant/Centerpoint Energy
Job Summary
We are looking for an experienced GCP Data Engineer to design, build, and maintain data pipelines and architectures on Google Cloud Platform. The ideal candidate will have strong expertise in GCP services, hands-on experience with both batch and real-time data processing, and a proven ability to translate business needs into scalable and reliable technical solutions. A GCP Professional Data Engineer Certification is required for this role.
Key Responsibilities
- Design, develop, and maintain data acquisition pipelines for large-scale structured and unstructured data.
- Build and manage complex datasets and scalable big data pipeline architectures in GCP.
- Work with Python and GCP services including Dataflow, Datastream, Cloud Functions, Pub/Sub, BigQuery, and Cloud Storage.
- Use SAP SLT to replicate SAP tables into GCP.
- Apply DevOps and CI/CD practices (GitHub, Terraform) for automation.
- Optimize datasets with partitioning and clustering for performance, and secure them with IAM roles and policy tags.
- Implement monitoring solutions using logs and alerts for pipeline performance.
- Continuously recommend improvements in data quality, governance, and efficiency.
Required Skills & Experience
- 4+ years of professional experience as a Data Engineer.
- GCP Professional Data Engineer Certification is required.
- Strong expertise in the Google Cloud Platform (GCP) ecosystem.
- Proficiency in Python for data engineering tasks.
- Experience with batch and real-time data processing.
- Hands-on experience with SAP SLT replication.
- Strong understanding of DevOps and CI/CD practices (GitHub, Terraform).
Thanks,
_______________________________________
Aditya Jain | New York Technology Partners
120 Wood Avenue S | Suite 504 | Iselin NJ 08830