Data Engineer with GCP
Remote
Long Term
Contract
Must have 12+ years of experience.
• Design, develop, and integrate production support technologies and systems to improve data pipeline reliability.
• Proactively identify vulnerabilities and implement systemic process-oriented solutions to enhance pipeline resilience.
• Develop and apply advanced analytics for proactive pipeline monitoring, anomaly detection, and performance optimization.
• Conduct in-depth statistical analysis of pipeline performance, incident data, and root causes to drive continuous improvement.
• Prepare and maintain comprehensive reports and dashboards with key performance indicators (KPIs) for measuring system and pipeline resiliency.
• Collaborate with a global team of data analysts, engineers, and leaders to rapidly troubleshoot and resolve data pipeline issues.
• Interface with corporate functions, preparing technical documentation, reports, and presentations on pipeline health and improvements.
• Maintain accurate and up-to-date documentation for systems, processes, and troubleshooting guides.
• Assist in meeting data environment governance requirements, specifically related to pipeline data quality and integrity.
Key Relationships:
• IT
• Operations
• Global / Regional team members
• Suppliers, Integrators, Partners
Knowledge and Technical Competencies:
• Soft skills: Organized, proactive, and curious, with a sense of urgency, strong prioritization, and a strong sense of ownership.
• Technical expertise:
o Proven experience in systems support, data pipelines, database reliability engineering, and IT production support.
o Strong background in designing and implementing monitoring, alerting, and logging solutions.
o Proficiency with cloud platforms, specifically Google Cloud Platform (GCP).
o Experience with industrial data historians (e.g., OSIsoft PI) and API-based data extraction.
Education/Experience:
• Background: Degree in Engineering or Computer Science.
• Technical Skills:
o Proficiency in SQL (queries) and a scripting language (e.g., Python) for data analysis and automation.
o Hands-on experience with GCP data technologies (e.g., BigQuery, Dataflow) and monitoring tools (e.g., Grafana, Cloud Monitoring).
o Familiarity with business intelligence tools (e.g., Looker, Power BI, Tableau) for dashboarding.
• Desirable: Knowledge of process industry KPIs. Familiarity with data governance platforms (e.g., Collibra) and data transformation tools (e.g., Dataform).
Munesh
770-838-3829
CYBER SPHERE LLC