Location: Mountain View / Oakland, CA (mandatory 3 days/week onsite; no flexibility)
Job Type: Contract
Skills Required: GCP, MySQL, Python, Kafka, Terraform
Local California candidates only.
Banking and financial sector experience is a must.
Job Summary
- Data Pipeline Ownership: Design, develop, and maintain robust ETL (Extract, Transform, Load) data pipelines to process raw GCP Billing Export data and other large datasets.
- Cost Attribution Logic: Implement and optimize complex backend logic and data models to accurately attribute shared infrastructure costs (e.g., MySQL, Kafka, BigQuery, and GCS usage) to the appropriate business verticals.
- Backend Engineering: Own the development lifecycle for core backend services, ensuring high performance, scalability, and stability, with a strict focus on data accuracy.
- Organizational Mapping: Collaborate with finance and platform teams to integrate organizational structure and mapping into the cost attribution system.
- System Optimization: Perform deep-dive performance tuning on data processing jobs and database interactions to ensure efficient handling of large datasets.
- Infrastructure & Automation: Use infrastructure-as-code principles (e.g., Terraform) for managing underlying resources and develop automation scripts (e.g., Python, Bash) for operational tasks.
- Reliability & Monitoring: Implement monitoring and alerting for all pipelines to ensure data quality and uninterrupted service delivery.
- Documentation: Maintain up-to-date documentation and runbooks detailing the data models, ETL logic, and cost attribution methodology.
Required Qualifications
- Software Engineering Expertise: 6+ years of experience in backend software development, focusing on large-scale data processing and high-volume systems.
- Language Proficiency: Expert-level proficiency in at least one of the following: Python, Go, or TypeScript.
- Cloud Cost Management: Hands-on experience with cloud financial data, specifically processing and utilizing GCP Billing Export data for cost analysis and attribution.
- Data Platform Experience: Strong working experience with key data technologies, including MySQL, Kafka, BigQuery, and GCS.
- Backend & Data Processing: Deep understanding of backend engineering principles, data pipelines, ETL processes, and processing large, complex datasets.
- Cloud Infrastructure: Hands-on experience with Google Cloud Platform (GCP) services.
Thanks & Regards
Rahul Pandey
rahul.pandey@quantumworldit.com
Senior Technical Recruiter