Hi,
Hope you’re doing well! I’m reaching out about a GCP Data Architect opportunity with RelantoAI, based in Fremont, CA (Hybrid). We’re looking for someone with 10+ years of experience designing, implementing, and managing data architectures on Google Cloud Platform (GCP).
Role: GCP Data Architect
Location: Bay Area, CA (Hybrid)
Experience: 13+ Years
Job Type: Contract / Full-time / W2
Duration: Long Term
About the Role:
We are seeking a highly skilled Data Architect to join our Data & AI team in the U.S. The ideal candidate will have deep expertise in designing, implementing, and managing data architectures on Google Cloud Platform (GCP). This role involves leading the end-to-end design of data solutions, defining data governance frameworks, and enabling the organization to leverage data for advanced analytics, AI, and business intelligence initiatives.
Key Responsibilities:
- Architect Scalable Data Solutions: Design and implement high-performance, secure, and scalable data architectures on GCP to meet business needs.
- Data Strategy & Governance: Define and enforce data architecture frameworks, standards, and best practices, including data modeling, metadata management, lineage, and security.
- Solution Delivery: Lead the design, build, and deployment of cloud-based data platforms using services such as BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
- Data Integration & Quality: Oversee data ingestion, transformation, and curation pipelines, ensuring data accuracy, consistency, and performance.
- Collaboration & Leadership: Partner with business stakeholders, product teams, and engineers to translate requirements into data-driven solutions.
- Technical Mentorship: Provide guidance and mentorship to data engineers and analysts, promoting architectural excellence and best practices.
- Performance Optimization: Continuously improve the reliability, scalability, and performance of data platforms and processes.
What You Bring:
- 8+ years of experience in data architecture, engineering, or analytics, with at least 2 years in a GCP environment.
- Strong proficiency with GCP data services (BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Composer, Cloud SQL, etc.).
- Hands-on experience with ETL/ELT frameworks, data warehousing, and real-time data streaming.
- Strong understanding of data modeling (3NF, Dimensional, Data Vault) and data lake / data mesh architectures.
- Languages & Tools: Proficiency in SQL, Python, and tools like Terraform or Deployment Manager for Infrastructure as Code.
- Design & Governance: Experience establishing data standards, governance frameworks, and security best practices.
- Collaboration: Excellent communication skills with the ability to work across technical and business teams.
Why Join Us:
- Work on cutting-edge cloud data projects driving innovation and insights.
- Collaborate with a talented, cross-functional Data & AI team.
- Opportunity to shape the organization’s data strategy and influence enterprise-wide architecture decisions.
Looking forward to your response. Please note: qualified local candidates only.
Thanks & Regards,