Job Description – Data Architect (Azure & GCP)
Location – Issaquah, WA (Onsite)
Employment Type – Contract
Role Overview
As a Data Architect specializing in Microsoft Azure and Google Cloud Platform (GCP), you will design, implement, and optimize enterprise-grade data architectures that enable advanced analytics, AI/ML, and business intelligence across multi-cloud environments. The role requires deep expertise in data engineering, data modeling, cloud-native architecture, and data governance to deliver scalable, secure, and high-performance data ecosystems.
Key Responsibilities
- Architect and Design: Define end-to-end data architecture strategies, including ingestion, storage, processing, and consumption layers leveraging Azure Synapse Analytics, Azure Data Factory, Databricks, BigQuery, Dataflow, and Pub/Sub.
- Cloud Data Integration: Design and build hybrid and multi-cloud data solutions integrating structured, semi-structured, and unstructured data sources across Azure and GCP.
- Data Modeling: Develop conceptual, logical, and physical data models supporting data warehouses, data lakes, and lakehouse architectures.
- ETL/ELT Pipelines: Lead the design and implementation of data pipelines using Azure Data Factory (ADF), Synapse Pipelines, GCP Dataflow, and Cloud Composer, ensuring scalability and maintainability.
- Analytics & BI Enablement: Support data consumption through Power BI, Looker, or Tableau, enabling advanced analytics and machine learning initiatives.
- Data Governance & Security: Implement data governance frameworks, metadata management, and data lineage, and ensure compliance with GDPR, HIPAA, or SOX using tools such as Microsoft Purview and Google Data Catalog.
- Performance Optimization: Optimize data pipelines, queries, and storage performance across Azure Data Lake Storage (ADLS), BigQuery, and Cloud Storage.
- Collaboration: Partner with data engineers, analysts, and business stakeholders to translate business requirements into scalable data architecture solutions.
- Infrastructure as Code (IaC): Automate data infrastructure deployment using Terraform, ARM templates, and Google Cloud Deployment Manager.
- Innovation & Best Practices: Drive adoption of data mesh, data fabric, and lakehouse paradigms to modernize enterprise data landscapes.
Technical Skills
- Cloud Platforms: Azure (Synapse, Data Factory, Databricks, ADLS, Purview), GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage, Dataproc, Composer).
- Data Engineering: SQL, Python, PySpark, Airflow, Kafka, Beam.
- Data Modeling Tools: ER/Studio, PowerDesigner, dbt, or similar.
- Data Governance: Microsoft Purview (formerly Azure Purview), Google Data Catalog, Collibra, Alation.
- DevOps & IaC: Terraform, GitHub Actions, Jenkins, Azure DevOps.
- Databases: SQL Server, PostgreSQL, Cosmos DB, Spanner, Firestore.
- Visualization: Power BI, Looker, Tableau.
Preferred Qualifications
- Bachelor’s or Master’s in Computer Science, Data Engineering, or a related field.
- 8–15 years of experience in data architecture and engineering.
- Certifications: Microsoft Certified: Azure Data Engineer/Architect, Google Professional Data Engineer/Architect.
- Experience working with multi-cloud and hybrid data ecosystems.
- Strong understanding of data governance, lineage, and privacy frameworks.
Pratik Kumar | Senior Talent Acquisition Specialist
Amaze Systems Inc
USA: 8951 Cypress Waters Blvd, Suite 160, Dallas, TX 75019
Canada: 55 York Street, Suite 401, Toronto, ON M5J 1R7
