
Data Modeler (C2C) with GCP :: Issaquah, WA

Contract

C2C requirements

Position: Data Modeler with GCP
Location: Issaquah, WA (Fully Onsite)

Terms: Long-Term Contract

 

Job Summary:

We are seeking an experienced Data Modeler to design, implement, and optimize scalable data models across cloud platforms (Snowflake, Azure Synapse, and GCP). The ideal candidate will have strong experience in data architecture, dimensional modeling, metadata management, and data governance using tools such as ERwin or ER/Studio. The role involves close collaboration with data engineers, architects, and business stakeholders to ensure data structures meet analytical and reporting needs.

 

Key Responsibilities:

Design and develop conceptual, logical, and physical data models to support data warehouse, data lake, and analytics initiatives.
Work with Snowflake, Azure Synapse Analytics, and Google Cloud BigQuery environments to implement optimized, scalable data structures.
Use ERwin Data Modeler (or equivalent tools) for schema design, version control, and metadata management.
Define and enforce data modeling standards, naming conventions, and best practices across the organization.
Collaborate with data engineering teams to translate models into ETL/ELT pipelines and ensure alignment with business requirements.
Optimize models for performance, scalability, and cost-efficiency in multi-cloud environments.
Participate in data governance initiatives including data lineage, cataloging, and master data management.
Work with data architects to design and integrate data from multiple sources (on-premises and cloud).
Conduct impact analysis for data model changes and manage schema evolution.
Support BI and analytics teams by ensuring high-quality, well-documented, and trusted data structures.
 

Required Skills & Qualifications:

Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
5+ years of experience as a Data Modeler / Data Architect in enterprise-scale environments.
Strong expertise in data modeling techniques — conceptual, logical, and physical.
Proven hands-on experience with:
Snowflake (schemas, views, data sharing, clustering, time travel, etc.)
Azure Synapse Analytics (dedicated SQL pools, data lake integration, pipelines)
Google Cloud Platform (BigQuery / Data Fusion / Cloud Storage)
ERwin Data Modeler (preferred) or ER/Studio.
Proficiency in SQL and understanding of data warehouse architectures (star schema, snowflake schema, normalized/denormalized models).
Experience with data governance, metadata management, and data cataloging tools (e.g., Collibra, Alation, or Purview).
Understanding of ETL/ELT design patterns, data quality, and data lineage.
Strong communication and documentation skills to collaborate with technical and business teams.

To apply for this job, email your details to mayank.jaiswal@amaze-systems.com.