Snowflake DBT Data Engineer
Contract
Location: Irvine, CA (5 days onsite)
Key Responsibilities
Design, develop, and maintain ELT pipelines using Snowflake and DBT
Build and optimize data models in Snowflake to support analytics and reporting
Implement modular, testable SQL transformations using DBT
Integrate DBT workflows into CI/CD pipelines and manage infrastructure as code using Terraform
Collaborate with data scientists, analysts, and business stakeholders to translate requirements into technical solutions
Optimize Snowflake performance through clustering, partitioning, indexing, and materialized views
Automate data ingestion and transformation workflows using Airflow or similar orchestration tools
Ensure data quality, governance, and security across pipelines
Troubleshoot and resolve performance bottlenecks and data issues
Maintain documentation for data architecture, pipelines, and operational procedures
Required Skills and Qualifications
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field
7 years of experience in data engineering, with at least 2 years focused on Snowflake and DBT
Strong proficiency in SQL and Python
Experience with cloud platforms (AWS, GCP, or Azure)
Familiarity with Git, CI/CD, and Infrastructure as Code tools (Terraform, CloudFormation)
Knowledge of data modeling (star schema, normalization) and ELT best practices
Contact Information
Email: neha.chaudhary@compunnel.com