Data Architect

Chicago or Michigan
Long Term Contract

We are seeking a Data Architect to design, build, and optimize our modern data ecosystem. The ideal candidate will have deep experience with AWS cloud services, Snowflake, and dbt, along with a strong understanding of scalable data architecture, ETL/ELT development, and data modeling best practices.

________________________________________

Key Responsibilities

• Architect, design, and implement scalable, reliable, and secure data solutions using AWS, Snowflake, and dbt.
• Develop end-to-end data pipelines (batch and streaming) to support analytics, machine learning, and business intelligence needs.
• Lead the modernization and migration of legacy data systems to cloud-native architectures.
• Define and enforce data engineering best practices, including coding standards, CI/CD, testing, and monitoring.
• Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and translate them into technical solutions.
• Optimize Snowflake performance through query tuning, warehouse sizing, and cost management.
• Establish and maintain data governance, security, and compliance standards across the data platform.
• Mentor and guide junior data engineers, providing technical leadership and direction.

________________________________________

Required Skills & Qualifications

• 12+ years of experience in Data Engineering, with at least 3 years in a cloud-native data environment.
• Hands-on expertise in AWS services such as S3, Glue, Lambda, Step Functions, Redshift, and IAM.
• Strong experience with Snowflake – data modeling, warehouse design, performance optimization, and cost governance.
• Proven experience with dbt (data build tool) – model development, documentation, and deployment automation.
• Proficiency in SQL, Python, and ETL/ELT pipeline development.
• Experience with CI/CD pipelines, version control (Git), and workflow orchestration tools (Airflow, Dagster, Prefect, etc.).
• Familiarity with data governance and security best practices, including role-based access control and data masking.
• Strong understanding of data modeling techniques (Kimball, Data Vault, etc.) and data architecture principles.

________________________________________

Preferred Qualifications

• AWS Certification (e.g., AWS Certified Data Analytics – Specialty, Solutions Architect).
• Strong communication and collaboration skills, with a track record of working in agile environments.

 

 

Munesh

770-838-3829

munesh@cysphere.net

munesh.reddy.us@gmail.com

CYBER SPHERE LLC

 

