
Urgent C2C requirement: Snowflake Data Engineer

Position: Snowflake Data Engineer 

Location: Chicago, IL (Onsite) – local candidates only

Duration: 6+ months

In-Person Interview

Pay Rate: $53/hr on C2C

 

Description:

 

Project Overview
This project aims to modernize our data architecture by leveraging cloud technologies, specifically Snowflake and Databricks, to enhance our data storage, processing, and analytics capabilities. It involves migrating application data from an on-premises Oracle database to a more scalable, flexible, and secure cloud-based environment, and optimizing data flows and analytics to support real-time decision-making and insights.

 


Key Responsibilities
Data Pipeline Development: 
Design, build, and manage data pipelines for the ETL process, using Airflow for orchestration and Python for scripting, to transform raw data into a format suitable for our new Snowflake data model (a minimal sketch follows this list). 
Data Integration: 
Implement and maintain data synchronization between on-premises Oracle databases and Snowflake using CDC tools. 
Support Data Modeling: 
Assist in developing and optimizing the data model for Snowflake, ensuring it supports our analytics and reporting requirements. 
Reporting Support: 
Collaborate with the data architect to ensure the data within Snowflake is structured in a way that supports efficient and insightful reporting. 
Technical Documentation: 
Create and maintain comprehensive documentation of data pipelines, ETL processes, and data models to ensure best practices are followed and knowledge is shared within the team.
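
For illustration only, here is a minimal sketch of the kind of Airflow DAG this role would own: extract from the on-premises Oracle source, transform in Python, and load into Snowflake. The DAG name, task layout, and placeholder callables are hypothetical and not part of this requirement; it assumes Airflow 2.x with the oracledb and snowflake-connector-python packages available.

```python
# Minimal sketch (not a definitive implementation): Oracle -> transform -> Snowflake,
# orchestrated with Airflow. All names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_from_oracle(**context):
    # Placeholder: pull incremental rows from the Oracle source
    # (e.g. via the oracledb driver, filtered on a watermark column).
    ...


def transform_rows(**context):
    # Placeholder: reshape raw rows to fit the target Snowflake data model
    # (types, keys, conformed dimensions) with plain Python / pandas.
    ...


def load_to_snowflake(**context):
    # Placeholder: bulk-load the transformed data, e.g. via a Snowflake stage
    # and COPY INTO, using the snowflake-connector-python package.
    ...


with DAG(
    dag_id="oracle_to_snowflake_etl",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                  # Airflow 2.4+; use schedule_interval on older 2.x
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_from_oracle)
    reshape = PythonOperator(task_id="transform", python_callable=transform_rows)
    load = PythonOperator(task_id="load", python_callable=load_to_snowflake)

    extract >> reshape >> load
```

Incremental CDC-style synchronization from Oracle, as described above, would typically sit alongside or replace the extract step; that detail is omitted here.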

 

Qualifications

 

Experience: 13+ years of total experience as a data engineer, including 7+ years of proven experience as a Snowflake data engineer.

Skills: Strong SQL skills, including writing complex queries and performance tuning; strong experience with Oracle, Snowflake, and ETL/ELT tools.

Data Engineering: Proven track record of developing and maintaining data pipelines and data integration projects. 
Orchestration Tools: Experience with Airflow for managing data pipeline workflows. 
Programming: Proficiency in Python and SQL for data processing tasks (an illustrative sketch follows the qualifications list below). 
Data Modeling: Understanding of data modeling principles and experience with data warehousing solutions.

Cloud Platforms: Knowledge of cloud infrastructure and services, preferably Azure, as it relates to Snowflake integration. 
Collaboration Tools: Experience with version control systems (like Git) and collaboration platforms. 
CI/CD Implementation: Experience using CI/CD tools to automate the deployment of data pipelines and infrastructure changes, ensuring high-quality data processing with minimal manual intervention. 
Communication: Excellent communication and teamwork skills, with a detail-oriented mindset. Strong analytical skills, with the ability to work independently and solve complex problems.

Certifications: Snowflake certification is a plus. 
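
Purely as an illustration of the SQL and Python skills listed above, the snippet below shows one way to run a reporting-style query against Snowflake from Python with the snowflake-connector-python package. The warehouse, database, schema, and orders table are made-up placeholders, not details from this posting.

```python
# Illustrative only: run a reporting query against Snowflake from Python.
# Connection details and the "orders" table are hypothetical placeholders.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",   # hypothetical warehouse
    database="ANALYTICS_DB",    # hypothetical database
    schema="REPORTING",         # hypothetical schema
)

cur = conn.cursor()
try:
    # A reporting-style aggregate over a hypothetical orders table.
    cur.execute(
        """
        SELECT order_date, SUM(amount) AS daily_revenue
        FROM orders
        GROUP BY order_date
        ORDER BY order_date
        """
    )
    for order_date, daily_revenue in cur.fetchall():
        print(order_date, daily_revenue)
finally:
    cur.close()
    conn.close()
```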

 

 

Education: A bachelor’s degree (or equivalent experience) in Computer Science, Software/Electronics Engineering, Information Systems, or a closely related field is required.


Thanks and regards,

Ganesh Gorak

Itech Us Inc

Please share the resume at ganesh.g@itechus.net


 



