Senior Snowflake Data Engineer
Location: Fort Mill, SC (Hybrid)
Position type: Contract
Job Description
· Snowflake, Snowpark: The candidate should have a deep understanding of the Snowflake data warehousing platform and be proficient in using Snowpark for data processing and analytics (a minimal Snowpark sketch follows this list).
· AWS services (Airflow): The candidate should have hands-on experience with AWS services, particularly Apache Airflow for orchestrating complex data workflows and pipelines.
· AWS services (Lambda): Proficiency in AWS Lambda for serverless computing and event-driven architecture is essential for this role.
· AWS services (Glue): The candidate should be well-versed in AWS Glue for ETL (Extract, Transform, Load) processes and data integration.
· Python: Strong programming skills in Python are required for developing data pipelines, data transformations, and automation tasks.
· dbt: Experience with dbt (data build tool) for data modeling and building data transformation pipelines is a plus.
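For illustration only, here is a minimal Snowpark (Python) sketch of the kind of transformation work this role describes. The connection parameters and table names are hypothetical placeholders, not details from this posting.

from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

# Hypothetical connection parameters -- in practice these come from your own config or secrets manager.
connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}

session = Session.builder.configs(connection_parameters).create()

# Read a hypothetical raw orders table, filter completed orders, aggregate daily revenue, and persist the result.
orders = session.table("RAW.ORDERS")
daily_revenue = (
    orders.filter(col("STATUS") == "COMPLETE")
          .group_by(col("ORDER_DATE"))
          .agg(sum_(col("AMOUNT")).alias("TOTAL_AMOUNT"))
)
daily_revenue.write.mode("overwrite").save_as_table("ANALYTICS.DAILY_REVENUE")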
Responsibilities:
· Design, develop, and maintain data pipelines and ETL processes using Snowflake, AWS services, Python, and dbt (a minimal Airflow orchestration sketch follows this list).
· Collaborate with data scientists and analysts to understand data requirements and implement solutions.
· Optimize data workflows for performance, scalability, and reliability.
· Troubleshoot and resolve data-related issues in a timely manner.
· Stay updated on the latest technologies and best practices in data engineering.
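As a rough illustration of the orchestration side of these responsibilities, below is a minimal Apache Airflow DAG sketch (Airflow 2.x style) that triggers a dbt run. The DAG id, schedule, and dbt project path are assumptions for the example, not details from this posting.

from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical daily pipeline: dag_id, schedule, and project path are placeholders.
with DAG(
    dag_id="daily_snowflake_transformations",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    run_dbt_models = BashOperator(
        task_id="run_dbt_models",
        # Runs dbt transformations against Snowflake; the project directory is an assumed example.
        bash_command="dbt run --project-dir /opt/airflow/dbt",
    )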
Qualifications:
· Bachelor’s degree in Computer Science, Engineering, or a related field.
· Proven experience in data engineering roles with a focus on Snowflake, AWS services, Python, and dbt.
· Strong analytical and problem-solving skills.
· Excellent communication and teamwork abilities.
· AWS certifications (e.g., AWS Certified Data Analytics – Specialty) are a plus.
To apply for this job, email your details to darshan@nytpcorp.com