
Snowflake DBT Engineer (C2C) Remote

Contract


Snowflake DBT Engineer

Client: LTIMindtree

Location: Edison, NJ (Remote)


Job Description: Senior Data Engineer (ETL/ELT)

Role Overview

We are seeking a Data Engineer to lead the design and implementation of scalable, serverless data pipelines on AWS. In this role, you will function as a critical link between raw data sources and downstream consumers, ensuring the delivery of high-quality, curated datasets. You will collaborate within a cross-functional team to empower data modelers and the CRM and Sales teams with reliable data assets.

Responsibilities

What you will do

Analyze diverse source data systems and requirements to design robust ETL and ELT strategies

Ingest and stage data into Amazon S3, creating a foundation for scalable processing

Develop and maintain serverless pipelines that are fault-tolerant, scalable, and distributed across multiple zones and regions

Build curated data layers specifically optimized for downstream data modeling and business applications such as CRM and Sales

Main Responsibilities

Implement secure cloud architectures utilizing public/private subnets, firewalls, and PostgreSQL on Amazon RDS

Apply advanced data engineering practices, including data lineage, delta processing, and complex partitioning

Manage the end-to-end data lifecycle, from initial ingestion to long-term archiving and retention

Optimize SQL queries and data workflows to ensure high performance and cost-efficiency

Core Skills and Methods

Cloud Orchestration: AWS Step Functions, Lambda, and Glue for serverless automation

Data Processing: PySpark, EMR, and Python/Pandas for large-scale distributed computing and data wrangling

Warehousing: Snowflake architecture, including data loading, transformation, and ELT patterns

Database Mastery: Expert-level SQL, SQL optimization, and familiarity with DBA tools for PostgreSQL/RDS

Engineering Methods: Aggregation, complex transformations, partitioning, and incremental loading

Requirements

Proven experience building production-grade ETL/ELT pipelines within the AWS ecosystem

Deep technical proficiency in Snowflake, AWS Glue, and Spark-based processing

Strong understanding of network security, including VPC configurations and secure data access

Ability to work collaboratively in a team environment to support downstream data consumers

What Makes You a Great Fit

Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related technical field

Experience: Approximately 5 years of hands-on experience in data engineering, with a focus on high-volume environments

Certification: AWS Certified Data Engineer or Solutions Architect Professional certification is highly preferred

Expertise: A hands-on builder mindset with a track record of implementing multi-region, highly available architectures

Additional Skills: Familiarity with Infrastructure as Code (Terraform/CDK), CI/CD for data, and advanced data governance frameworks

Skills

Mandatory Skills: Snowflake


To apply for this job, email your details to neha.chaudhary@compunnel.com
