Position: Banking Domain Data Engineer (15+ Years)
Location: Remote
Type: Contract
Role Overview
We are seeking a highly skilled Data Engineer with strong experience in Snowflake and Python to support the design, modernization, and optimization of compliance data platforms.
This role focuses on building scalable data models, migrating legacy data pipelines to a streamlined target state, and implementing robust ETL workflows within Snowflake. The ideal candidate brings a strong understanding of data architecture principles, regulatory and compliance data requirements, and hands-on engineering capability.
Key Responsibilities
Data Modeling & Architecture (Compliance Focus)
Design and implement scalable data models to support compliance, regulatory, and risk reporting needs
Develop logical and physical data models within Snowflake (an illustrative DDL sketch follows this list)
Define data domains, lineage, and element-level mapping across systems
Ensure data consistency, traceability, and auditability
Align data architecture with enterprise governance standards
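For illustration only, a minimal sketch of the kind of physical model this work involves: one compliance fact table carrying explicit lineage and audit columns, created through the Snowflake Python connector. Every database, schema, table, and column name below is hypothetical.

```python
# Minimal, illustrative sketch: a physical compliance table with lineage and
# audit columns. All object names and credentials are hypothetical placeholders.
import snowflake.connector

DDL = """
CREATE TABLE IF NOT EXISTS COMPLIANCE_DB.CORE.FCT_REG_TRANSACTION (
    TRANSACTION_SK   NUMBER IDENTITY,                  -- surrogate key
    SOURCE_SYSTEM    VARCHAR(50)  NOT NULL,            -- lineage: originating system
    SOURCE_RECORD_ID VARCHAR(100) NOT NULL,            -- lineage: element-level trace
    BOOKING_DATE     DATE         NOT NULL,
    AMOUNT           NUMBER(18,2) NOT NULL,
    CURRENCY_CODE    CHAR(3)      NOT NULL,
    LOAD_BATCH_ID    VARCHAR(36)  NOT NULL,            -- auditability: which run loaded the row
    LOADED_AT        TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
)
"""

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",  # placeholder credentials
    warehouse="COMPLIANCE_WH", database="COMPLIANCE_DB", schema="CORE",
)
try:
    conn.cursor().execute(DDL)
finally:
    conn.close()
```

The lineage and batch columns are what make downstream traceability and audit queries possible; the later sketches key off them.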
Current State & Target State Analysis
Perform detailed analysis of existing data pipelines at the data element level
Document source-to-target mappings and transformation logic (see the mapping sketch after this list)
Identify inefficiencies, redundancies, and data quality gaps
Design streamlined target-state architecture and optimized data flows
Support data rationalization and consolidation initiatives
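A sketch of how element-level source-to-target mappings could be captured as structured metadata rather than free text, so they remain queryable for lineage and audit purposes. The record shape, the sample legacy system, and the sample rule are assumptions, reusing the hypothetical target table from the earlier sketch.

```python
# Illustrative sketch: element-level source-to-target mapping as structured
# metadata. Field names and the sample entry are assumptions.
from dataclasses import dataclass

@dataclass
class ElementMapping:
    source_system: str       # legacy system of record
    source_table: str
    source_column: str
    target_table: str        # Snowflake target object
    target_column: str
    transformation: str      # documented transformation logic
    quality_rule: str        # validation applied to this element

MAPPINGS = [
    ElementMapping(
        source_system="LEGACY_GL",
        source_table="GL_TXN",
        source_column="TXN_AMT",
        target_table="COMPLIANCE_DB.CORE.FCT_REG_TRANSACTION",
        target_column="AMOUNT",
        transformation="CAST(TXN_AMT AS NUMBER(18,2))",
        quality_rule="NOT NULL AND AMOUNT >= 0",
    ),
]
```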
ETL / ELT Engineering (Hands-On)
Develop and optimize ETL/ELT pipelines using Python and Snowflake
Implement transformation logic using Snowflake SQL and Python-based frameworks
Build scalable ingestion and transformation workflows
Optimize performance and cost within Snowflake
Implement incremental loading and validation strategies (see the MERGE sketch below)
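A minimal sketch of the incremental-loading pattern this section describes: a delta load from a staging table into the target via a Snowflake MERGE, keyed on the lineage columns. Staging and target names carry over from the hypothetical model above.

```python
# Illustrative sketch: incremental (delta) load into the target table via
# MERGE, keyed on lineage columns. Object names are hypothetical.
import snowflake.connector

MERGE_SQL = """
MERGE INTO COMPLIANCE_DB.CORE.FCT_REG_TRANSACTION AS tgt
USING COMPLIANCE_DB.STAGING.STG_GL_TXN AS src
   ON tgt.SOURCE_SYSTEM = src.SOURCE_SYSTEM
  AND tgt.SOURCE_RECORD_ID = src.SOURCE_RECORD_ID
WHEN MATCHED THEN UPDATE SET
    tgt.BOOKING_DATE  = src.BOOKING_DATE,
    tgt.AMOUNT        = src.AMOUNT,
    tgt.CURRENCY_CODE = src.CURRENCY_CODE,
    tgt.LOAD_BATCH_ID = src.LOAD_BATCH_ID
WHEN NOT MATCHED THEN INSERT
    (SOURCE_SYSTEM, SOURCE_RECORD_ID, BOOKING_DATE, AMOUNT, CURRENCY_CODE, LOAD_BATCH_ID)
    VALUES
    (src.SOURCE_SYSTEM, src.SOURCE_RECORD_ID, src.BOOKING_DATE, src.AMOUNT,
     src.CURRENCY_CODE, src.LOAD_BATCH_ID)
"""

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",  # placeholders
    warehouse="COMPLIANCE_WH",
)
try:
    conn.cursor().execute(MERGE_SQL)
finally:
    conn.close()
```

MERGE on a stable business key is one common way to make reloads idempotent, which also helps control compute cost by touching only changed rows.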
Data Quality & Controls
Implement validation checks and reconciliation processes (see the reconciliation sketch after this list)
Ensure regulatory-grade accuracy and completeness
Support audit and compliance requirements
Maintain documentation for data lineage and transformations
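A sketch of a post-load reconciliation control: compare row counts and amount totals between staging and target for one batch, and fail loudly on any mismatch. Object names and the batch key are carried over from the hypothetical sketches above; the thresholds and failure handling are assumptions.

```python
# Illustrative sketch: row-count and amount-total reconciliation between
# staging and target for one load batch. Names and rules are hypothetical.
import snowflake.connector

RECON_SQL = """
SELECT
    (SELECT COUNT(*) FROM COMPLIANCE_DB.STAGING.STG_GL_TXN
      WHERE LOAD_BATCH_ID = %(batch_id)s) AS SRC_ROWS,
    (SELECT COUNT(*) FROM COMPLIANCE_DB.CORE.FCT_REG_TRANSACTION
      WHERE LOAD_BATCH_ID = %(batch_id)s) AS TGT_ROWS,
    (SELECT COALESCE(SUM(AMOUNT), 0) FROM COMPLIANCE_DB.STAGING.STG_GL_TXN
      WHERE LOAD_BATCH_ID = %(batch_id)s) AS SRC_AMOUNT,
    (SELECT COALESCE(SUM(AMOUNT), 0) FROM COMPLIANCE_DB.CORE.FCT_REG_TRANSACTION
      WHERE LOAD_BATCH_ID = %(batch_id)s) AS TGT_AMOUNT
"""

def reconcile(conn: "snowflake.connector.SnowflakeConnection", batch_id: str) -> None:
    src_rows, tgt_rows, src_amt, tgt_amt = conn.cursor().execute(
        RECON_SQL, {"batch_id": batch_id}
    ).fetchone()
    if src_rows != tgt_rows or src_amt != tgt_amt:
        # Regulatory-grade pipelines fail loudly and leave an audit trail.
        raise ValueError(
            f"Reconciliation failed for batch {batch_id}: "
            f"rows {src_rows} vs {tgt_rows}, amount {src_amt} vs {tgt_amt}"
        )
```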
Required Qualifications
Technical Skills
Strong programming expertise in Python
Hands-on experience with Snowflake (data modeling, performance tuning, query optimization)
Advanced SQL skills
Experience building production-grade ETL/ELT pipelines
Strong understanding of data modeling (3NF, dimensional modeling)
Experience working with large, complex datasets
Architecture & Analysis
Experience performing source-to-target mapping at the data element level
Understanding of data lineage and metadata management
Ability to analyze and redesign legacy pipelines into modern architectures
Familiarity with compliance or regulatory data environments (preferred)
Preferred Qualifications
Experience in financial services, risk, or regulatory reporting environments
Experience with workflow orchestration tools (Airflow or similar)
Familiarity with data quality frameworks
Exposure to cloud platforms (AWS/Azure/GCP)
Experience integrating structured and semi-structured data