Zenmid Sols
We are seeking a highly skilled Principal / Senior Data Engineer with strong hands-on expertise in data platform engineering and architecture. The ideal candidate will design, build, and optimize scalable data platforms on Microsoft Azure, applying modern lakehouse architectures and enterprise data warehousing principles.
Key Responsibilities
Azure Cloud & Platform Engineering
Design and implement scalable solutions on Microsoft Azure (ADLS Gen2, Azure Data Factory, Synapse, Azure SQL, Key Vault, App Services)
Architect cloud-native data platforms with a focus on scalability, security, and cost optimization
Implement security and governance best practices (RBAC, Managed Identities, Private Endpoints)
Optimize cloud resource utilization and performance
Databricks & Lakehouse Development
Lead end-to-end development using Azure Databricks
Implement Lakehouse architecture using Delta Lake
Develop pipelines using Delta Live Tables (DLT) and manage governance via Unity Catalog
Perform cluster sizing, workload optimization, and performance tuning
Implement CI/CD pipelines for Databricks workloads
Big Data & Data Processing
Build scalable batch and streaming pipelines using Apache Spark (PySpark)
Develop Structured Streaming solutions
Optimize Delta Lake storage and query performance
Implement distributed data processing best practices
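The streaming responsibilities above center on incremental, stateful computation over unbounded input. As a conceptual sketch only (plain Python, not actual Spark Structured Streaming; all names are illustrative), this shows the core idea of folding micro-batches into running per-key aggregates:

```python
def process_micro_batch(state: dict, batch: list[dict]) -> dict:
    """Fold one micro-batch of events into running per-key counts and sums,
    the stateful-aggregation idea behind a streaming groupBy().agg()
    (simplified: no watermarks, no checkpointing, no fault tolerance)."""
    for event in batch:
        key = event["user"]
        entry = state.setdefault(key, {"events": 0, "total": 0.0})
        entry["events"] += 1
        entry["total"] += event["value"]
    return state

# Two micro-batches arrive; state accumulates across them.
state: dict = {}
for batch in [
    [{"user": "a", "value": 1.0}, {"user": "b", "value": 2.0}],
    [{"user": "a", "value": 3.0}],
]:
    state = process_micro_batch(state, batch)
print(state)
```

In a real Structured Streaming job the engine manages this state for you, with checkpointing and watermarks handling recovery and late data.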
Programming & Query Optimization
Advanced development in Python for data engineering and automation
Expertise in PySpark for distributed data transformation
Strong SQL skills including complex query design, performance tuning, and analytics engineering
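One pattern behind the SQL skills listed above is deduplicating change records to the latest version per key. A minimal plain-Python analogue of `ROW_NUMBER() OVER (PARTITION BY key ORDER BY ts DESC) = 1` (field names are illustrative):

```python
from typing import Iterable

def latest_per_key(rows: Iterable[dict], key: str, order_by: str) -> list[dict]:
    """Keep only the most recent row per key, mirroring the SQL pattern
    ROW_NUMBER() OVER (PARTITION BY key ORDER BY order_by DESC) = 1."""
    latest: dict = {}
    for row in rows:
        k = row[key]
        if k not in latest or row[order_by] > latest[k][order_by]:
            latest[k] = row
    return list(latest.values())

# Customer 1 appears twice; only the newer version survives.
rows = [
    {"customer_id": 1, "updated_at": "2024-01-01", "status": "new"},
    {"customer_id": 1, "updated_at": "2024-02-01", "status": "active"},
    {"customer_id": 2, "updated_at": "2024-01-15", "status": "new"},
]
print(latest_per_key(rows, "customer_id", "updated_at"))
```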
Data Architecture & Modeling
Design and implement Enterprise Data Warehouse (EDW) solutions
Apply dimensional modeling techniques (Star & Snowflake schemas)
Implement Data Vault 2.0 modeling frameworks
Build metadata-driven ingestion frameworks
Implement Change Data Capture (CDC)
Design and manage Medallion Architecture (Bronze/Silver/Gold layers)
Establish data lineage, cataloging, and governance frameworks
Support Master Data Management (MDM) initiatives
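At the heart of the CDC and Medallion responsibilities above is merge semantics: applying insert/update/delete change events to a target table, which Delta Lake's MERGE INTO performs at scale. A hedged sketch using plain dicts in place of tables (all names and the `op` encoding are illustrative):

```python
def apply_cdc(target: dict, changes: list[dict], key: str = "id") -> dict:
    """Apply CDC events (op = 'I' insert, 'U' update, 'D' delete) to a
    key-indexed target, mirroring MERGE INTO semantics:
    WHEN MATCHED AND op='D' THEN DELETE, WHEN MATCHED THEN UPDATE,
    WHEN NOT MATCHED THEN INSERT."""
    result = dict(target)
    for event in changes:
        k = event[key]
        if event["op"] == "D":
            result.pop(k, None)
        else:  # 'I' and 'U' both behave as an upsert
            result[k] = {f: v for f, v in event.items() if f != "op"}
    return result

target = {1: {"id": 1, "name": "Ada"}, 2: {"id": 2, "name": "Grace"}}
changes = [
    {"op": "U", "id": 1, "name": "Ada L."},
    {"op": "D", "id": 2},
    {"op": "I", "id": 3, "name": "Edsger"},
]
print(apply_cdc(target, changes))
```

In a Medallion layout the same logic runs when promoting raw change feeds from Bronze into a current-state Silver table.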
Data Products & Analytics Enablement
Design and deliver scalable, reusable data products
Build business-aligned semantic layers
Develop KPI frameworks to support enterprise reporting
Integrate ERP, SaaS, and operational systems
Architect hybrid Lakehouse + EDW environments
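A KPI framework like the one described above keeps metric definitions in one place so every report computes them identically. A minimal sketch, assuming metrics defined as metadata over fact records (all names and metrics are illustrative):

```python
# KPI definitions as metadata: one authoritative formula per metric.
KPI_DEFINITIONS = {
    "total_revenue": lambda facts: sum(f["amount"] for f in facts),
    "order_count": lambda facts: len(facts),
    "avg_order_value": lambda facts: (
        sum(f["amount"] for f in facts) / len(facts) if facts else 0.0
    ),
}

def compute_kpis(facts: list[dict]) -> dict:
    """Evaluate every registered KPI over the same fact set, so all
    downstream reports share one consistent definition per metric."""
    return {name: metric(facts) for name, metric in KPI_DEFINITIONS.items()}

orders = [{"amount": 100.0}, {"amount": 250.0}, {"amount": 50.0}]
print(compute_kpis(orders))
```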
DevOps & Engineering Excellence
Implement CI/CD pipelines using Azure DevOps, GitHub Actions, or Bitbucket Pipelines
Develop Infrastructure as Code using Terraform or ARM templates
Build automated testing frameworks (unit, integration, data quality)
Implement monitoring, logging, and observability solutions
Work within Agile and Scrum environments
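As a sketch of the data-quality side of the testing responsibilities above, here is a minimal rule-based checker in the spirit of (but not using) frameworks such as Great Expectations; all names are illustrative:

```python
def check_not_null(rows: list[dict], column: str) -> list[str]:
    """Return one failure message per row where `column` is missing or None."""
    return [
        f"row {i}: {column} is null"
        for i, row in enumerate(rows)
        if row.get(column) is None
    ]

def check_unique(rows: list[dict], column: str) -> list[str]:
    """Return failure messages for duplicated values in `column`."""
    seen: set = set()
    failures = []
    for i, row in enumerate(rows):
        value = row.get(column)
        if value in seen:
            failures.append(f"row {i}: duplicate {column}={value!r}")
        seen.add(value)
    return failures

rows = [{"id": 1, "email": "a@x.com"}, {"id": 1, "email": None}]
failures = check_not_null(rows, "email") + check_unique(rows, "id")
print(failures)
```

Checks like these typically run as a pipeline gate, failing the job before bad data reaches downstream layers.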
Strategic & Leadership Responsibilities (Principal Level)
Define and lead enterprise data architecture strategy
Engage with cross-functional stakeholders to translate business needs into scalable data solutions
Drive technical roadmap planning and execution
Lead large-scale migration programs from legacy EDW platforms (Teradata, Oracle, SQL Server) to modern Lakehouse architectures
Implement governance frameworks aligned with GDPR, SOX, and enterprise compliance standards
Optimize large-scale distributed systems for performance and cost efficiency
To apply for this job email your details to vivek@zenmidsols.com