Location: Jersey City, New Jersey. Local candidates only; candidates MUST be able to attend an onsite interview.
Client: Brown Brothers Harriman & Co
In-person interview (local candidates only; driver's license required)
Job Description
We are seeking a highly skilled Senior Data Engineer with 8+ years of hands-on experience in enterprise data engineering, including deep expertise in Apache Airflow DAG development, dbt Core modeling and implementation, and cloud-native container platforms (Kubernetes / OpenShift). This role is critical to building, operating, and optimizing scalable data pipelines that support financial and accounting platforms, including enterprise system migrations and high-volume data processing workloads.
The ideal candidate will have extensive hands-on experience in workflow orchestration, data modeling, performance tuning, and distributed workload management in containerized environments.
Key Responsibilities:
Data Pipeline & Orchestration
Design, develop, and maintain complex Airflow DAGs for batch and event-driven data pipelines
Implement best practices for DAG performance, dependency management, retries, SLA monitoring, and alerting
Optimize Airflow scheduler, executor, and worker configurations for high-concurrency workloads
dbt Core & Data Modeling
Design, develop, and maintain dbt Core models for analytics and reporting transformations
Implement dbt tests and macros to enforce data quality and promote reuse
Container Platforms (Kubernetes / OpenShift)
Deploy and manage data workloads on Kubernetes / OpenShift platforms
Design strategies for workload distribution, horizontal scaling, and resource optimization
Configure CPU/memory requests and limits, autoscaling, and pod scheduling for data workloads
Troubleshoot container-level performance issues and resource contention
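The requests-and-limits configuration described above is usually expressed as a YAML pod spec; a minimal sketch of the same fragment in Python dict form is shown below. Container name, image, and values are illustrative only.

```python
# Illustrative pod-spec fragment (dict form of the YAML manifest) showing
# CPU/memory requests and limits for a data-processing container.
worker_container = {
    "name": "dbt-worker",  # hypothetical container name
    "image": "registry.example.com/dbt-runner:latest",  # placeholder image
    "resources": {
        # requests: guaranteed baseline used by the scheduler for placement
        "requests": {"cpu": "500m", "memory": "1Gi"},
        # limits: hard ceiling; exceeding the memory limit gets the pod OOMKilled
        "limits": {"cpu": "2", "memory": "4Gi"},
    },
}


def cpu_millicores(value: str) -> int:
    """Convert a Kubernetes CPU quantity ("500m" or "2") to millicores."""
    if value.endswith("m"):
        return int(value[:-1])
    return int(float(value) * 1000)
```

Keeping the limit-to-request ratio modest is one common way to reduce node-level resource contention when many data workloads share a cluster.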
Performance & Reliability
Monitor and tune end-to-end pipeline performance across Airflow, dbt, and data platforms
Identify bottlenecks in query execution, orchestration, and infrastructure
Implement observability solutions (logs, metrics, alerts) for proactive issue detection
Ensure high availability, fault tolerance, and resiliency of data pipelines
Collaboration & Governance
Experience
10+ years of professional experience in data engineering, analytics engineering, or platform engineering roles
Proven experience designing and supporting enterprise-scale data platforms in production environments
Must-Have Technical Skills
Expert-level Apache Airflow (DAG design, scheduling, performance tuning)
Expert-level dbt Core (data modeling, testing, macros, implementation)
Strong proficiency in Python for data engineering and automation
Deep understanding of Kubernetes and/or OpenShift in production environments
Extensive experience with distributed workload management and performance optimization
Strong SQL skills for complex transformations and analytics
Cloud & Platform Experience
Experience running data platforms on cloud environments
Familiarity with containerized deployments, CI/CD pipelines, and Git-based workflows
Preferred Qualifications
Experience supporting financial services or accounting platforms
Exposure to enterprise system migrations (e.g., legacy platform to modern data stack)
Experience with data warehouses (Oracle)