Data Engineer
Dallas, TX (Hybrid)
In-person interview required; local candidates only.

The Engagement: High-Stakes Optimization
You will partner with a premier financial services client to rescue and modernize their data ecosystem. The client needs “people with brains” and a high level of ownership to manage critical Case Authorization and Data Reconciliation workflows.
Your mandate is twofold:
- Technical Execution: Optimize heavy PySpark jobs on AWS that are currently “running over” their scheduled windows due to data volume and complexity.
- Consultative Leadership: Bridge the gap between the data platform and the business units, ensuring our technology directly drives business value.
The Consultative Mandate (Soft Skills & Leadership)
- Executive Presence & Communication: You possess excellent communication skills, with the ability to “speak well” and present technical concepts to non-technical business partners. You are comfortable driving meetings, not just attending them.
- Radical Ownership: We value capability over years of experience. You must demonstrate a high level of ownership over production systems. When a job fails, you own the resolution, the root cause analysis, and the stakeholder communication.
- Outcome-Driven Mindset: You are not a task-taker. You act as a strategic advisor, helping the client prioritize the “right” work, the work that drives the most value.
- Emotional Intelligence: You can navigate high-pressure environments with grace, managing expectations and pushing back on unrealistic requirements constructively.
- Mentorship: As a “mature” engineer, you naturally elevate the team around you, modeling best practices in code quality and error handling.
The Technical Core (Hard Skills)
- Advanced PySpark Optimization: You are an expert in Spark performance tuning. You can take complex business logic and optimize it for scale on AWS EMR and AWS Glue.
- Data Reliability & Reconciliation: You have experience building rigorous reconciliation frameworks to ensure data consistency between REST API sources and Snowflake targets.
- Snowflake: You have deep experience modeling and loading data into Snowflake (specifically mTw environments).
- Resilient Orchestration: You are proficient in Airflow, capable of designing DAGs with proper error handling to catch issues before they impact the business.
- Java Ecosystem Fluency: While the work is Python-based, you are comfortable working within a standard Java/AWS stack.
What You’ll Bring
- Proven Consulting Experience: Prior experience working in a consultative capacity (e.g., Big 4, boutique firm, or high-velocity agile environments) is highly preferred. You understand the “client-first” mentality.
- Optimization Expertise: A track record of taking slow, inefficient pipelines and making them performant.
- Business Acumen: The ability to understand the business logic (e.g., Case Authorization rules), not just the syntax.
Thanks & Regards
Mohammad Faisal
