
Data Engineer || Dallas, TX (Hybrid) || In-person interview

In-person interview required
Local candidates only
The Engagement: High-Stakes Optimization

You will partner with a premier financial services client to rescue and revolutionize their data ecosystem. The client needs “people with brains” and a high level of ownership to manage critical Case Authorization and Data Reconciliation workflows.

Your mandate is dual-threaded:

  1. Technical Execution: Optimize heavy PySpark jobs on AWS that are currently “running over” their processing windows due to data volume and complexity.
  2. Consultative Leadership: Bridge the gap between the data platform and the business units, ensuring our technology directly drives business value.

The Consultative Mandate (Soft Skills & Leadership)

  • Executive Presence & Communication: You must possess excellent communication skills, with the ability to “speak well” and present technical concepts to non-technical business partners. You are comfortable driving meetings, not just attending them.
  • Radical Ownership: We value capability over years of experience. You must demonstrate a high level of ownership over production systems. When a job fails, you own the resolution, the root cause analysis, and the stakeholder communication.
  • Outcome-Driven Mindset: You are not a task-taker. You act as a strategic advisor, helping the client prioritize the “right” work to drive the most value.
  • Emotional Intelligence: You can navigate high-pressure environments with grace, managing expectations and pushing back on unrealistic requirements constructively.
  • Mentorship: As a “mature” engineer, you naturally elevate the team around you, modeling best practices in code quality and error handling.

The Technical Core (Hard Skills)

  • Advanced PySpark Optimization: You are an expert in Spark performance tuning. You can take complex business logic and optimize it for scale on AWS EMR and AWS Glue.
  • Data Reliability & Reconciliation: You have experience building rigorous reconciliation frameworks to ensure data consistency between REST API sources and Snowflake targets.
  • Snowflake: You have deep experience modeling and loading data into Snowflake (specifically mTw environments).
  • Resilient Orchestration: You are proficient in Airflow, capable of designing DAGs with proper error handling to catch issues before they impact the business.
  • Java Ecosystem Fluency: While the work is Python-based, you are comfortable working within a standard Java/AWS stack.
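To give candidates a feel for the reconciliation work described above, here is a minimal, hypothetical sketch of key-level reconciliation between a source extract (e.g., from a REST API) and a target extract (e.g., from Snowflake). The function and field names (`reconcile`, `case_id`) are illustrative only and are not taken from the client's actual systems; a production framework would add volume checks, tolerances, and alerting.

```python
# Illustrative sketch: row-level reconciliation between a source extract
# and a target extract, keyed on a business identifier ("case_id" here
# is a hypothetical key, not the client's actual schema).

def reconcile(source_rows, target_rows, key="case_id"):
    """Compare two lists of dict records and report discrepancies."""
    src = {row[key]: row for row in source_rows}
    tgt = {row[key]: row for row in target_rows}

    return {
        # Keys present in the source but never landed in the target
        "missing_in_target": sorted(src.keys() - tgt.keys()),
        # Keys in the target with no corresponding source record
        "extra_in_target": sorted(tgt.keys() - src.keys()),
        # Keys present in both where the row contents have drifted
        "mismatched": sorted(
            k for k in src.keys() & tgt.keys() if src[k] != tgt[k]
        ),
    }

source = [
    {"case_id": 1, "status": "approved"},
    {"case_id": 2, "status": "pending"},
    {"case_id": 3, "status": "denied"},
]
target = [
    {"case_id": 1, "status": "approved"},
    {"case_id": 2, "status": "approved"},  # value drifted in the target
]

report = reconcile(source, target)
print(report)
```

The same comparison logic scales up naturally: in a PySpark job the set operations become anti-joins and a full outer join with a column-wise equality check.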

What You’ll Bring

  • Proven Consulting Experience: Prior experience working in a consultative capacity (e.g., Big 4, boutique firm, or high-velocity agile environments) is highly preferred. You understand the “client-first” mentality.
  • Optimization Expertise: A track record of taking slow, inefficient pipelines and making them performant.
  • Business Acumen: The ability to understand the business logic (e.g., Case Authorization rules), not just the syntax.
Thanks & Regards
Mohammad Faisal
