
Urgent C2C Requirement :: Lead Data Engineer :: Houston, TX (On-site)

Lead Data Engineer

Location:  Houston, TX – On-site

Exp: 13+ Years

Looking for local candidates in the Houston area

C2C (H-1B OK)

 

Role Overview

We are seeking a Lead Data Engineer with deep AWS expertise to guide the design, development, and optimization of our enterprise-scale data pipelines and products. In this role, you will not only contribute technically but also provide leadership to a team of data engineers, partner closely with data architects, and play a key role in planning, estimating, and resourcing major data initiatives. You’ll work on high-impact projects that integrate and transform large volumes of data from multiple enterprise systems into reliable, accessible, and high-quality data products that power analytics, reporting, and decision-making across the organization. 

 

Key Responsibilities:

·        Lead the end-to-end design, development, and optimization of scalable data pipelines and products on AWS, leveraging services such as S3, Glue, Redshift, Athena, EMR, and Lambda.

·        Provide day-to-day technical leadership and mentorship to a team of data engineers—setting coding standards, reviewing pull requests, and fostering a culture of engineering excellence.

·        Partner with data architects to define target data models, integration patterns, and platform roadmaps that align with AECOM’s enterprise data strategy.

·        Own project planning, estimation, resourcing, and sprint management for major data initiatives, ensuring on-time, on-budget delivery.

·        Implement robust ELT/ETL frameworks, including orchestration (e.g., Airflow or AWS Step Functions), automated testing, and CI/CD pipelines to enable rapid, reliable deployments.

·        Champion data quality, governance, and security; establish monitoring, alerting, and incident-response processes that keep data products highly available and trustworthy.

·        Optimize performance and cost across storage, compute, and network layers; conduct periodic architecture reviews and tuning exercises.

·        Collaborate with analytics, reporting, and business teams to translate requirements into reliable, production-ready data assets that power decision-making at scale.

·        Stay current with the AWS ecosystem and industry best practices, continuously evaluating new services and technologies to enhance the customer’s data platform.

·        Provide clear, concise communication to stakeholders at all levels, articulating trade-offs, risks, and recommendations in business-friendly language.

 

Requirements

Qualifications

Minimum Requirements:

·        Bachelor’s or Master’s degree in Computer Science, Information Systems, Engineering, or a related discipline, plus at least 8 years of hands-on data engineering experience (or a demonstrated equivalent combination of education and experience).

·        3+ years in a technical-lead or team-lead capacity delivering enterprise-grade solutions.

·        Deep expertise in AWS data and analytics services, e.g., S3, Glue, Redshift, Athena, EMR/Spark, Lambda, IAM, and Lake Formation.

·        Proficiency in Python/PySpark or Scala for data engineering, along with advanced SQL for warehousing and analytics workloads.

·        Demonstrated success designing and operating large-scale ELT/ETL pipelines, data lakes, and dimensional/columnar data warehouses.

·        Experience with workflow orchestration (e.g., Airflow, Step Functions) and modern DevOps practices—CI/CD, automated testing, and infrastructure-as-code (e.g., Terraform or CloudFormation).

·        Experience with data lakehouse architectures and frameworks (e.g., Apache Iceberg).

·        Experience integrating with enterprise (on-prem and SaaS) systems such as Oracle E-Business Suite, Salesforce, and Workday.

·        Strong communication, stakeholder-management, and documentation skills; aptitude for translating business needs into technical roadmaps.

 

Preferred Qualifications:

·        Solid understanding of data modeling, data governance, security best practices (encryption, key management), and compliance requirements.

·        Experience working within similarly large, complex organizations.

·        Experience building integrations for enterprise back-office applications.

·        AWS Certified Data Analytics – Specialty or AWS Solutions Architect certification (or equivalent) preferred; experience with other cloud platforms is a plus.

·        Proficiency in modern data storage formats and table management systems, with a strong understanding of Apache Iceberg for managing large-scale datasets and Parquet for efficient, columnar data storage.

·        In-depth knowledge of data cataloging, metadata management, and lineage tools (AWS Glue Data Catalog, Apache Atlas, Amundsen) to bolster data discovery and governance.

·        Knowledge of how machine learning models are developed, trained, and deployed, as well as the ability to design data pipelines that support these processes.

·        Experience migrating on-prem data sources onto AWS.

·        Experience building high-quality data products.


Navneet Singh
US Technical Recruiter, Fast Hire INC

 navneet@fasthireinc.com


