
C2C (Corp-to-Corp) hiring
Job Title: Lead Data Engineer with AI/ML
Location: Los Angeles, CA (Onsite/Hybrid as per project needs)
Job Type: Contract
Job Description:
12+ years of experience as a Data Engineer or in a similar role.
Strong programming skills in Python.
Hands-on experience with PySpark and PostgreSQL.
Proven experience in building data pipelines and managing large-scale data processing.
Deep understanding of AWS services (S3, ECS, Lambda, etc.).
Proficiency with Airflow for workflow orchestration.
Experience with Docker and containerized data processing environments.
Strong communication and problem-solving skills.
Key Responsibilities:
Design and Develop Data Pipelines:
Develop and maintain scalable, efficient data pipelines using Python and PySpark to process large datasets.
Cloud Integration:
Work with AWS services including S3, CloudWatch, ECS, ECR, and Lambda to implement and manage cloud-based data solutions.
Database Management:
Design, implement, and optimize PostgreSQL databases, ensuring performance, availability, and security.
Workflow Orchestration:
Utilize Apache Airflow for scheduling, monitoring, and maintaining complex data workflows.
Containerization:
Implement and manage containerized applications using Docker for consistent and scalable environments.
Data Quality and Governance:
Ensure data accuracy, consistency, security, and compliance across all data systems.
Cross-functional Collaboration:
Partner with data scientists, analysts, and business stakeholders to gather requirements and deliver effective data solutions.
To apply for this job, email your details to Akash.rai@nityo.com