
Job Post
Job Title: ETL Developer
Location: Chicago, Illinois
Job Description
Project Name:
• FareWorks. Initially through the end of the year, with the possibility to extend (not guaranteed at this moment).
Top 5 Skill Sets:
• 1. Python or PySpark
• 2. Complex SQL development, debugging, and optimization
• 3. AWS Glue and Step Functions
• 4. Knowledge of the inner workings of databases, such as AWS RDS MySQL
• 5. Big Data processing
Nice to have skills or certifications:
• Experience leading a mid-sized ETL team
• Experience with Apache Iceberg
• Experience with observability tools such as Dynatrace or Datadog
Preferred Hours of Work: 8
• Must be flexible to support and troubleshoot issues during odd hours in case of incidents.
• Must be located in Chicago. At this point the team goes to the office every Tue/Wed on alternate weeks, but this may change in the future.
• No travel at this time.
• Interview Process: 3 interviews anticipated.
• Please do not submit candidates who lack extensive experience building ETL jobs or who are not hands-on coders working with large data sets in an enterprise setting.
Job Summary:
• An ETL developer designs, builds, tests, and maintains systems that extract, transform, and load data from multiple different systems.
Primary Responsibilities:
• Leads the design, implementation, deployment, and optimization of backend ETL services.
• Supports a massive-scale enterprise data solution using AWS data and analytics services.
• Analyzes and interprets complex data and related systems, and provides efficient technical solutions.
• Provides support for the ETL schedule and maintains compliance with it.
• Develops and maintains standards for ETL code, and maintains an effective project life cycle for all ETL processes.
• Coordinates with cross-functional teams, including architects, platform engineers, other developers, and product owners, to build data processing procedures.
• Performs root cause analysis on production issues, performs routine monitoring of databases, and provides support for ETL environments.
• Helps create functional specifications and technical designs, working with business process area owners.
• Implements industry best practices for code and configuration across production and non-production environments in a highly automated setting.
• Provides technical advice, effort estimates, and impact analyses.
• Provides timely project status and issue reports to management.
Qualifications:
• 6 years of experience using ETL tools to perform data cleansing, data profiling, transformation, and scheduling of various workflows.
• Expert-level proficiency in writing, debugging, and optimizing SQL.
• 3–4 years of programming experience using Python or PySpark/Glue required.
• Knowledge of common design patterns, models, and architectures used in Big Data processing.
• 3–4 years' experience with AWS services such as Glue, S3, Redshift, Lambda, Step Functions, RDS, Aurora/MySQL, Apache Iceberg, CloudWatch, SNS, SQS, and EventBridge.
• Capable of troubleshooting common database issues; familiar with observability tools.
• A self-starter: responsible, professional, and accountable.
• A finisher, seeing a project or task through to completion despite challenges.
Thanks & Regards
To apply for this job email your details to Suhail.Ahmad@anviktek.com