
Urgent need: Data Engineer (C2C) in Florida


Data Engineer

Orlando, Florida (local candidates only)

3-6 months

Contract

Experience in the financial domain is a must-have.

 

Strong SQL and Snowflake expertise, including performance tuning and data modeling.

Proficient in Python for scripting, automation, and working with REST APIs.

Experience with Apache Airflow for orchestration and workflow monitoring.

Hands-on with dbt for modular, version-controlled data transformations.

Solid experience with AWS services (e.g., S3, Lambda, IAM, CloudWatch) in data engineering workflows.

Experience integrating and processing data from REST APIs (an illustrative sketch follows this list).

Understanding of data quality, governance, and cloud-native troubleshooting.
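
For illustration only (none of these names come from the posting): a minimal Python sketch of pulling records from a REST API and landing them in S3, the kind of ingestion described above. The endpoint, bucket, and key layout are hypothetical.

import json
from datetime import datetime, timezone

import boto3
import requests

API_URL = "https://api.example.com/v1/transactions"  # hypothetical endpoint
BUCKET = "my-raw-data-bucket"                        # hypothetical bucket

def ingest():
    resp = requests.get(API_URL, params={"page_size": 500}, timeout=30)
    resp.raise_for_status()
    records = resp.json()

    # Partition raw landings by load date so downstream jobs can prune by prefix.
    stamp = datetime.now(timezone.utc).strftime("%Y-%m-%d")
    key = f"raw/transactions/load_date={stamp}/part-0001.json"

    boto3.client("s3").put_object(
        Bucket=BUCKET,
        Key=key,
        Body=json.dumps(records).encode("utf-8"),
    )

if __name__ == "__main__":
    ingest()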

 

 

Must-haves:

10+ years of experience

Great communicator; client-facing role

Individual contributor

100% hands-on with the skills listed below

dbt Proficiency:

Model development:

Experience creating complex dbt models, including incremental models, snapshots, and documentation; ability to write and maintain dbt macros for reusable code (a sketch of an incremental model follows)
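
dbt models are usually SQL plus Jinja; to keep all examples here in one language, below is a minimal sketch of the incremental pattern written as a dbt Python model (supported on adapters such as Snowflake via Snowpark). The model, table, and column names are hypothetical.

# models/fct_orders.py -- minimal sketch of an incremental dbt Python model.
# "stg_orders", "order_id", and "updated_at" are hypothetical names.
import snowflake.snowpark.functions as F

def model(dbt, session):
    dbt.config(materialized="incremental", unique_key="order_id")

    orders = dbt.ref("stg_orders")  # upstream staging model

    if dbt.is_incremental:
        # On incremental runs, keep only rows newer than the current high-water mark.
        high_water = (
            session.table(str(dbt.this))
            .select(F.max("updated_at"))
            .collect()[0][0]
        )
        if high_water is not None:
            orders = orders.filter(orders["updated_at"] > high_water)

    return orders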

Testing and documentation:

Proficiency in implementing dbt tests for data validation and quality checks

Familiarity with generating and maintaining documentation using dbt's built-in features

Version control:

Experience managing dbt projects with Git, including implementing CI/CD processes from scratch

 

 

AWS Expertise:

Data storage solutions:

In-depth understanding of AWS S3 for data storage, including best practices for organization and security (see the sketch after this subsection)

Experience with AWS Redshift for data warehousing and performance optimization
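
As referenced above, a minimal boto3 sketch of an S3 write following two common best practices, date-partitioned prefixes and server-side encryption; the bucket, key, and payload are hypothetical.

import boto3

s3 = boto3.client("s3")

# Date-partitioned prefixes let downstream engines (Athena, Spark) prune data,
# and KMS server-side encryption keeps objects encrypted at rest.
s3.put_object(
    Bucket="my-analytics-lake",
    Key="curated/finance/trades/dt=2024-01-15/part-0001.csv",
    Body=b"trade_id,amount\n1,100.00\n",
    ServerSideEncryption="aws:kms",
)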

Data Integration:

Familiarity with AWS Glue for ETL processes and orchestration (nice to have)

Experience with AWS Lambda for serverless data processing tasks (a sketch follows)
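
A minimal sketch of a Python Lambda handler for serverless processing, assuming an S3 object-created trigger; the row-count check is illustrative.

import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    # S3 "ObjectCreated" events arrive as a list of records.
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        # Lightweight validation before any downstream load: count data rows.
        rows = body.decode("utf-8").strip().splitlines()
        print(f"{key}: {max(len(rows) - 1, 0)} data rows")
    return {"status": "ok"}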

Workflow Orchestration:

Proficiency in using Apache Airflow on AWS to design, schedule, and monitor complex data flows

Ability to integrate Airflow with AWS services and dbt models, such as triggering a dbt run or an EMR job, or reading from S3 and writing to Redshift (a sketch of such a DAG follows)
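
A minimal Airflow 2.x DAG sketch of the integration described above: an S3-to-Redshift load followed by a dbt run. Connection IDs, paths, bucket, and model names are hypothetical.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.amazon.aws.transfers.s3_to_redshift import S3ToRedshiftOperator

with DAG(
    dag_id="daily_finance_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # "schedule_interval" on Airflow < 2.4
    catchup=False,
) as dag:
    load_to_redshift = S3ToRedshiftOperator(
        task_id="load_trades",
        schema="raw",
        table="trades",
        s3_bucket="my-analytics-lake",
        s3_key="curated/finance/trades/",
        copy_options=["FORMAT AS CSV", "IGNOREHEADER 1"],
        redshift_conn_id="redshift_default",
        aws_conn_id="aws_default",
    )

    run_dbt = BashOperator(
        task_id="run_dbt_models",
        bash_command="cd /opt/dbt_project && dbt run --select fct_orders+",
    )

    load_to_redshift >> run_dbt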

Data lakes and data warehousing:

Understanding of the architecture of data lakes vs. data warehouses and when to use each

Experience with Amazon Athena for querying data directly in S3 using SQL (a sketch follows)
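
A minimal boto3 sketch of querying S3-resident data with Athena and polling for completion; the database, table, and results bucket are hypothetical.

import time

import boto3

athena = boto3.client("athena")

qid = athena.start_query_execution(
    QueryString="SELECT dt, COUNT(*) AS trades FROM finance.trades GROUP BY dt",
    QueryExecutionContext={"Database": "finance"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    # The first returned row is the header row.
    for row in athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])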

Monitoring and Logging:

Familiarity with AWS CloudWatch for monitoring pipelines and setting up alerts for workflow failures (a sketch follows)
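
A minimal boto3 sketch of a CloudWatch alarm on workflow failures, assuming the pipeline step runs as a Lambda function; the function name and SNS topic ARN are hypothetical.

import boto3

cloudwatch = boto3.client("cloudwatch")

cloudwatch.put_metric_alarm(
    AlarmName="finance-pipeline-lambda-errors",
    Namespace="AWS/Lambda",
    MetricName="Errors",
    Dimensions=[{"Name": "FunctionName", "Value": "process-trades"}],
    Statistic="Sum",
    Period=300,                # evaluate in 5-minute windows
    EvaluationPeriods=1,
    Threshold=1,
    ComparisonOperator="GreaterThanOrEqualToThreshold",
    TreatMissingData="notBreaching",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:pipeline-alerts"],
)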

Cloud Security:

Knowledge of AWS security best practices, including IAM roles, encryption, and dbt profile access configurations

 

Programming Skills:

 

 

Python:

Proficiency in pandas and NumPy for data analysis and manipulation (see the sketch at the end of this subsection)

Ability to write scripts for automating ETL processes and scheduling jobs using Airflow

Experience creating custom dbt macros using Jinja and Python, allowing for reusable components within dbt models

Knowledge of how to implement conditional logic in dbt through Python
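
As referenced above, a minimal pandas/NumPy sketch of the kind of transform an Airflow-scheduled ETL task might call; the columns and threshold are made up.

import numpy as np
import pandas as pd

def transform_trades(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    out["trade_date"] = pd.to_datetime(out["trade_date"])
    # NumPy vectorized conditional: flag large trades without a Python loop.
    out["is_large"] = np.where(out["amount"] >= 1_000_000, True, False)
    # Basic quality check before loading downstream: drop incomplete rows.
    return out.dropna(subset=["trade_id", "amount"])

if __name__ == "__main__":
    sample = pd.DataFrame({
        "trade_id": [1, 2],
        "trade_date": ["2024-01-15", "2024-01-16"],
        "amount": [250_000.0, 2_500_000.0],
    })
    print(transform_trades(sample))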

SQL:

Advanced SQL skills, including complex joins, window functions, CTEs, and subqueries (a runnable sketch follows this subsection)

Experience optimizing SQL queries for performance
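
As referenced above, a runnable sketch of the SQL features listed (a CTE plus a window function), using Python's built-in sqlite3 so it can be tried locally (window functions need SQLite 3.25+); the table and data are made up.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE trades (account TEXT, dt TEXT, amount REAL);
    INSERT INTO trades VALUES
        ('A', '2024-01-01', 100), ('A', '2024-01-02', 300),
        ('B', '2024-01-01', 200), ('B', '2024-01-02', 50);
""")

query = """
WITH daily AS (                              -- CTE
    SELECT account, dt, SUM(amount) AS total
    FROM trades
    GROUP BY account, dt
)
SELECT account, dt, total,
       SUM(total) OVER (                     -- window function: running total
           PARTITION BY account ORDER BY dt
       ) AS running_total
FROM daily
ORDER BY account, dt;
"""

for row in conn.execute(query):
    print(row)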

 

To apply for this job, email your details to Munesh@cysphere.net.