Urgent Job Opening :: ETL Developer (Data Warehouse) :: Denver, CO (onsite from Day 1 / initial 2 months remote)

Hello,

Hope you are doing well!

I am a Technical Recruiter with VITS and have an opportunity for an ETL Developer (Data Warehouse) in Denver, CO (onsite from Day 1). I came across your resume and think your profile would be a great fit for my client.

If you are interested, please send me an updated copy of your resume along with the best time and number to reach you.

Role: ETL Developer (Data Warehouse)
Location: Denver, CO (onsite from Day 1)
Duration: Long Term Contract

Note: We need a resource with a data warehouse background on the database side, not an Informatica/Teradata background.

Skillset / Qualifications:

  • 7+ years of experience in IT
  • 3+ years of experience as a core ETL Developer
  • 3+ years of hands-on experience working with AWS technologies – Redshift, S3, EMR
  • Experience with native AWS technologies for data and analytics, such as Glue and Athena
  • 3+ years of hands-on experience writing complex, highly optimized SQL queries across large data sets
  • Experience in distributed system concepts from a data storage and compute perspective (e.g. data lake architectures).
  • Extensive knowledge of coding languages, including Java, XML, and SQL.
  • Experience building enterprise-scale data warehouse and data lake solutions end-to-end.
  • Knowledgeable about a variety of strategies for ingesting, modeling, processing, and persisting data
  • Business Intelligence architecture, modeling and analysis
  • Strong experience with Python and UNIX shell scripting
  • ETL design, dimensional modeling, and cube design
  • Ability and desire to quickly learn new technologies
  • Sharp logical and analytical skills

Responsibilities

  • Support the operation of the Enterprise Data Warehouse and Data Lake in a global environment
  • Develop and maintain automated ETL pipelines for big data using scripting languages such as Python, Spark, and SQL, and AWS services such as S3, Redshift, Glue, Lambda, SNS, SQS, Kinesis, and CloudWatch
  • Develop and maintain data security and permissions solutions for enterprise-scale data warehouse and data lake implementations, including data encryption, database user access controls, and logging
  • Develop data objects for business analytics using data modeling techniques.
  • Develop and optimize data warehouse and data lake tables using best practices for DDL, physical and logical tables, data partitioning, compression, and parallelization
  • Develop and maintain data warehouse and data lake metadata, data catalog, and user documentation for internal business customers.
  • Help internal business customers develop, troubleshoot, and optimize complex SQL and ETL solutions to solve reporting, metrics, and analytics problems.
  • Work with internal business customers and software development teams to gather and document requirements for data publishing and data consumption via data warehouse, data lake, and analytics solutions
  • Develop, test, and deploy code using internal software development toolsets. This includes the code for deploying infrastructure and solutions for secure data storage, ETL pipelines, data catalog, and data query.
  • Perform necessary system maintenance tasks, debugging, troubleshooting, and performance tuning
  • Create and maintain documentation for the Enterprise Data Warehouse environment
 
Thanks,
Amit

amit@vitsus.com

To unsubscribe from future emails or to update your email preferences, click here.
