Urgent Azure Data Engineer (Databricks, Scala) – Remote only

Contract type: Corp-to-Corp (C2C)

Role: Sr. Azure Data Engineer (Databricks, Scala)

Location: Remote

Duration: 6 months+

Only candidates with 13+ years of experience will be considered.

 

Job description:

  • Ideally 13+ years of total IT experience in software development or engineering, and ideally 3+ years of hands-on experience designing and building scalable data pipelines for large datasets on cloud data platforms
  • Ideally 3+ years of hands-on experience in distributed processing using Databricks, Apache Spark, and Kafka, and in leveraging the Airflow scheduler/executor framework
  • Ideally 2+ years of hands-on programming experience in Scala (must have); Python and Java preferred
  • Ideally 2+ years of hands-on experience streaming data to Azure, including Azure Event Hubs, Azure SQL Edge, and Azure Stream Analytics (see the streaming sketch after this list)
  • Experience with monitoring solutions such as Spark cluster logs, Azure logs, Application Insights, and Grafana to optimize pipelines, plus knowledge of Azure-capable languages such as Scala or Java
  • Proficiency working with large and complex codebases in source control systems such as GitHub/GitLab using the Gitflow workflow, acting as a project committer at both the command line and in IDEs such as IntelliJ or Azure Studio
  • Experience working with Agile development methodologies, delivering within Azure DevOps, and automated testing with tools that support CI and release management
  • Expertise in optimized dataset structures in Parquet and Delta Lake formats, with the ability to design and implement complex transformations between datasets (see the batch sketch after this list)
  • Expertise in optimized Airflow DAGs and branching logic for tasks to implement complex pipelines and outcomes, and expertise in authoring both traditional SQL and NoSQL queries
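
For illustration only, here is a minimal Scala sketch of the kind of streaming pipeline described above: Spark Structured Streaming reading from Azure Event Hubs through its Kafka-compatible endpoint and appending to a Delta Lake table. The namespace, credentials, topic, JSON fields, and paths are hypothetical placeholders, and Delta Lake support is assumed to be available (it is built into Databricks).

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object EventHubsToDelta {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("eventhubs-to-delta") // hypothetical application name
          .getOrCreate()
        import spark.implicits._

        // Azure Event Hubs exposes a Kafka-compatible endpoint, so the standard
        // Kafka source can be used; namespace and connection string are placeholders.
        val raw = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "<namespace>.servicebus.windows.net:9093")
          .option("kafka.security.protocol", "SASL_SSL")
          .option("kafka.sasl.mechanism", "PLAIN")
          .option("kafka.sasl.jaas.config",
            """org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="<event-hubs-connection-string>";""")
          .option("subscribe", "telemetry")      // hypothetical Event Hub name
          .option("startingOffsets", "latest")
          .load()

        // Parse the JSON payload into a few illustrative columns.
        val events = raw
          .selectExpr("CAST(value AS STRING) AS body", "timestamp")
          .select(
            get_json_object($"body", "$.deviceId").as("device_id"),
            get_json_object($"body", "$.reading").cast("double").as("reading"),
            $"timestamp")

        // Append to a Delta table; the checkpoint gives the sink exactly-once semantics.
        events.writeStream
          .format("delta")
          .option("checkpointLocation", "/mnt/checkpoints/telemetry") // placeholder path
          .outputMode("append")
          .start("/mnt/delta/telemetry")                              // placeholder path
          .awaitTermination()
      }
    }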

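A similarly hedged batch sketch for the Parquet-to-Delta transformation requirement: it reads a raw Parquet dataset, derives a date partition column, aggregates, and writes a partitioned Delta table. The paths and column names (order_ts, customer_id, amount) are assumptions for illustration, not details from the role.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object ParquetToDelta {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("parquet-to-delta") // hypothetical application name
          .getOrCreate()

        // Read the raw Parquet dataset (placeholder path).
        val orders = spark.read.parquet("/mnt/raw/orders")

        // Example transformation: derive a date column and aggregate per customer.
        val daily = orders
          .withColumn("order_date", to_date(col("order_ts")))
          .groupBy(col("order_date"), col("customer_id"))
          .agg(sum(col("amount")).as("daily_amount"))

        // Write a Delta table partitioned by date so date-filtered reads can prune files.
        daily.write
          .format("delta")
          .mode("overwrite")
          .partitionBy("order_date")
          .save("/mnt/delta/daily_orders") // placeholder path
      }
    }

Partitioning by order_date lets Spark skip irrelevant files on date-filtered reads, which is the kind of optimized dataset structure the bullet above refers to.
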
To apply for this job, please visit corptocorp.org.