
Databricks Architect

  • Contract
  • Remote

Droisys

About Company

Droisys is an innovative technology company focused on helping companies accelerate their digital initiatives from strategy and planning through execution. We leverage deep technical expertise, Agile methodologies, and data-driven intelligence to modernize systems of engagement and simplify human/tech interaction. Amazing things happen when we work in environments where everyone feels a true sense of belonging and when candidates have the requisite skills and opportunities to succeed. At Droisys, we invest in our talent and support career growth, and we are always on the lookout for amazing talent who can contribute to our growth by delivering top results for our clients. Join us to challenge yourself and accomplish work that matters.

Role: Databricks Architect

Location: Edison, NJ (Remote)

Pay rate: $75/hr on C2C

 

Key Responsibilities

Design, develop, and maintain data ingestion pipelines using Databricks and Lakeflow Connect

Integrate data from various structured and unstructured sources into Delta Lake and other data storage systems

Implement real-time and batch ingestion workflows to support analytics and reporting needs

Optimize data ingestion performance, ensuring scalability, reliability, and cost efficiency

Collaborate with data architects, analysts, and business stakeholders to define data requirements and ingestion strategies

Ensure data quality, lineage, and governance compliance across the ingestion process

Automate data ingestion monitoring and error-handling mechanisms

Stay up to date with emerging Databricks, Lakehouse, and data integration technologies and best practices

 

Required Qualifications

Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field

4 years of experience in data engineering or ETL development

Hands-on experience with Databricks, SQL, PySpark, and Delta Lake

Proficiency with Lakeflow Connect for building and managing data ingestion workflows

Strong understanding of data integration patterns, data modeling, and data lakehouse architectures

Experience with cloud platforms (Azure, AWS, or GCP) and associated data services

Knowledge of CI/CD, version control (Git), and infrastructure-as-code practices

Familiarity with data governance, security, and compliance standards

Preferred Skills

Experience with streaming technologies (Kafka, Event Hubs, etc.)

Knowledge of REST APIs and connector-based ingestion

Exposure to machine learning data pipelines in Databricks

Strong problem-solving, communication, and collaboration skills


Droisys is an equal opportunity employer. We do not discriminate based on race, religion, color, national origin, gender, gender expression, sexual orientation, age, marital status, veteran status, disability status, or any other characteristic protected by law. Droisys believes in diversity, inclusion, and belonging, and we are committed to fostering a diverse work environment.

 

To apply for this job, email your details to nikita.b@droisys.com.
