
Skyroot Innovation LLC
What You’ll Do:
As a Data Engineer, you’ll work alongside architects, analysts, and scientists to design efficient data flows, enhance data quality, and ensure secure, reliable pipelines for both real-time and batch processing environments. You will:
- Develop, optimize, and maintain ETL/ELT pipelines for structured and unstructured data
- Use tools like dbt, Informatica, Azure Data Factory, and Databricks for transformation and workflow automation
- Write performant SQL queries and Python/JavaScript scripts for data parsing, ingestion, and cleanup
- Conduct data profiling, linkage, validation, and quality checks across diverse sources
- Enable cloud-based data processing (e.g., AWS S3, Azure Blob) and support API integration
- Collaborate across data science, engineering, and stakeholder teams to deliver accurate and actionable data products
- Maintain technical documentation for pipelines, data structures, and infrastructure
- Support secure data governance practices and performance tuning in modern cloud platforms
Required Qualifications:
- Bachelor’s degree in Computer Science or a related field
- 3+ years of relevant experience in data engineering or data pipeline development
- Experience with SQL, Python, and cloud platforms like Azure, AWS, or GCP
- Familiarity with ETL tools and frameworks such as dbt, Informatica, or Apache Airflow
Preferred Qualifications:
- Experience with modern data warehousing (Snowflake, Redshift) and orchestration tools (e.g., Spark, Airflow)
- Understanding of compliance and security protocols (e.g., HIPAA, CCPA, NIST)
- Strong data modeling skills and ability to translate requirements into scalable data solutions
- Excellent communication skills and ability to work in cross-functional environments
To apply for this job, email your details to hr@skyitinfo.com