Title: Senior/Lead Data Engineer
Location: 100% Remote – Pacific Time hours (8 AM – 5 PM PST)
Job Type: Contract – 12 Months
Key Skills: SQL, PySpark, Azure, Microsoft Fabric, ADF, and Databricks.
About the Role:
We are seeking a highly skilled and experienced Senior/Lead Data Engineer to join our dynamic team at KPI, working on challenging, multi-year data transformation projects. This is an excellent opportunity for a talented data engineer to play a key role in building innovative data solutions using Azure native services and related technologies. If you are passionate about working with large-scale data systems and enjoy solving complex engineering problems, this role is for you.
Key Responsibilities:
- Data Engineering: Design, develop, and implement data pipelines and solutions using PySpark, SQL, and related technologies.
- Collaboration: Work closely with cross-functional teams to understand business requirements and translate them into robust data solutions.
- Data Warehousing: Design and implement data warehousing solutions, ensuring scalability, performance, and reliability.
- Continuous Learning: Stay up to date with modern technologies and trends in data engineering and apply them to improve our data platform.
- Mentorship: Provide guidance and mentorship to junior data engineers, ensuring best practices in coding, design, and development.
Must-Have Skills & Qualifications:
- 10+ years of overall experience in the IT industry.
- 8+ years of experience in data engineering, with a strong background in building large-scale data solutions.
- 4+ years of hands-on experience developing and implementing data pipelines on the Azure stack (Azure, ADF, Databricks, Functions).
- 1-2 years of experience with Microsoft Fabric.
- Proven expertise in SQL for querying, manipulating, and analyzing large datasets.
- Strong knowledge of ETL processes and data warehousing fundamentals.
- Self-motivated and independent, with a “let’s get this done” mindset and the ability to thrive in a fast-paced and dynamic environment.
Good-to-Have Skills:
- Databricks Certification.
- Azure Architect Certification.
- Data Modeling experience.