
Data Platform Developer
Job Title: Data Platform Developer
Location: Cranberry, PA
Duration: 6+ months, contract-to-hire
Interview mode: Phone and Skype (local candidates only)
Pay rate: $60/hr C2C (max)
Visa: USC or GC only
Experience level: 12+ years
Location: Cranberry, PA (local candidates only); hybrid schedule, 2-3 days per week on site.
Description:
Omnicell is the world leader in pharmacy robotics, and we’re expanding beyond inventory management into inventory analytics. The OmniSphere helps hospitals and health systems understand how meds flow through their business, from the loading dock to the nurse’s glove, and then apply clinical expertise and advanced machine learning to uncover opportunities to adjust that flow to improve safety, cost, efficiency, and patient outcomes. And the next step for us is to help busy clinicians act on those opportunities by building efficient, industry-leading workflows.
To do that, we take terabytes of data from thousands of devices and translate them into simple, actionable steps our clients can take to improve their overall performance. This is achieved through a sleek new microservices architecture primarily composed of Kafka, Spark, PostgreSQL, .NET Core, and Angular, all running in AWS.
Responsibilities:
· Translate business requirements into effective technology solutions.
· Identify and establish technology choices (along with data architects) to enhance the data platform.
· Set coding and design standards
· Help lead the design, architecture and development of the Omnicell Data Platform
· Conduct design and code reviews
· Resolve defects/bugs during QA testing, pre-production, production, and post-release patches
· Analyze and improve efficiency, scalability, and stability of various system resources once deployed
· Provide technical leadership to agile teams – onshore and offshore: Mentor junior engineers and new team members, and apply technical expertise to challenging programming and design problems
· Help define the technology roadmap that will support the product development roadmap
· Continue to improve code quality by tracking, reducing and avoiding technical debt
· Focus on always putting the customer first.
Required Knowledge and Skills:
· Deep experience developing distributed, scalable systems and high-volume transaction applications, including participation in architecting big data projects
· Hands-on programming experience in Scala, Python, and other object-oriented programming languages.
· Expert in big data technologies such as Apache Kafka, Apache Spark, real-time streaming, Structured Streaming, and Delta Lake.
· Excellent analytical and problem-solving skills.
· Energetic, motivated self-starter who is eager to excel, with excellent interpersonal skills.
· Expert at balancing the drive for the right architecture against the realities of serving customers and the need to ship software
To apply for this job, email your details to aakanksha@rconsultinginc.com