Alpharetta GA or Menlo Park CA
We have built job frameworks to run large-scale ETL pipelines with Kafka, Elasticsearch (ELK), Snowflake, and Hadoop.
Our applications run both on-premises and in the cloud. Hundreds of dashboards built for business and operations teams provide real-time insight and actionable items.
We are looking for a streaming data engineer:
– Understand distributed systems architecture, design, and trade-offs.
– Design and develop ETL pipelines with a wide range of technologies.
– Able to work on the full development cycle, including requirements definition, design, implementation, testing, and deployment.
– Strong communication skills to collaborate with various teams.
– Able to learn new technologies and work independently.
Requirements:
– 5 years of application development experience, including at least 2 years of data engineering with Kafka
– Working experience writing and running applications on Linux
– 5 years of coding experience in at least one of the following languages: Python, Ruby, Java, C/C++, Go
– SQL and database experience
Optional:
– AWS or other cloud technologies
– Elasticsearch (ELK)
Contact Information
Email: rakesh.g@bcforward.com