Big Data Developer:

- BS/BA degree or equivalent experience
- Advanced knowledge of application, data, and infrastructure architecture disciplines
- Understanding of architecture and design across all systems
- Working proficiency in development toolsets
- Knowledge of industry-wide technology trends and best practices
- Ability to work in large, collaborative teams to achieve organizational goals
- Passion for building an innovative culture
- Proficiency in one or more modern programming languages
- Understanding of software practices such as business analysis, development, maintenance, and software improvement
- 4+ years of software design and application development experience using an OOP language (C#, C++, Java, etc.)
- 2+ years of experience with any of the following: data orchestration stacks, ingestion frameworks, or cloud-native patterns (Apache Airflow, Kafka, Nomad/Terraform, Kubernetes/Docker, etc.)
- 2+ years of experience with scripting languages such as Python, Bash, or PowerShell
- Hands-on experience with an open-source distributed ingestion/processing stack such as Hadoop, Spark, or Kafka is a must
- Solid experience working on a data engineering team, with demonstrated best practices for building robust data controls and governance
- Experience with infrastructure automation technologies like Docker and Kubernetes is a big plus
- Experience building APIs and services using REST
- Experience designing and developing scalable data solutions on cloud infrastructure such as AWS, Azure, or GCP
- Ability to manage multiple tasks and thrive in a fast-paced team environment
- Strong written and verbal communication skills
- Working knowledge of Agile and Scrum practices