Hiring: Senior Data Engineer in Fremont, CA
Hope you are doing well.
I am Pavan Kumar, a Technical Recruiter at Tanu Infotech Inc. We are a global staff augmentation firm providing a wide range of talent on-demand and total workforce solutions.
I have an opening for a Senior Data Engineer in Fremont, CA with our client, and we found your resume to be a great match for this role. If you are open to discussing this role in detail, feel free to reach me at 860-697-7343 with your best available time for a quick discussion.
Role: Senior Data Engineer
Location: Fremont, CA
Duration: 6+ months (Extendable)
Client: Infostretch / Lucid Motivative
We are seeking a Data Engineer with a minimum of 8 years of experience. Please note that the role will start remote for 3 to 6 months, but once COVID-19 is behind us it will be an on-site role.
Candidates must have solid experience working on development projects and on enterprise systems with many interfaces.
Must have hands-on experience designing and developing streaming and IoT data pipelines. Be part of a group that will be building at scale and moving fast, working with a very talented team of engineers and collaborating with the brightest minds in the automotive industry.
Requirements:
· 6–8+ years of experience in Data Engineering and Business Intelligence
· Proficient in IoT tools such as MQTT, Kafka, and Spark
· Proficient with AWS, S3, and Redshift
· Experience with Presto and Parquet/ORC
· Proficient with Apache Spark and DataFrames
· Experience with containerization, including Docker and Kubernetes, preferred
· Expert in tools such as Apache Spark, Apache Airflow, and Presto
· Expert in designing and implementing reliable, scalable, and performant distributed systems and data pipelines
· Extensive programming and software engineering experience, especially in Java and Python
· Experience with columnar databases such as Redshift and Vertica
· Great verbal and written communication skills
· Bachelor's or Master's degree in Software Engineering or Computer Science
Responsibilities:
· Hands-on design and development of streaming and IoT data pipelines
· Developing streaming pipelines using MQTT, Kafka, and Spark Structured Streaming
· Orchestrating and monitoring pipelines using Prometheus and Kubernetes
· Deploying and maintaining streaming jobs using CI/CD and related tools
· Python scripting for automation and application development
· Designing and implementing Apache Airflow and other dependency-enforcement and scheduling tools
· Hands-on data modeling and data warehousing
· Deploying solutions using AWS, S3, Redshift, and Docker/Kubernetes
· Developing storage and retrieval systems using Presto and Parquet/ORC
· Scripting with Apache Spark and DataFrames
Thanks & Regards,
Pavan Kumar
Tanu Infotech Inc