Job Description:
1) Experience with Hadoop.
2) Good understanding of the Hadoop ecosystem and YARN architecture.
3) Experience writing high-performance Hive queries.
4) Hands-on experience with Spark using Python/Scala.
5) Hands-on experience loading and manipulating large data sets using Spark SQL and Hive.
6) Knowledge of debugging and troubleshooting Hadoop jobs.
7) Good communication and client-interfacing skills; able to prepare implementation plans as needed and build the in-scope applications in Big Data technologies.
8) Responsible for all technical delivery of the project.
9) Good understanding of Agile/DevOps methodology; good communication and soft skills.
10) Prior experience with US customers is nice to have.
11) Should have worked in an offshore delivery model.
12) Should be strong in Unix and SQL.