Urgent requirement: Big Data Solutions Architect Consultant (Cloudera Certification a big plus)

Hi,

Title: Big Data Solutions Architect Consultant
Duration: 6+ Months
Location: NYC
Rate: DOE

“Must have strong hands-on operations experience.”
Cloudera certification is a big plus.

Must have done platform implementations and Big Data setup. Not looking for a Data Scientist.

Background – Skills & Experience:
 
• Bachelor’s Degree required, preferably in Computer Science or a technology-related field.
• Five (5) to ten (10) years of relevant work experience.
• Strong written and verbal communication skills.
• Strong reporting and presentation skills.
• In-depth knowledge of Kafka, object-oriented programming (“OOP”) experience (Java preferred), and development using Spark (see the sketch after this list).
• Experience with other big data solutions such as Cloudera, StreamSets or NiFi, and Hortonworks.
• In-depth knowledge of working on JSON-based custom applications.
• Hadoop Distributed File System (“HDFS”) management and troubleshooting experience for large-scale enterprises.
• Experience with modern application infrastructure methodologies such as Ansible and Kubernetes deployment.
• Experience with Continuous Integration / Continuous Delivery (“CI/CD”) workflows and Git source code management.
• Experience with modern orchestration platforms supporting CI/CD workflows, such as ADO, Apache Airflow, and Oozie.
• Technology stack experience with StreamSets, Airflow, NiFi, HDFS, Apache Kafka, Apache Sentry, Apache Hive, YARN, ZooKeeper, and Elastic (e.g., the ELK stack).
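To illustrate the Kafka/Spark development experience above, here is a minimal Java sketch (not part of the role description) of a Spark Structured Streaming job that reads a Kafka topic and lands it on HDFS; the class name, broker addresses, topic, and HDFS paths are placeholders, not details from this posting.

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;

public class KafkaToHdfsSketch {
    public static void main(String[] args) throws Exception {
        // In a Kerberized cluster the principal/keytab would normally be supplied
        // via spark-submit options rather than set in code.
        SparkSession spark = SparkSession.builder()
                .appName("kafka-to-hdfs-sketch")
                .getOrCreate();

        // Read the Kafka topic as a streaming Dataset (brokers and topic are placeholders).
        Dataset<Row> events = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "broker1:9092,broker2:9092")
                .option("subscribe", "ingest-topic")
                .load();

        // Kafka delivers key/value as binary; cast the value to a string for downstream parsing.
        Dataset<Row> values = events.selectExpr("CAST(value AS STRING) AS json_value");

        // Land the stream on HDFS as Parquet with checkpointing (paths are placeholders).
        StreamingQuery query = values.writeStream()
                .format("parquet")
                .option("path", "hdfs:///data/lake/raw/events")
                .option("checkpointLocation", "hdfs:///data/lake/checkpoints/events")
                .start();

        query.awaitTermination();
    }
}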
 
Responsibilities & Tasks:
 
• Perform cluster operations and management across big data enrichment and transformation components such as StreamSets, Cloudera managed services, and Elasticsearch.
• Perform cluster operations integrating with infrastructure and network components such as RHEL, Ansible YAML files, F5, and firewall (FW) components.
• Fine-tune and optimize the data pipelines and their integrated components across the data lake.
• Fine-tune integration components and support custom applications developed to integrate with the StreamSets data pipelines across the data lake.
• Manage and operate Cloudera Manager and StreamSets, which are Kerberos-integrated with Active Directory and deployed across multiple data centers (see the client configuration sketch after this list).
• Upgrade data transformation platforms such as StreamSets Control Hub, Data Collectors, Cloudera Manager, and the various Kafka and Hadoop clusters.
• Collaborate with developers and technical stakeholders, assisting with onboarding to the data lake.
• Provide guidance and assist in standardizing best practices across the data transformation projects.
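For context on the Kerberos/Active Directory integration mentioned above, the following is an assumed Java sketch of the SASL/GSSAPI client settings a Kafka consumer typically needs on such a cluster; the broker address, group id, topic, keytab path, and principal are placeholders, not values from this posting.

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class KerberizedConsumerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092");  // placeholder broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "ops-smoke-test");         // placeholder group id
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        // Kerberos (SASL/GSSAPI) settings typical of an AD-integrated cluster.
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "GSSAPI");
        props.put("sasl.kerberos.service.name", "kafka");
        props.put("sasl.jaas.config",
                "com.sun.security.auth.module.Krb5LoginModule required "
                + "useKeyTab=true storeKey=true "
                + "keyTab=\"/etc/security/keytabs/app.keytab\" "   // placeholder keytab path
                + "principal=\"app@EXAMPLE.COM\";");               // placeholder principal

        // Consume a few records as a connectivity check.
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("ingest-topic"));   // placeholder topic
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("offset=%d key=%s%n", record.offset(), record.key());
            }
        }
    }
}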
