Regarding C2C Positions

1. Position: Big Data Architect
   Location: Stamford, CT
   Duration: 6+ months
Job Summary:
Education: Employers commonly require IT certifications or a Bachelor's degree in Computer Science or a related technical discipline, plus extensive work experience (10–12+ years of overall IT experience with Big Data Analytics, Data Warehousing, and Business Intelligence).
Technical skills: To work as a big data architect, applicants must have expertise in Big Data tools and technologies and be experienced in the following areas: Linux; setup and administration of the Hadoop platform; and working knowledge of tools such as Hive, Spark, HBase, Sqoop, Impala, Kafka, Flume, Oozie, MapReduce, etc.
They are also required to have an understanding of scripting languages such as Java, Scala, Python, or Shell Scripting; practical knowledge of the end-to-end design and build process of near-real-time and batch data pipelines; expertise with SQL and data modeling; and experience working in an Agile development process.
Knowledge: In addition to the technical skills above, applicants must have an adept understanding of the various phases of the Software Development Life Cycle and familiarity with source code and version control systems such as SVN and Git. They must also be proficient with Hadoop ecosystem and architecture components.
Ability to work with large data sets: Big Data involves large data sets, so applicants must be able to work with highly diverse data across multiple types, formats, and sources.
Roles & Responsibilities:
Self-starter: They must be able to work with minimal supervision.
Teamwork abilities: The big data architect must be able to work in a team-oriented environment with people of diverse skill sets.
Analytical skills: It is important that big data architects can analyze complex problems using the information provided, understand customer requests, and provide the appropriate solution.
Communication skills: Big data architects are required to engage with clients and stakeholders to understand their objectives for setting up Big Data and to apply those objectives in the architecture. It is therefore essential that they have exceptional communication skills to convey the essence of Big Data to a business and to engage with the various parties necessary for the execution of their job.
Interpersonal skills: It is also important that they can interact with business/technical users as needed to gather the application know-how necessary for design and development using Spark, Spark Streaming, Kafka, and other Hadoop tools.

2. Position: Big Data Sr. Developer/Tech Lead
   Location: Long Beach, CA
   Duration: 6+ months
1. Strong knowledge of the healthcare payer domain
2. Strong knowledge of Big Data technologies: Apache Spark, Scala, Hive, Impala, HUE, Hadoop, and Cloudera
3. Experience in Agile projects would be an added advantage
4. Experience with GitHub / DevOps

The associate would need to perform the role of a technical lead/senior developer at offshore, as well as coordinate with onsite for delivery. They should have strong communication skills (verbal and written) and be able to interpret business requirements quickly and produce the technical design. A strong healthcare background will add value to the profile, especially with Payer, Medicare, and Medicaid data.

Certification(s) Required: Any Big Data certification preferred.

Nityo Infotech Corp.
666 Plainsboro Road, 

Suite 1285

Plainsboro, NJ 08536

Thanks and Regards

Priya Gupta


Associate Recruiter

Fax: 609 799 5746



USA | Canada | India | Singapore | Malaysia | Indonesia | Philippines | Thailand | UK | Australia | New Zealand

Nityo Infotech has been rated as one of the top 500 fastest-growing companies by Inc. 500.


If you feel you received this email by mistake or wish to unsubscribe, kindly reply to this email with "UNSUBSCRIBE" in the subject line.




