Big Data Hadoop Developer
Raleigh, NC
Contract
Job Description:
- 3 to 5 years of Big Data experience.
- Extensive experience with Spark and Scala, including performance tuning, is a MUST.
- Experience building a data lake using Big Data ecosystems, ingesting huge volumes of data, and performance tuning.
- Good experience in Hive SQL development and performance tuning.
- Management of a Hadoop cluster with all included services, preferably Hortonworks.
- Experience handling unstructured data formats when ingesting into the platform and exposing them for consumption.
- Experience with Azure Databricks/Delta and Data Factory is a huge plus.
- Proficiency in Unix scripting; data warehouse knowledge and an ETL background preferred.
This is a corp-to-corp opportunity. Please share an updated resume as soon as possible. Thanks!