GCP Data Architect - Dearborn, MI
Hello,
Position: GCP Data Architect - Hadoop
Location: Dearborn, MI (Day 1 onsite)
- Good to have a GCP certification (either Professional Data Engineer or Professional Cloud Architect)
- 15+ years of experience architecting data projects, with knowledge of multiple Hadoop/Hive/Spark/ML implementations
- 5+ years of experience in data modeling and in data warehouse and data lake implementation
- Working experience implementing Hadoop-to-GCS and Hive-to-BigQuery migration projects
- Ability to identify and gather requirements to define a solution to be built and operated on GCP, and to perform high-level and low-level design for the GCP platform
- Ability to implement and provide GCP operations and deployment guidance and best practices throughout the lifecycle of a project
- GCP technology areas: Datastore, BigQuery, Cloud Storage, Persistent Disk, IAM, roles, projects, and organizations
- Databases including Bigtable, Cloud SQL, Cloud Spanner, and Memorystore; data analytics with Dataflow, Dataproc, and Cloud Pub/Sub; Kubernetes and Docker, including managing containers, container autoscaling, and container security
- Experience in the design, deployment, configuration, and integration of application infrastructure resources, including GKE clusters, Anthos, Apigee, and DevOps platforms
- Application development concepts and technologies (e.g., CI/CD, Java, Python)
Pavnit Randhava | Recruiter | cyberThink | 908.666.0622 | Prandhava@cyberthink.com
Leader in Staffing and IT Services since 1996