Role: GCP Solution Engineer or GCP Data Engineer
Location: 100% Remote (Dallas, TX)
Job Description:
- 7+ years of hands-on Data Platform and Data Application experience
- Experience with Big Data (Hadoop) and Cloud solutions
- Experience working in large multi-tenant environments
- GCP work experience preferred, with strong hands-on knowledge of GCP Security and access patterns
- End-to-end use case implementation, from ingestion to consumption
- Hands-on with a wide variety of use cases such as ETL, Data Hubs, Data Warehousing, Data Lakes, Business Intelligence Analytics, Data Science Analytics, Operational Apps and APIs, and Semantic and Reporting layers
- Good understanding of enterprise-grade Layered Data Architectures
- Experience solving business problems with both conceptual and detailed technical solutions
- Act as a liaison between the project team and the business
- Healthcare and multi-line-of-business use cases preferred
Deliverables:
- Review current project solutions, document the proposed solution according to platform standards, review it with the involved groups, and help engineering teams implement the solution end to end with low-level technical recommendations and code reviews.
- Document existing and new solution patterns.
- Revise the proposed solution with Solution Architecture, Project Management, Business, Engineering, Data Security, and Data Governance teams, and collect approvals.
- Support any post-approval solution changes until the project is successful in production.
- Work on multiple (4-5) projects at the same time.
- Actively contribute to platform and process improvements.
Primary Must-Have Skills:
- Data Engineering stack such as Spark and Kafka; Cloud Ops on AWS, Azure, or GCP