Location – Dallas, Texas (Onsite Role; local candidates only; onsite interview required)
Visa – USC, GC, GC EAD, H4 EAD, H1B
Duration – 12+ Months contract
• 4–5 days onsite per week
• A solid command of SQL is critical; GCP questions are minimal
Job Description:
The hiring leader is looking for someone with experience in Google BigQuery, GCP, ETL pipelines, BI, cloud skills, and Microsoft SQL, with experience in both building and designing solutions.
The requirements are below:
We are looking for a Data Engineer to collaborate in an agile team of peers developing a cloud-based analytics platform that integrates data from a broad range of systems to enable next-gen analytical products.
The Data Engineering Google Cloud Platform (GCP) Engineer is responsible for developing and delivering effective cloud solutions for different business units, and will collaborate with cross-functional teams to design, implement, and manage scalable and reliable cloud solutions.
Our future colleague will:
· Contribute to a multiyear data analytics modernization roadmap for the bank.
· Work directly on a platform based on Google BigQuery and other GCP services to integrate new data sources and model the data up to the serving layer.
· Contribute to a unique opportunity, as the program is set up to completely rethink reporting and analytics with cloud technology.
· Collaborate with different business groups and users to understand their business requirements, and design and deliver the GCP architecture and data engineering scope of work.
· Work on a large-scale data transformation program with the goal of establishing a scalable, efficient, and future-proof data & analytics platform.
· Develop and implement cloud strategies, best practices, and standards to ensure efficient and effective cloud utilization.
· Work with cross-functional teams to design, implement, and manage scalable and reliable cloud solutions on GCP.
Qualifications
What will help you succeed:
· Bachelor's degree in Computer Science/IT
· Master's in Data Analytics/Information Technology/Management Information Systems (preferred)
· Strong understanding of data fundamentals, knowledge of data engineering, and familiarity with core cloud concepts
· Must have solid implementation experience with GCP data storage and processing services such as BigQuery, Dataflow, Bigtable, Dataform, Data Fusion, Cloud Spanner, and Cloud SQL
· Must have programming experience with SQL, Python, and Apache Spark
· At least 7–8 years of professional experience building data engineering capabilities across analytics portfolios, with at least 2–3 years on GCP/cloud-based platforms
Your expertise in one or more of the following areas is highly valued:
· Google Cloud Platform, ideally with Google BigQuery, Cloud Composer, Cloud Data Fusion, Cloud Spanner, and Cloud SQL
· Experience with legacy data warehouses (on SQL Server or any relational data warehouse platform)
· Experience with our main tools: dbt, Terraform/Terragrunt, and Git (CI/CD)
· Experience with a testing framework
· Experience with Business Intelligence tools such as Power BI and/or Looker
What sets you apart:
· Experience with complex migrations from legacy data warehousing solutions or on-prem data lakes to GCP
· Experience building generic, reusable capabilities, and an understanding of data governance and quality frameworks
· Experience in building real-time ingestion and processing frameworks on GCP.
· Adaptability to learn new technologies and products as the job demands
· Multi-cloud & hybrid cloud experience
· Any cloud certification (preference for GCP certifications)
· Experience working in the financial and banking industry
Thanks & Regards,
Abhishek Bhartiya | Sr. Technical Recruiter