Sr. AWS Data Architect with Informatica MDM and IICS experience in Chicago, IL (Onsite/Hybrid; local candidates only), 12+ month contract.

Hi All,
    We are working on the following role from our direct client. 
Role: Sr. AWS Data Architect (Informatica MDM (Master Data Management) and IICS (Informatica Cloud), along with Snowflake and AWS pipeline skills)
Location: Chicago, IL (Onsite/Hybrid; local candidates only)
Duration: 12+ month contract.

As a Data Architect, you will lead the creation of the strategic enterprise data architecture for Hyatt. You will partner with internal stakeholders to define the principles, standards, and guidelines regarding data flows, data aggregation, data migration, data curation, data models, data consumption, and data placement; provide expertise on data architecture in critical programs, data strategy, and data quality remediation activities; and validate data architecture for adherence to defined policies, standards, and guidelines, including regulatory directives.

You will be a part of a ground-floor, hands-on, highly visible team which is positioned for growth and is highly collaborative and passionate about data.

The ideal candidate builds fantastic relationships across all levels of the organization and is recognized as a problem solver who elevates the work of everyone around them.

– Provides expert guidance to projects to ensure that their processes and deliverables align with the Hyatt target state architecture.

– Defines & develops enterprise data architecture concepts and standards leveraging leading architecture practices and advanced data technologies.

– Gathers requirements from business stakeholders; domain: agile teamwork, people data, and project portfolio work hierarchies

– Ability to write requirements for ETL and BI developers

– Ability to write designs for data warehouse, data lake, or end-to-end pipeline architectures

– Expertise in data architecture principles and distributed computing know-how

– Intake prioritization, cost/benefit analysis, and decision-making on what to pursue across a wide base of users/stakeholders and across products, databases, and services

– Design or approve data models that provide a full view of what the Hyatt technology teams are working on and the business impact they are having.

– End-to-end data pipeline design, security review, architecture, and deployment oversight

– Automate reporting views used by management and executives to decide where to invest the organization's time and resources and to stay up to date on key company initiatives and products

– Create self-service reporting, including a data lake for Hyatt's internal projects and resources

– Design comprehensive data quality management tooling.

The ideal candidate demonstrates a commitment to Hyatt core values: respect, integrity, humility, empathy, creativity, and fun.

 

Qualifications:

– Strong experience with Informatica MDM (Master Data Management) and IICS (Informatica Cloud), along with Snowflake and AWS pipeline skills

– 6+ years of experience in data engineering or related technical work, including business intelligence and analytics

– 4+ years of experience architecting commercial-scale data pipelines

– Experience and comfort solving problems in an ambiguous environment with constant change; the tenacity to thrive in a dynamic, fast-paced environment, inspire change, and collaborate with a variety of individuals and organizational partners

– Experience designing and building scalable and robust data pipelines to enable data-driven decisions for the business

– Exposure to Amazon AWS or another cloud provider

– Experience with Business Intelligence tools such as Tableau, ThoughtSpot, Power BI, and/or Looker

– Familiarity with data warehousing platforms and data pipeline tools such as Redshift, Snowflake, SQL Server, etc.

– Passionate about programming and learning new technologies; focused on helping yourself and the team improve skills

– Effective problem-solving and analytical skills; ability to manage multiple projects and reports simultaneously across different stakeholders

– Rigorous attention to detail and accuracy

– Aware of and motivated by driving business value

– Experience with large-scale enterprise applications using open-source big data solutions such as Spark, Kafka, Elasticsearch/Solr, Hadoop, and HBase

– Experience with or knowledge of basic programming and database technologies (SQL, Python, Cassandra, PostgreSQL, AWS Aurora, AWS RDS, MongoDB, Redis, Couchbase, Oracle, MySQL, Teradata)

– Bachelor’s degree in Engineering, Computer Science, Statistics, Economics, Mathematics, Finance, or a related quantitative field

– An advanced CS degree is a plus

Kind Regards, 

 

Ashwa Mohan
Delivery Lead - US Process (Recruitment)

 

 

 

An ISO 9001:2015 Certified Company and Proud Member of NASSCOM

 

Phone: 919 439 3682 extn 9014

Email:  mohan@infomericainc.com

 

Our Locations: India | USA | UK

www.infomericainc.com

 


 

 

This e-mail, and any attachment, is confidential. If you are not the intended recipient, please delete it from your system, do not use or disclose the information in any way, and notify the sender immediately. Any views expressed in this message are those of the individual sender and may not be the views of Infomerica, unless specifically stated. No warranty is made that the e-mail or attachment(s) are free from computer viruses or other defects.
