BI DATA GOVERNANCE ARCHITECT / MODELER AVAILABLE FOR CORP-TO-CORP REQUIREMENTS.

 

 

(THIS IS OUR CONSULTANT'S RESUME, NOT A JOB REQUIREMENT; PLEASE DO NOT SEND US RESUMES IN RESPONSE.)

 

 

Hi Friends,

 

Hope you are doing stupendously!

 

 

Please find enclosed our DATA GOVERNANCE ARCHITECT / MODELER consultant's resume.

 

Please let us know if you have any requirements for him.

 

SUMMARY

  • 11+ years of Business Intelligence and Information Technology experience, with deep expertise as a Data Architect, Data Governance lead, Data Modeler, and Business Objects developer.
  • Experience leading a Data Governance team in creating standards and policies for the four pillars of Data Governance.
  • Experience creating and designing data architecture for projects of various sizes.
  • Experienced in developing conceptual and logical models and physical database designs for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) systems using Erwin and ER/Studio.
  • Strong understanding of data warehouse and Data Vault concepts and key-value pair databases.
  • Implemented big data solutions for the Risk, Finance, and Fraud teams using Hadoop and Hive.
  • Developed data models and data marts for Mortgage, Retail Banking, and Consumer Banking.
  • Ingestion and migration of relational databases into the Hadoop data ecosystem.
  • Well versed in normalization/denormalization techniques for optimal performance in relational database environments.
  • Experienced in generating and documenting metadata while designing OLTP and OLAP system environments.
  • Experience in creating models for databases such as Oracle, Teradata, SQL Server, and Cassandra.
  • Experience in creating models for NoSQL and big data platforms such as Hadoop/Hive, Kafka, and cloud data stores.
  • Experience developing entity-relationship diagrams and modeling transactional databases and data warehouses using tools such as Erwin and ER/Studio.
  • Skilled in data warehouse loads, determining hierarchies, and building logic to handle Slowly Changing Dimensions (see the sketch after this list).
  • Efficient in analyzing and documenting business requirements documents (BRDs) and functional requirements documents (FRDs), along with use case modeling and UML.
  • Proficient in Oracle tools and utilities such as TOAD, SQL*Plus, and SQL Developer.
  • Experience in design review and validation of final data models.
  • Knowledge of data analytics tools such as Alteryx.
  • Experienced in developing and designing enterprise architecture using techniques such as relational data modeling (ERD), logical data modeling with normalization to third normal form, object-oriented data modeling, and designing data marts using dimensional models (star and snowflake schemas).
  • Seasoned Business Objects (XI 3.1, XI R2) consultant with experience in Designer, InfoView, Web Intelligence, Desktop Intelligence, and Query as a Web Service.
  • Designed universes using operational data sources such as MS Access.
  • Involved in gathering, analyzing, and documenting business requirements, functional requirements, and data specifications for Business Objects universes and reports; documented the SDLC process definitions.
  • Developed Xcelsius dashboards using Query as a Web Service (QAAWS) and Live Office components of Business Objects.
  • Good understanding of Agile and Waterfall methodologies.
  • Dedicated team player with the ability to work independently.
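
As a brief illustration of the Slowly Changing Dimension handling referenced above, the sketch below shows a minimal SCD Type 2 pattern in generic SQL. The table and column names (dim_customer, stg_customer, address) are hypothetical and used only for illustration.

-- Minimal SCD Type 2 sketch (hypothetical names, generic SQL).
-- Step 1: expire the current dimension row when a tracked attribute changes.
UPDATE dim_customer d
SET    effective_end_date = CURRENT_DATE,
       is_current         = 'N'
WHERE  d.is_current = 'Y'
  AND EXISTS (
        SELECT 1
        FROM   stg_customer s
        WHERE  s.customer_id = d.customer_id
          AND  s.address    <> d.address      -- tracked attribute changed
      );

-- Step 2: insert a new current row for new or changed customers.
INSERT INTO dim_customer
       (customer_id, address, effective_start_date, effective_end_date, is_current)
SELECT  s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM    stg_customer s
LEFT JOIN dim_customer d
       ON  d.customer_id = s.customer_id
       AND d.is_current  = 'Y'
WHERE   d.customer_id IS NULL;                -- no current row remains after Step 1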

  

Technical Skills

Big Data Technologies: Hadoop, Hive
Cloud Technologies: AWS (Certified Cloud Practitioner), Kafka
OLAP/Reporting Tools: Business Objects (4.0, XI 3.1, 3.0, R2) Designer, Web Intelligence, InfoView, Business Query, Crystal Reports, CMC
Dashboard Tools: Xcelsius 2008, Query as a Web Service (QAAWS), Live Office
Modeling Tools: Erwin v7.2/8.2/9.5, ER/Studio 10
Programming Languages: SQL, PL/SQL, C++, Java, Microsoft technologies
Open Source/Application Server Technologies: IIS, Tomcat, HTML, XML
RDBMS: Oracle 12c/10g/9i, Teradata 12/14/16, SQL Server 2014/2008/2005, MySQL, Cassandra
CRM Application: Siebel 7.0
Operating Systems: Windows 2003/XP/2000/NT/98, UNIX, Linux

 

Company Profile & Project Details

 

Data Solution Architect / Data Governance                                        April 2019 to Present

Financial Domain

 

Roles & Responsibilities:

  • Created standards and policies for the four pillars of Data Governance and delivered a scalable model for the cloud migration project.
  • Architected and designed a machine learning model for fraud risk analysis.
  • Created LOB-level data standards as part of the data strategy plan for Consumer and Commercial Banking.
  • Gathered business requirements by conducting a series of meetings with business users.
  • Designed big data solutions for the Risk, Finance, and Fraud teams using Hadoop and Hive.
  • Developed data models and data marts for Mortgage, Retail Banking, and Consumer Banking.
  • Developed models for ingestion and migration of relational databases to the Hadoop data ecosystem and the AWS data pond.
  • Designed a data model to support Teradata from a Snowflake source model.
  • Analyzed datasets and provided PCI classification and bucket prefixes while migrating to AWS.
  • Provided data governance approvals for migration of datasets.
  • Classified objects in Unified Data Services for security restrictions.
  • Analyzed data to incorporate into the model.
  • Created a dimensional model for third-party vendor data residing in a Snowflake database, targeting Teradata and Hadoop/Hive, for Mortgage business analytical reporting.
  • Worked on DMOD requirement documents to capture the business metadata for all attributes.
  • Supported various teams across the LOB.
  • Created the logical data model from the conceptual model and converted it into the physical database design using Erwin, following firm standards.
  • Identified objects and relationships and how they fit together as logical entities, then translated these into the physical design using Erwin.
  • Applied naming conventions to standardize the enterprise data warehouse.
  • Worked with the Data Governance team to standardize codes across the organization and created lookup tables to support source and standard codes enterprise-wide (see the sketch after this list).
  • Worked with the Governance team to get metadata approved for the new data elements added for this project.
  • Designed, developed, deployed, and supported end-to-end ETL specifications based on business requirements and processes, such as source-to-target data mappings, integration workflows, and load processes.
  • Managed daily activity of onsite and offshore technical teams as needed.
  • Performed model merges to maintain a single source of truth.
  • Proposed remediation for data quality issues in the models and fixed them.
  • Discussed new UDP properties and obtained approvals from the Data Governance team.
  • Performed data analysis and provided approvals as part of the data governance process.
  • Standardized S3 bucket name prefixes.
  • Classified S3 buckets and their prefixes according to standards.
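
As a small illustration of the enterprise code standardization and lookup tables described above, the sketch below shows one possible design for mapping source-system codes to an enterprise standard code. All table, column, and system names are hypothetical.

-- Hypothetical standard-code and source-code mapping tables (illustrative only).
CREATE TABLE std_code (
    std_code_id     INTEGER       NOT NULL PRIMARY KEY,
    code_domain     VARCHAR(50)   NOT NULL,   -- e.g. 'ACCOUNT_STATUS'
    std_code_value  VARCHAR(20)   NOT NULL,
    std_code_desc   VARCHAR(200)
);

CREATE TABLE src_code_map (
    src_system_cd   VARCHAR(20)   NOT NULL,   -- originating system identifier
    src_code_value  VARCHAR(20)   NOT NULL,   -- code as it appears in the source
    std_code_id     INTEGER       NOT NULL REFERENCES std_code (std_code_id),
    PRIMARY KEY (src_system_cd, src_code_value)
);

-- Resolving a source value to its enterprise standard code during an ETL load:
SELECT s.std_code_value
FROM   src_code_map m
JOIN   std_code     s ON s.std_code_id = m.std_code_id
WHERE  m.src_system_cd  = 'SRC_SYS_1'
  AND  m.src_code_value = 'A1';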

 

 

Data Architect / Modeler

Insurance and diversified financial services holding company.                                                                   Dec 2017 to April 2019

 

Roles & Responsibilities:

  • Architected and helped implement a new data platform as part of the modernization project for the Agency model.
  • Upleveled the Data Engineering team on AWS and led the team's move toward cloud technologies.
  • Oversaw the mapping of data sources, data movement, interfaces, and analytics, with the goal of ensuring data quality.
  • Collaborated with project managers, addressed data-related problems regarding systems integration and compatibility, and acted as a leader in coaching, training, and providing career development to stakeholders.
  • Migrated data from the OIPA (Oracle Insurance Policy Administration) product to the database.
  • Designed the OLTP model for the database.
  • Designed the conceptual model per ACORD industry standards.
  • Created the logical data model from the conceptual model and converted it into the physical database design using ER/Studio and Erwin.
  • Identified objects and relationships and how they fit together as logical entities, then translated these into the physical design using ER/Studio and Erwin.
  • Applied naming conventions to standardize the enterprise data warehouse.
  • Worked with the Data Governance team to standardize codes across the organization and created lookup tables to support source and standard codes enterprise-wide.
  • Worked with the Architecture team to get metadata approved for the new data elements added for this project.
  • Supported a key-value pair database design (see the sketch after this list).
  • Worked on data profiling for the data dictionary using Informatica IDQ.
  • Worked on data lineage to understand the flow of data.
  • Analyzed data to incorporate into the model.
  • Provided mapping specifications to ETL (Informatica) developers.
  • Applied performance tuning techniques to improve performance.
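
As a small illustration of the key-value pair design mentioned above, the sketch below shows one generic key-value (entity-attribute-value) table layout and how its rows can be pivoted back into columns for reporting. All names are hypothetical and not drawn from the actual OIPA schema.

-- Hypothetical key-value (entity-attribute-value) layout (illustrative only).
CREATE TABLE policy_attribute (
    policy_id   BIGINT        NOT NULL,   -- the entity being described
    attr_name   VARCHAR(100)  NOT NULL,   -- the "key", e.g. 'RIDER_TYPE'
    attr_value  VARCHAR(4000),            -- the value, stored as text
    PRIMARY KEY (policy_id, attr_name)
);

-- Pivoting selected attributes back into columns for reporting:
SELECT policy_id,
       MAX(CASE WHEN attr_name = 'RIDER_TYPE'  THEN attr_value END) AS rider_type,
       MAX(CASE WHEN attr_name = 'FACE_AMOUNT' THEN attr_value END) AS face_amount
FROM   policy_attribute
GROUP BY policy_id;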

 

 

Thanks & Regards,

KUMAR MAK

Zuven Technologies Inc

2222 West Spring Creek PKWY,

Suite 102, Plano, TX -75023

(M) 267-594-1520,  469-581-7797 (Desk)

Fax: 469 718 0405

Email: mak@zuventech.com

Website: www.zuventech.com

ALWAYS REWARD EFFORTS, LATER OUTCOME.

 
