Get C2C/W2 Jobs & hotlist update


Saveetha – Informatica Lead Developer –  10+ years Exp – H4 EAD – Current Location : Cerritos,CA – Ready to go On-Site

Consultant's Details: 

Consultant Name: Saveetha

Visa Status: H4-EAD

Current Location : Bay Area,CA

 

Employer Details:

Employer: Nextgen Technologies Inc

Contact Person: Kushal

Email: kushald@nextgentechinc.com

Note: Please call between 09:30 AM and 06:00 PM PST

Phone: 4134240484

Saveetha's Resume

Professional Summary:

Over 10 years of IT experience in analysis, design, development, testing, maintenance, production support, and implementation of complex data warehousing applications using the ETL tools Informatica PowerCenter and IICS/IDMC Informatica Cloud (CDI and CAI), plus Cognos and Business Objects, across the retail, pharmaceutical, and healthcare industries.

 

  • Experience in Requirement Analysis, Design, Development, Testing, Implementation, and Production Support of Data Warehousing and Data Integration Solutions using Informatica Cloud and Informatica PowerCenter.
  • Experience working with Data Extraction, Transformation, and Loading of data from different heterogeneous sources to Data Warehouse.
  • Experience integrating data to and from on-premises databases and cloud-based database solutions using Informatica Intelligent Cloud Services (IICS).
  • Experienced in designing Informatica Cloud mappings, task flows, Data Synchronization, Data Replication, Mapping and Mapping Configuration tasks, and PowerCenter task creation.
  • Design, develop, and maintain Informatica cloud data integration processes using the appropriate data load technique including Push-Down Optimization for performance optimization.
  •  Having technical hands-on experience in Informatica Intelligent Cloud Services (IICS) primarily in the Application Integration module.
  • Proficient use of Informatica Power Center tools (Designer, Workflow Manager, Workflow Monitor, Repository manager) and databases like Oracle and SQL Server
  • Expertise in developing and running mappings, sessions/tasks, workflows, worklets, and batch processes using Informatica PowerCenter 8.x, 9.x, and 10.4.1.
  • Experience in creating Reusable Transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Normalizer and Rank) and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.
  • Strong work experience in all phases of development, including extraction, transformation, and loading of data from sources such as databases, flat files, Snowflake, and Salesforce into data warehouses and data marts using IICS/IDMC Informatica Cloud (CDI and CAI).
  • Created various mappings, tasks, and task flows based on requirements in CDI (Cloud Data Integration).
  • Over 2 years of experience in IICS Informatica Cloud using CDI and CAI.
  • Experienced in designing and developing ETL processes that read and write data to cloud applications such as Salesforce, Snowflake, and AWS.
  • Implemented SCD Type 1, SCD Type 2, and incremental-load (CDC, change data capture) logic.
  • Experience in performance tuning techniques.
  • Experience in identifying bottlenecks in ETL processes and tuning application performance using database tuning, partitioning, index usage, aggregate tables, and session partitioning.
  • Worked in production support team for maintaining the mappings, sessions, and workflows to load the data in Data warehouse.
  • Involved in various deployments of Informatica components using Deployment Groups.
  • Extensively worked in Change, Incident, Problem and Request Managements using Service Now.
  • Experience in integrating various data sources like Oracle, SQL Server, and MS Access, and non-relational sources like flat files and XML files, into the staging area.
  • Strong understanding of Data warehouse concepts (Star schema, snowflake schema, facts and dimensions) and SDLC methodologies.
  • Proficient with UNIX scripting.
  • Good knowledge and working experience in scheduling and monitoring the jobs using Control M.
  • Experience with Cognos tools such as the Cognos 10.x/8.x BI suite (Framework Manager, Cognos Connection, Report Studio, Analysis Studio, Query Studio, Metric Studio, and Event Studio), Cognos ReportNet 1.1, and analysis tools such as Cognos PowerPlay Transformer 8.x.
  • Experience in creating reports using Business Objects XI R2.
  • Expertise in Framework manager modeling which includes creating relational and dimensional models. (Database layer, Business Layer, presentation layer and Packages)
  • Created security groups and implemented data level, object level and package level security.
  • Expertise in developing Multidimensional Cubes using Cognos PowerPlay Transformer.
  • Created simple to Complex Reports using Report studio (Drill through, Drill down, Master detail, Report level Joins, conditional formatting, scheduling and bursting the reports) and Analyzed data using Analysis Studio.
  • Experience in developing standard and complex dashboards and implementing dynamic grouping using Report Studio.
  • Experience in migrating reports from Impromptu to ReportNet, upgrading reports from ReportNet to Cognos 8 BI, and upgrading previous versions to later versions (Cognos 8.4 to 10).
  • Involved in Installation and Configuration of Cognos 8 BI (All components including Power Play and Framework Manager) in Distributed Environment (Development, Test and Production)
  • Extensive experience with database languages such as SQL, and PL/SQL which includes writing queries, Stored Procedures, Functions and View Experience in databases like Oracle 10g/9i/8i MS SQL Server 2005/2000.
  • Expertise in Business Objects reports and Designer. Created various WebI and Crystal reports and was involved in various support activities.
  • Excellent team player and self-starter, able to work independently; proficient at developing and implementing user training programs; strong analytical, problem-solving, and logical skills, with the ability to meet deadlines, multitask, and work flexible schedules.
  • Excellent communication skills.
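The SCD Type 2 pattern mentioned above (expire the current dimension row and insert a new versioned row when a tracked attribute changes) can be illustrated with a minimal sketch in plain Python. This is illustrative only, not the actual Informatica mappings; the table and column names (dim_customer, cust_id, city, eff_date, end_date, is_current) are hypothetical.

```python
from datetime import date

# Hypothetical dimension table as a list of row dicts; in the real
# pipeline this lives in the warehouse and is maintained by an
# Informatica mapping, not by Python.
dim_customer = [
    {"cust_id": 1, "city": "Dallas", "eff_date": date(2020, 1, 1),
     "end_date": None, "is_current": True},
]

def scd_type2_apply(dim, incoming, today):
    """Expire the current row and insert a new version when a
    tracked attribute (here: city) changes."""
    for rec in incoming:
        current = next((r for r in dim
                        if r["cust_id"] == rec["cust_id"] and r["is_current"]),
                       None)
        if current is None:
            # Brand-new key: plain insert.
            dim.append({**rec, "eff_date": today,
                        "end_date": None, "is_current": True})
        elif current["city"] != rec["city"]:
            # Change detected: close the old version, open a new one.
            current["end_date"] = today
            current["is_current"] = False
            dim.append({**rec, "eff_date": today,
                        "end_date": None, "is_current": True})
        # Unchanged rows are left alone; Type 2 versions only on change.

scd_type2_apply(dim_customer, [{"cust_id": 1, "city": "Houston"}],
                date(2024, 6, 1))
```

After the change arrives, the dimension holds two rows for the customer: the expired Dallas version with its end date set, and the new current Houston version, preserving full history.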

 

 

Technical Skills

 

ETL Tools: Informatica Intelligent Cloud Services (IICS), Informatica PowerCenter 10.4/10.1/9.6

Database: Oracle 21c, SQL Server

BI Reporting Tools: Cognos 10.x, Cognos 8.x, Report Studio, Query Studio, Analysis Studio, Framework Manager, Cube Designer, Cognos ReportNet

Cloud Technology: Snowflake, AWS S3, AWS Glue, AWS Lambda, Google Cloud Platform (GCS, BigQuery)

Scheduling Tools: Control-M, Autosys, Informatica Scheduler

Scripting Languages: Unix Shell, Python

Programming Skills: C, C++, HTML, SQL, PL/SQL

Other Tools: Jira, Git, Slack, Confluence, ServiceNow, Change Management, SQL Developer, DBeaver 7.2, Toad 8.4, Jenkins, ITSM, Office 365

 

 

Professional Experience:

 

Employer         : Nextgen Technology Inc, San Jose, CA

Client               : Neiman Marcus, Dallas, TX 

Role                 : Informatica Production Support Analyst and Lead Developer                    Oct 2021 – Till date

 

 

Description: Neiman Marcus Group, Inc. is an American integrated luxury retailer headquartered in Dallas, Texas, which owns Neiman Marcus, Bergdorf Goodman, Horchow, and Last Call. I am part of the Integration Competency Center (ICC), which provides data/system integration and enterprise application integration within Neiman Marcus Group by combining fragmented data across disparate systems to create a consistent view of core information that can be leveraged across the enterprise to drive business decisions.

 

Key Objectives:

  • Access internal NMG systems and third-party vendor servers to fetch/send business data for further processing; act as an intermediate layer in transferring the feeds required for the data mart and other critical interfaces.
  • Own the data integration tool (Informatica), maintaining and upgrading the application to keep it healthy and covered by Informatica Corporation services.
  • Promote developed code to the production instance for UNIX, ETL, and AWS components.

 

Roles and Responsibilities:

 

  • Monitor critical job flows and handle failures by analyzing job scripts and logs.
  • Interact with business users and third-party vendors regularly to discuss day-to-day issues.
  • Perform regular health checks to ensure the accuracy of various environments and applications, along with Control-M batch processing and troubleshooting using Informatica workflow and session logs, UNIX, Python, and database SQL processes.
  • Strong knowledge of Change, Incident, Problem management.
  • Responsible for complete change management and deploying the Informatica, Unix and DB components to production.
  • Closely working with the business users on the issues for quick resolution.
  • Engage respective teams, initiate the bridge call and triage the issue in case of any P1 or P2 issues.
  • Check FTP/SFTP file transfer activity using PuTTY and verify upload/download processes in AWS; verify and change file permissions as needed.
  • Enhancements and production bug fixes.
  • AWS deployments through Slack.
  • Monitoring AWS Glue job and Lambda function failures and fixing the issues.
  • Handling major incident, change, and problem management.
  • Involved in analysis and ad hoc requests from business users.
  • Coordinate with Dev team on the production change and reviewing the code change.
  • Involved in SOX meetings and preparing SOX reports for all the production changes.
  • Documenting the resolution of the known issues and responsible for updating and maintaining the SOP (Standard Operating Procedure) in knowledge base.
  • Designed, Developed, and Implemented data pipelines using Informatica Cloud IICS.
  • Developed IICS Application Integration components like Processes, Service Connectors, and process objects.
  • Worked on Service connectors for real-time integration with third-party applications.
  • Demonstrated success in building RESTful API Integration using IICS.
  • Processed XML messages in real time and push to AWS S3 Bucket and Snowflake.
  • Developed data integration components with Snowflake, Azure Blob Storage, and AWS S3.
  • Developed Data Synchronization, Data Replication, Mapping, and Mapping Configuration tasks.
  • Responsible for determining the bottlenecks and fixing them with performance tuning.
  • Identified data issues and worked with the Business team to resolve the issue.
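The real-time XML processing step described above can be sketched with the Python standard library: parse an inbound XML message into a flat record ready to be serialized and staged. The message layout and field names here are hypothetical; the actual integration ran inside IICS Application Integration, and the S3 upload (e.g. via boto3) is only indicated by a comment.

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical inbound order message; real feeds and field names differ.
message = """
<order>
  <orderId>98431</orderId>
  <customer>NMG-1002</customer>
  <total currency="USD">259.99</total>
</order>
"""

def xml_to_record(xml_text):
    """Flatten one XML message into a dict ready to be serialized
    and landed in a staging area (e.g. an S3 object)."""
    root = ET.fromstring(xml_text)
    return {
        "order_id": root.findtext("orderId"),
        "customer": root.findtext("customer"),
        "total": float(root.findtext("total")),
        "currency": root.find("total").get("currency"),
    }

record = xml_to_record(message)
payload = json.dumps(record)  # this JSON string would then be written to S3
```

From here the flattened record can be loaded to Snowflake via a staging table, which is the usual landing pattern for this kind of feed.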

 

 

Environment: Informatica PowerCenter 10.4.1, IICS (Informatica Intelligent Cloud Services), Oracle 12cR1, Control-M 9.0.19.200, ServiceNow, AWS S3, Glue Studio, Snowflake, Unix, PuTTY, and Jira.

 

Employer         : Nextgen Technology Inc, San Jose, CA

Client               : Shell, Houston, TX                                                                               Jan ‘18 – Oct ‘21

Role                 : Informatica Lead Developer

                                                                                                                                        

Description: Shell is the largest oil and energy company headquartered in the Netherlands and incorporated in the United Kingdom. Shell is the fourth-largest company in the world, and one of its business groups manages the upstream business in North and South America. This was a support project; as Informatica Lead, I was responsible for creating ETL mappings to load the data and for providing reports that let users review and analyze their ticketing information through Cognos reporting, with more detailed analysis via Cognos cubes.

Roles and Responsibilities:

  • Responsible for Business Analysis and Requirements Collection.
  • Worked on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Parsed high-level design specification to simple ETL coding and mapping standards.
  • Designed and customized data models for Data warehouse supporting data from multiple sources in real time.
  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Created mapping documents to outline data flow from sources to targets.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.
  • Maintained stored definitions, transformation rules and targets definitions using Informatica repository Manager.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Developed mapping parameters and variables to support SQL override.
  • Created mapplets to use them in different mappings.
  • Developed mappings to load into staging tables and then to Dimensions and Facts.
  • Used existing ETL standards to develop these mappings.
  • Worked on different tasks in Workflows like sessions, events raise, event wait, decision, e-mail, command, worklets, Assignment, Timer and scheduling of the workflow.
  • Created sessions, configured workflows to extract data from various sources, transformed data, and loading into data warehouse.
  • Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
  • Extensively used SQL*Loader to load data from flat files into Oracle database tables.
  • Modified existing mappings for enhancements of new business requirements.
  • Used Debugger to test the mappings and fixed the bugs.
  • Wrote UNIX shell scripts and pmcmd commands for FTP of files from remote servers and backup of the repository and folders.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
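The incremental (CDC-style) load pattern behind mappings like these can be sketched as a watermark-driven extract: pull only rows changed since the last successful run, then persist a new watermark. This is a conceptual sketch; an in-memory SQLite table stands in for the Oracle source, and a plain string watermark stands in for an Informatica mapping variable.

```python
import sqlite3

# In-memory SQLite stands in for the source database; the real job ran
# against Oracle through Informatica, with a mapping variable holding
# the last-run watermark.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE src (id INTEGER, updated_at TEXT)")
con.executemany("INSERT INTO src VALUES (?, ?)",
                [(1, "2024-01-01 10:00:00"),
                 (2, "2024-01-02 09:30:00"),
                 (3, "2024-01-03 11:15:00")])

def incremental_extract(con, last_watermark):
    """Select only rows changed after the previous run's watermark,
    and compute the new watermark to persist for the next run."""
    rows = con.execute(
        "SELECT id, updated_at FROM src "
        "WHERE updated_at > ? ORDER BY updated_at",
        (last_watermark,),
    ).fetchall()
    new_watermark = rows[-1][1] if rows else last_watermark
    return rows, new_watermark

# Only rows newer than the watermark are picked up on this run.
rows, wm = incremental_extract(con, "2024-01-01 12:00:00")
```

On the next run, `wm` would be supplied as `last_watermark`, so already-loaded rows are skipped, which is the essence of the CDC logic described above.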

Environment: Informatica PowerCenter 9.x, Cognos 10, Cognos 8.4 (Framework Manager, Cube Designer, Report Studio, Query Studio, Analysis Studio), Oracle 10g, DataStage, Kalido.

 

Employer         : Wipro Technologies, Bangalore, India

Client               : Shell                                                                                                     May ‘10 – Dec ‘12

Role                 : Informatica Lead

 

Description: Shell is the largest oil and energy company headquartered in the Netherlands and incorporated in the United Kingdom. Shell is the fourth-largest company in the world, and one of its business groups manages the upstream business in North and South America. This was a support project; as Informatica Lead, I was responsible for loading data from source to target using Informatica PowerCenter and for providing reports that let users review and analyze their ticketing information through Cognos reporting, with more detailed analysis via Cognos cubes.

Roles and Responsibilities:

  • Responsible for Business Analysis and Requirements Collection.
  • Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Parsed high-level design specification to simple ETL coding and mapping standards.
  • Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.
  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Created mapping documents to outline data flow from sources to targets.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.
  • Maintained stored definitions, transformation rules and targets definitions using Informatica repository Manager.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Developed mapping parameters and variables to support SQL override.
  • Created mapplets to use them in different mappings.
  • Developed mappings to load into staging tables and then to Dimensions and Facts.
  • Used existing ETL standards to develop these mappings.
  • Worked on different tasks in Workflows like sessions, events raise, event wait, decision, e-mail, command, worklets, Assignment, Timer and scheduling of the workflow.
  • Created sessions, configured workflows to extract data from various sources, transformed data, and loaded it into data warehouse.
  • Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
  • Extensively used SQL*Loader to load data from flat files into Oracle database tables.
  • Modified existing mappings for enhancements of new business requirements.
  • Used Debugger to test the mappings and fixed the bugs.
  • Wrote UNIX shell scripts and pmcmd commands for FTP of files from remote servers and backup of the repository and folders.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.

Environment: Informatica PowerCenter 8.6.1, Cognos 10, Cognos 8.4 (Framework Manager, Cube Designer, Report Studio, Query Studio, Analysis Studio), Oracle 10g, DataStage, Kalido.

 

Employer         : Infosys, Bangalore, India

Client               : Pfizer, NYC, New York                                                                          Jan ‘10 – Apr ‘10

Role                 :  Sr Informatica Developer

 

Description: Pfizer is one of the largest pharmaceutical companies in the world; it develops and produces medicines and vaccines for a wide range of medical disciplines. This was a development-cum-support project that provided support for 23 countries. I was involved in developing Cognos reports, building models using Framework Manager, and providing support to end users.

Roles and Responsibilities:

  • Logical and Physical data modeling was done using Erwin for data warehouse database in STAR SCHEMA
  • Using Informatica PowerCenter Designer, analyzed the source data to extract and transform it from various source systems (Oracle 10g, DB2, SQL Server, and flat files), incorporating business rules using the objects and functions the tool supports.
  • Using Informatica PowerCenter created mappings and mapplets to transform the data according to the business rules.
  • Used various transformations like Source Qualifier, Joiner, Lookup, SQL, Router, Filter, Expression, and Update Strategy.
  • Implemented Slowly Changing Dimensions (SCD) for some of the tables as per user requirements.
  • Developed stored procedures, used them in Stored Procedure transformations for data processing, and used data migration tools.
  • Documented Informatica mappings in an Excel spreadsheet.
  • Tuned the Informatica mappings for optimal load performance.
  • Created and Configured Workflows and Sessions to transport the data to target warehouse Oracle tables using Informatica Workflow Manager.
  • This role carried primary responsibility for problem determination and resolution for each SAP application system database server and application server.
  • Worked along with UNIX team for writing UNIX shell scripts to customize the server scheduling jobs.
  • Constantly interacted with business users to discuss requirements.

Environment: Informatica PowerCenter 8.x, Oracle 10g, Windows XP, SQL, TOAD.

 

Employer         : Infosys, Bangalore, India

Client               : Adidas, Japan                                                                                       Jan ‘08 – Dec ‘09

Role                 : Cognos Developer

 

Description: Adidas is one of the largest companies for sporting goods and apparel. This project dealt with the development and testing of Cognos reports and PowerPlay cubes, and involved developing various Cognos reports and Framework Manager model files. The reporting solution covered revenues and profitability of products across various geographical regions.

 

Roles and Responsibilities:

 

  • Implemented Framework Manager Security for Data access, Object level access and Package access based on the Organization Level.
  • Designed models and cubes in Cognos Transformer using existing IQDs from Framework Manager.
  • Modeled metadata from Oracle data source for use in Report studio and Query Studio using Framework Manager.
  • Created various List Reports, Cross tab Reports and Drill through Report in Cognos Report Studio. Created model files in Framework manager.
  • Designed Reports using most of the Report studio features (Conditional Formatting, Conditional Page layout, Sections, Page Breaks, Master-detail, Drill Through, Drill Down, Drill Up).
  • Worked on Analysis Studio to develop multi-dimensional reporting.
  • Performed complete testing of Cognos reports and validated data against the database.
  • Modified existing reports per user requests and provided support to users.
  • Fixed issues with Cognos reports and cubes.
  • Migrated reports from Impromptu to Cognos 8.
  • Documented functional and technical requirements for metadata modeling and BI reports.

 

Environment: Cognos 8 (Framework Manager, Cognos Connection, Query Studio, Report Studio, Analysis Studio, PowerPlay), Windows XP, SQL Server 2005.

 

Employer         : Infosys, Bangalore, India

Client               : Gap, USA                                                                                       Aug ‘06 – Dec ‘07

Role                 : Cognos Developer

 

Description: Gap is one of the largest retailers of clothing and accessories, headquartered in San Francisco, California. This was a development project involving the development of Cognos reports.

Roles and Responsibilities:

  • Created various reports using Cognos Report Studio.
  • Interacting with the business users to gather business requirements and designing the functional and technical requirement documentation.
  • Implementing scripts provided by Cognos support to perform different functionalities.
  • Developing Transformer models using PowerPlay Transformer (OLAP) and building models using Cognos 8.3/8.4 Transformer.
  • Modifying PowerCubes by adding new dimensions, levels, measures, etc.
  • Publishing cubes directly from Cognos 8.3/8.4 Transformer using the new publish functionality.
  • Provided Cognos reports as per user request.
  • Modeled metadata from Oracle data source for use in Report studio and Query Studio using Framework Manager.
  • Worked with development and project management team for early defect fixes.
  • Provided dashboard reports to the users on a weekly basis.
  • Refreshed daily data by running the scripts in Toad.
  • Implemented Security with LDAP and Group level security in Cognos Connection Portal.
  • Used Cognos Connection for scheduling and bursting reports across the Business segments and providing additional security to published cubes as well as reports.
  • Also involved in requirement gathering, development, and testing of Informatica mappings and Business Objects reports.

Environment: Cognos 8.4, Oracle 10g, Windows XP, SQL, TOAD, Business Objects XI R2, Informatica PowerCenter 8.1.

Education qualification:

Bachelor's in Computer Science and Engineering, Madras University, Chennai, India.

 

 

 


Kushal 

| 1735 N 1St ST., Suite 308 |San Jose, CA 95112

NextGen Technologies Inc

Email: kushald@nextgentechinc.com. Website: www.nextgentechinc.com | 4134240484 |

