Hemanth Y
571-224-3486
[email protected]
SUMMARY:


8+ years of progressive experience in System Analysis & Design, Development, Testing, Integration, and
Production Support using the ETL tool Informatica for data warehousing on client-server and web-enabled
applications.
Extensive experience in data warehousing, data architecture, and extraction, transformation and loading of
data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter
(9.5/9.0.1/9/8.6.1).

Involved in all aspects of the ETL SDLC, including requirement gathering, data cleansing, data load strategies,
mapping design & development, providing standard interfaces for various operational sources, unit/
integration/regression testing and UAT.

Expertise in using heterogeneous source systems like flat files (fixed width & delimited), XML files, CSV
files, IBM DB2, Excel, Oracle, Sybase, SQL and Teradata.
Proficient in designing & developing complex mappings with varied transformation logic like Unconnected
and Connected Lookups, Router, Filter, Expression, Aggregator, Joiner, Update Strategy etc.
Worked on Repository Manager, Workflow Manager, Workflow Monitor and Designer to develop
Mappings, Mapplets, Reusable Transformations, Tasks, Workflows, Worklets to extract, transform and
load data.
Strong experience in writing PL/SQL Stored Procedures, Functions, Packages, Triggers and Performance
Tuning.
Enterprise Resource Planning (ERP) implementation specializing in BAAN ERP, SAP R/3 and SAP BW.
Integration of SAP R/3, BaaN ERP with other applications.
In-depth knowledge working on Oracle Database and Teradata.
Experience in using Automation Scheduling tools like Autosys, Control-M, Espresso, CA Work Load
Automation and Stone Branch.
Skilled in and with a good understanding of Parallel Processing, Star and Snowflake Schemas, Dimensional
Modeling, Relational Data Modeling and Slowly Changing Dimensions.
Excellent interpersonal skills; comfortable presenting to large groups and preparing written communications
and presentation material.
Flexible, quick learner who can adapt and execute in any fast-paced environment.
EDUCATION:


M.S. in Electrical Engineering
B.E. in Electronics & Communications
TECHNICAL SKILLS:
ETL Tools: Informatica PowerCenter 9.5/9.0.1/9/8.6.1 (Source Analyzer, Mapping Designer, Mapplet, Transformations, Workflow Monitor, Workflow Manager)
Data Modeling: E-R Modeling, Dimensional Modeling, Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Erwin 9.5
Databases: Oracle 11g/10g/9i/8i, SQL Server 2012/2008/2005, Salesforce Cloud, Teradata, DB2
Big Data Technologies: Sqoop, Flume, MapReduce, Oozie, Hive, Pig
Scripting Languages: C, Python, Shell Scripting
OS: Windows 2000/XP/Vista, Windows Server 2003/2008, Unix, MS-DOS
Packages: MS Office Suite 2010/2013, Adobe Photoshop 5, Flash 5
Other Software / Testing Tools: PL/SQL, Toad, MS Visio, MS exacta, ClearQuest, HP Quality Center, ALM, Remedy, ServiceNow
PROFESSIONAL EXPERIENCE:
TCF BANK, MN
Role: Sr Informatica Developer
April 2014 – Present
Project: CEF (Customer Experience Feedback)
The project sends surveys to customers after their transactions are completed. The CEF process creates
various files of customer transactions for the branch, the contact center, customer support operations
and digital banking. NICE uses these files to trigger email surveys to customers, collects and analyzes
the survey responses, and produces dashboards for TCF. The dashboards are available to provisioned users
with customizable levels of visibility.
Responsibilities:
Analyzed database requirements in detail with the BAs.
Developed complex mappings using reusable Mapplets, Mapping Parameters and Mapping Variables to
load data from different sources into staging tables.
Used Informatica PowerCenter Designer to create complex mappings using different transformations like
Filter, Router, Lookup, Stored Procedure, Joiner, Update Strategy, Expression, Sequence Generator and
Aggregator to pipeline data to the TouchPoint file.
Created a solution to develop the employee hierarchy and exception report, which are required to generate
the TouchPoint file.
Developed Oracle Stored Procedures to identify Exception records.
Worked on design and development of workflows to load data into staging, ODS and DataMart.
Developed mappings to create multiple Touch Point files (Branch, Contact Center & CSO).
Worked on various tuning issues and fine-tuned transformations to make them more efficient in terms of
performance.
Worked on table partitioning in Oracle and loaded/retrieved data from partitions.
Scheduled the Informatica workflows in Stone Branch Scheduler.
Prepared Scripts to perform file archiving process.
Provided production support.
Coordinated tasks with relevant teams for smooth transition from implementation to testing.
Environment: Informatica PowerCenter 9.5, Oracle 11g, MS SQL Server 2008/2005, Erwin 9.5, SQL Developer,
TOAD, Windows NT/2000, ALM
Deloitte, Camp Hill, PA
Role: Sr Informatica Developer
Mar 2013 - Mar 2014
Project: Department of Welfare
The main objective of the project was to release a new version of the existing data warehouse after
rectifying performance issues and adding new enhancements. Used Informatica extensively for various ETL
processes and Oracle for the required database changes.
Responsibilities:
 Responsible for database development, dataflow, integration and Data Modeling.
 Used Erwin to develop conceptual models and create various Logical & Physical Models.
 Designed and developed Informatica Mappings and Sessions based on business user requirements and
business rules to load data from source to targets.
 Responsible for the data integration into SalesForce.com using Informatica PowerCenter/ Informatica
Cloud.
 Worked using Agile / Scrum methodologies.
 Created Informatica mappings using various transformations like SAP BAPI/RFC and SAP IDOC transformations,
Web Services Consumer, XML, HTTP, Source Qualifier, Expression, Lookup, Stored Procedure, Aggregator,
Update Strategy, Joiner, Union, Filter and Router.
 Knowledge of the Supply Chain model (work areas: locations, products and resources).
 Analyzed business data requirements to produce relevant ETL architecture deliverables (source analysis,
source to target maps, ETL flows) in a Data Warehouse environment.
 Created Data Design Documents/Conceptual Design Documents for this project.
 Effectively explained data related concepts at both IT and business user levels.
Environment: Informatica PowerCenter 9.0.5, Erwin 9.5, Power Designer, SQL Server 2012/2008, HTML,
JavaScript, jQuery, SAP BAPI/RFC & IDOCs, Groovy, Axeda, IntelliJ, Salesforce, CA Workload Automation, UC4,
Visio, Windows 2000/XP/7
BCBS, Chicago, IL
Role: Informatica Developer
Aug 2012 – Feb 2013
The main objective of the project was to create mappings for 834 inbound and outbound, 270 inbound and
outbound, and 271 inbound and outbound transactions, using serializers and parsers to create HIPAA 5010 files
and to load the data from the HIPAA 5010 files into the database for the healthcare client. Used Informatica
extensively for various B2B processes, including B2B transformations, the Unstructured Data Transformation,
XML Generator and XML Parser transformations, and XML targets.
Responsibilities:
Extensive use of B2B Data Transformation for handling vendor data in EDI, unstructured and complex
structured (XML schema) formats.
Extensively worked with structured and unstructured data.
Worked with HIPAA 5010, which reduces risk and provides flexibility through complete bidirectional
transaction crosswalks.
Involved in analysis, requirement gathering, and documenting functional and technical specifications.
Involved in the preparation of the mapping document for 5010 by identifying the minor changes from
4010.
Involved in the installation of B2B plug-ins on the machine.
Designed the inbound mappings to load data from HIPAA 5010 834, 270, 271 and 271U files into Database
to healthcare industry standards.
Created the outbound mappings to generate HIPAA 5010 834, 271 and 270 files from the data in the
databases.
Upgraded Informatica PowerCenter from version 8.1 to 8.6 on the servers.
Hands-on experience in developing transformations like Joiner, Aggregator, Expression, Lookup (connected
and unconnected), Filter, Union, Stored Procedure, Router, XML Generator and Parser, Unstructured Data
Transformation etc. using best practices.
Configured sessions so that the PowerCenter server sends an e-mail when a session fails.
Extensive use of flat files as sources and targets depending on the inbound and outbound processes.
Handled large data files (up to 6 million members in a single file).
Involved in performance tuning of the mappings to reduce runtime for large files.
Extensively worked on performance tuning of programs, PL/SQL procedures and processes.
Cleansed data using Trillium, RTRIM and LTRIM.
Implemented Slowly Changing Dimensions - Type I & II in different mappings as per the requirements.
Ran the mappings using the third-party Tidal scheduler and implemented concurrent runs of the workflow for
different files at the same time.
Fixed minor issues in the parsers and serializers (built-in Java code).
Built XML Parser and Serializer transformations from the XSD files.
Involved in resolving issues in the built-in Java code together with Informatica support via
GoToMeeting.
Passed parameters to the workflow directly from Tidal to run the mapping.
Environment: Informatica PowerCenter 8.6.1, B2B DX/DT v8.0, Oracle 10g, MS SQL Server 2008/2005, Erwin,
PL/SQL, Windows NT/2000, Autosys, XML, Espresso, CA Workload Automation.
Capital One, VA
Role: Sr Informatica Developer
Mar 2011 – Aug 2012
The project aimed to build a data warehousing and BI reporting system for customers and various policy
plans. The objective was to design and develop a single integrated data warehouse for reporting the
organization's credit card data, enabling the credit card servicing department to track credit and improve
overall performance.
Responsibilities:
 Used Informatica Power Center for extracting Source data and loading into target table.
 Created complex mappings using reusable Mapplets, Mapping Parameters and Mapping Variables for global
reuse.
 Developed mappings to extract data from various source systems and load them into Oracle targets.
 Designed and developed ETL processes using DataStage designer to load data into Oracle Database.
 Used DataStage stages namely Hash file, Sequential file, Transformer, Aggregate, Sort, Datasets, Join,
Lookup, Change Capture, Funnel, Peek, Row Generator stages in accomplishing the ETL Coding.
 Wrote SQL code to extract, transform and load data, ensuring compatibility with all tables and customer
specifications.
 Extensively worked with Repository Manager, Designer, Workflow Manager and Workflow Monitor.
 Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router,
Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
Created complex mappings which involved Slowly Changing Dimensions, implementation of business
logic and capturing the deleted records in the source systems.
Created workflows and worklets with parallel and sequential sessions that extract, transform, and load
data to one or more targets.
Worked with complex mappings having an average of 15 transformations.
Performed code migrations as part of weekly/monthly change request releases.
Extensively involved in Recovery process for capturing the incremental changes in the source systems for
updating in the staging area and data warehouse respectively.
Worked on various tuning issues and fine-tuned transformations to make them more efficient in terms of
performance.
Configured FTP for the Informatica server to access source files.
Extensively used TOAD for ORACLE to test, debug SQL and PL/SQL Scripts, packages, procedures, triggers,
and functions.
Created and scheduled sessions and jobs (on demand, run on time, and run only once) using Workflow
Manager.
Organized data in reports using filters, sorting and ranking, with alerts.
Involved in performance tuning and optimization of existing Informatica mappings and sessions using
features like partitions and data/index cache to manage very large volume of data.
Extensively used Autosys for Scheduling the Sessions and tasks.
Performed Unit testing, Integration testing and System testing of Informatica mappings.
Involved in the preparation of documentation for ETL standards, procedures and naming conventions,
wrote Shell scripts.
Involved in data modeling of star schemas according to the business requirements using Erwin 6.3.8.
Assisted and collaborated in generating a complete set of reports using Business Objects to target
different user segments.
Created different types of reports, such as Master/Detail, Cross Tab and Chart (for trend analysis), using
prompts, filters, conditions, calculations etc.
Documented all mappings and workflows precisely.
Worked onsite, coordinated with offshore resources and assisted during the production stage.
Environment: Informatica PowerCenter 8.6, DataStage 8.0, DB2, Business Objects, Oracle 9i/10g, MS SQL
Server 2008, Erwin, PL/SQL, UNIX (Solaris), Shell, Windows NT/2000, Autosys, XML.
ITSL, Gurgaon (IND)
Role: Informatica Developer
May 2008 – Aug 2009
This application built a data warehouse for the employees of HDFC Bank. Informatica PowerCenter
7.1.1 was used as the ETL tool to extract data from source systems and load it into target systems.
Responsibilities:
Worked with business analysts on requirement gathering, business analysis, testing and project
coordination. Researched sources and identified necessary business components for analysis.
Involved in Relational modeling and Dimensional Modeling Techniques to design ER data models and Star
Schema designs.
Involved in creation of the Data Warehouse database (physical model, logical model) using the Erwin data
modeling tool.
Extracted data from Oracle and flat files and loaded it into the data warehouse.
Used Informatica Designer to create complex mappings using different transformations like Filter, Router,
Lookup, Stored Procedure, Joiner, Update Strategy, Expression, Sequence Generator and Aggregator
to pipeline data to the data warehouse.
Created tasks, workflows and sessions to move the data using Workflow Manager and Workflow Monitor.
Involved in performance tuning of various mappings and sessions to improve performance.
Involved in Designing of the database and the tables and the relationship between the tables.
Wrote stored procedures to clean the existing data and created tables, views and indexes.
Involved in coding of database Triggers, Stored Procedures and Maintaining Integrity Constraints.
Involved in creating new stored procedures and modifying the existing ones.
Worked on improving the SQL Query performances.
Environment: Informatica PowerCenter 7.1.1, PowerConnect, UNIX (AIX 5.2), Toad, Windows NT, Erwin 3.5.2,
Shell Scripting, SQL*Plus, PL/SQL, SQL Navigator, Flat files, SAP, MS SQL Server 2000
REFERENCES: Available upon request.