ETL003
• Around twelve years of work experience in the IT industry focusing on Data Analysis, Application Design and Development, Data Modeling, and implementation of Data Warehousing systems.
• Advanced knowledge and excellent concepts in ETL, Decision Support Systems, and Data Warehousing.
• Excellent developer and analyst in the Informatica DW environment.
• Excellent with Informatica PowerCenter/Data Explorer/PowerExchange/DTStudio 6.2, 7.1.3, 8.6.1, 9.5.1.
• Excellent with Informatica Data Explorer 9.0.1 and PowerCenter Data Validation Option 9.5.2.
• Experienced in Data Modeling making use of Dimensional Data Modeling, Star Schema/Snowflake Schema, creating Fact and Dimension tables, and Physical and Logical data modeling.
• Good understanding of relational database environments.
• Experienced in managing ETL on a large data warehouse, including the development, implementation, and ongoing support of automated load and validation processes.
• Involved in overall architectural design, strategy of ongoing design, and migration from development to production.
• Proven technical and analytical skills.
• Extensive programming experience in C++, C, Java, and Shell Scripts.
• Expertise in full life cycle projects, from conceptualization to implementation and maintenance.
• Experience with Oracle 11g, 10g, 9i, 8i, and 7.x on SCO UNIX, Sun Solaris, and Windows 95/98/NT.
• Excellent in Greenplum database; knowledge of Hadoop Distributed File System and Big Data.
• Excellent in Informatica MDM, IDQ, and IDE.
• Proficient in all phases of the System Development Life Cycle, including Requirements Definition, Data Conversion, System Implementation, System Testing, and Acceptance.
• Excellent oral/written communication skills, strong decision-making skills, organizational skills, and analytical problem-solving skills; a good team player.
TECHNICAL ENVIRONMENT
Operating Systems: Windows NT, Windows XP/95/98/2000, IBM AIX 4.2/4.3, Sun Solaris 2.6/2.7.
RDBMS: Oracle 11g/10g/9i/8i, Teradata 12.0, MS SQL Server 2005/2008, Sybase, DB2, MS Access 2007.
Data Modeling: Erwin 7.3.8 SP2/9.5, Visio 2007, UML.
Data Loading Tools: Unix Shell Scripts, SQL*Loader, SQL*Plus.
Languages: C, C++, C#, Java, COBOL, PL/SQL, and SQL.
ETL: Informatica 6.2/7.1.3/8.6.1/9.0.1/9.5.1.
Application Packages: VC++ 6.0, TOAD, Oracle SQL Developer 3.0.04.34, Visual Basic, OpenGL, Trillium, and ColdFusion.
Web Programming: XML, HTML, VBScript, JavaScript, JSP, Macromedia.
MAJOR ASSIGNMENTS
Sr. ETL Developer/ Analyst,
TMG Health, Edison, New Jersey
(Feb 2015 – Present)
To fulfill the need for an integrated Medical Management platform coupled with traditional Business Process Outsourcing (BPO) services, TMG utilizes Medecision's SaaS-based software, Aerial, to bring a total solution to the marketplace. The project involved the design, development, unit testing, and delivery of the integrations necessary to exchange data with the Aerial Care Management platform. These integrations sourced data from TriZetto® FACETS™ via transport layers through the TMG Enterprise Data Warehouse (EDW), leveraging the existing architecture.
Responsibilities
• Strong knowledge of design, architecture, and development in the SQL Server 2012 environment.
• Significant experience interfacing with Business Analysts.
• Gathered and analyzed requirements and conducted requirements-gathering sessions with Data Analysts.
• Created the Technical Spec Document and Data Mapping Document for the ETL process.
• Created Unit Test Documents for the ETL code.
• Tested the ETL objects to optimize load performance.
• Implemented CDC loads using NRT (near-real-time) change data capture for real-time processing (a sketch of this incremental load pattern follows this assignment).
• Designed mappings from Facets sources to the data warehouse in Greenplum.
• Developed workflows to source SQL Server data.
• Ingested data into GemFire.
• Worked with the Deployment team on code deployment from Development to SIT, UAT, and Production.
• Excellent issue resolution and results turnover using ALM Defect Management.
• Excellent teamwork and understanding with the users.
• Generated XML files from the Greenplum EDW.
• System documentation.
• Enterprise job scheduling using ActiveBatch Workload Automation.
Environment: Microsoft SQL Server 2012, Informatica PowerCenter 9.5.1, SQL Server
Management Studio, Visual Studio 2010, SQuirreL SQL Client 3.6 for Windows, Pivotal Gemfire
XD 1.4.0, Greenplum Database, PgAdmin III 1.16, PostgreSQL Tools 1.16.1, ActiveBatch,
Tortoise SVN Repository Browser, HP ALM Defect Management.
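
A minimal sketch of the incremental (CDC-style) load pattern referenced above, written in Greenplum/PostgreSQL syntax. The table and column names (stg_member, dw_member, last_update_ts) are hypothetical placeholders rather than the actual TMG schema, and the watermark logic is deliberately simplified; in the project itself this logic ran inside Informatica PowerCenter workflows.

-- Hypothetical timestamp-driven incremental load into a Greenplum warehouse table.
-- Step 1: close out warehouse rows that have a newer version in staging.
UPDATE dw_member t
SET    current_flag = 'N',
       effective_end_ts = s.last_update_ts
FROM   stg_member s
WHERE  t.member_id = s.member_id
  AND  t.current_flag = 'Y'
  AND  s.last_update_ts > t.effective_start_ts;

-- Step 2: insert rows captured since the last load (simple high-watermark check).
INSERT INTO dw_member (member_id, member_name, plan_code,
                       effective_start_ts, effective_end_ts, current_flag)
SELECT s.member_id, s.member_name, s.plan_code,
       s.last_update_ts, NULL, 'Y'
FROM   stg_member s
WHERE  s.last_update_ts > (SELECT COALESCE(MAX(effective_start_ts),
                                           TIMESTAMP '1900-01-01')
                           FROM   dw_member);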
Sr. ETL Developer,
Unum, Portland, Maine
(Feb 2013 – Jan 2015)
A Data Mart was built for Insourced Customer Reporting. Data was sourced from a Teradata database and loaded into a DB2 database. Worked from the conceptual level through the deployment phase of the project, covering analysis, data architecture, and Data Mart population. The outcome of the project was well received. Claim & Leave InSight will be an important differentiator in the marketplace. It will be easily accessible through unum.com, which recently introduced improved navigation, design, and content. Upon entering the Claim & Leave InSight landing page, users will see a customizable dashboard containing their key claim and leave data, as well as a range of options for creating reports and analyzing trends. And because it is built on one of the industry's largest disability and leave databases, Claim & Leave InSight users will be able to benchmark their claim and leave trends against industries like their own.
Responsibilities
• Strong knowledge of design, architecture, and development in the DB2 environment.
• Significant experience interfacing with Business Analysts.
• Gathered and analyzed requirements and conducted requirements-gathering sessions with Data Analysts.
• Created the Technical Spec Document and Data Mapping Document for the ETL process.
• Created Unit Test Documents for the ETL code (a sketch of a simple source-to-target validation query follows this assignment).
• Tested the ETL objects to optimize load performance.
• Implemented CDC loads using XML for real-time processing.
• Designed mappings from Teradata sources to the data warehouse in DB2.
• Developed workflows to source Teradata data using Teradata Parallel Transporter.
• Wrote parameter files and shell scripts to automate the load process and job control.
• Worked with Shared Services on code migration from Development to ITest, Acceptance, and Production.
• Excellent issue resolution and results turnover.
• Excellent teamwork and understanding with the users.
• The project produced high-level reports used by the business users and their teams.
• System documentation.
Environment: Informatica PowerCenter 9.5.1, PowerExchange CDC with SQL Server,
PowerCenter Data Validation Client 9.5.2.0, Teradata 14.00.03, IBM DB2 9.1.7, Teradata SQL
Assistant, DB2 AIX v9.7, DB2 Linux v10.1, WinSCP 5.5.4.
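
A minimal sketch of the kind of source-to-target reconciliation query used during unit testing, of the sort the Data Validation Option automates. It assumes a staged copy of the Teradata source and the DB2 target mart are both queryable in DB2; src_claim, dm_claim, and the columns are hypothetical placeholders.

-- Hypothetical row-count and amount reconciliation for one load date; names are placeholders.
SELECT 'SOURCE'        AS side,
       COUNT(*)        AS row_cnt,
       SUM(claim_amt)  AS total_amt
FROM   src_claim
WHERE  load_dt = CURRENT DATE
UNION ALL
SELECT 'TARGET',
       COUNT(*),
       SUM(claim_amt)
FROM   dm_claim
WHERE  load_dt = CURRENT DATE;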
Sr. ETL Developer/ Analyst,
Guardian Life, New York City
(Jul 2011 – Jan 2013)
A Data Mart was built for the Actuarial Reporting Database. Data was sourced from a mainframe DB2 database and flat files and loaded into an Oracle database. Worked on multiple projects with tight timelines and delivered on time. The outcomes of the projects were well received, and the Actuarial Reporting database is used by a large number of users and has high visibility across the company.
Responsibilities
• Strong knowledge of design and development in Oracle and DB2 environments.
• Significant experience interfacing with business users.
• Gathered and analyzed requirements and prepared the Data Elements spreadsheet.
• Created the Data Mapping Document for the ETL process.
• Tested the ETL objects to optimize load performance.
• Designed mappings from flat-file and DB2 sources to the data warehouse in Oracle.
• Developed Slowly Changing Dimensions Type 2 and Type 1 (a sketch of the Type 2 pattern follows this assignment).
• Created parameter files and shell scripts in Unix.
• SQL and PL/SQL coding.
• Worked with Shared Services on code migration from Development to QA and Production.
• Automated daily, weekly, and monthly workflows to run different jobs.
• Excellent production support and issue resolution.
• Excellent teamwork and understanding with the users.
• The project produced high-level reports used by senior management and their teams.
• System documentation.
Environment: Informatica PowerCenter 9.0.1, IBM DB2 9.1.7, Oracle 11g, TOAD for Oracle
10.6, Erwin 7.3.8 SP2, TOAD for DB2 5.0, Putty 0.62, WinSCP 5.1.2, Beyond Compare 3.3.5,
SQL Server 2008 R2, IBM AIX UNIX 7.1.
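
A minimal sketch of the Slowly Changing Dimension Type 2 pattern referenced above, in Oracle SQL. dim_policy, stg_policy, the sequence, and the columns are hypothetical placeholders rather than the actual Actuarial Reporting schema; in the project itself the equivalent logic lived in Informatica mappings with Update Strategy transformations.

-- Hypothetical SCD Type 2 maintenance for a policy dimension; names are placeholders.
-- Step 1: expire the current dimension row when a tracked attribute changes.
UPDATE dim_policy d
SET    d.current_flag = 'N',
       d.effective_end_dt = TRUNC(SYSDATE) - 1
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_policy s
               WHERE  s.policy_no = d.policy_no
               AND    (s.plan_code <> d.plan_code OR s.status <> d.status));

-- Step 2: insert a new current row for changed or brand-new policies
-- (changed policies no longer have a current row after Step 1).
INSERT INTO dim_policy (policy_key, policy_no, plan_code, status,
                        effective_start_dt, effective_end_dt, current_flag)
SELECT dim_policy_seq.NEXTVAL, s.policy_no, s.plan_code, s.status,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
FROM   stg_policy s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_policy d
                   WHERE  d.policy_no = s.policy_no
                   AND    d.current_flag = 'Y');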
Data Warehouse Developer,
ADP, Roseland, New Jersey
(Apr 2010 - Jun 2011)
This was my second engagement at ADP. Client360 and Sales360 were projects developed to create reports for decision support systems and to monitor the performance of clients and sales across different divisions. The results from this project were well received.
Responsibilities
• Strong knowledge of design and development in the Oracle 11g environment.
• Significant experience interfacing with DBAs, designers, and developers.
• Gathered, analyzed, and normalized requirements; performed source and target system data analysis (RDBMS, hands-on SQL).
• Identified data quality issues and recommended resolutions.
• Data mapping, data extraction, transformation, and load.
• Worked closely with the business community to assess business needs and define requirements.
• Tested and modified the ETL objects to optimize load performance.
• Designed mappings from flat-file and RDBMS sources to the data warehouse in Oracle.
• Wrote functions, triggers, sequences, and stored procedures in the Oracle database.
• Developed Slowly Changing Dimensions.
• Generated DDL from Erwin and created the objects in the database.
• Created views and materialized views as required for the reports (a sketch follows this assignment).
• Unit testing and integration testing of the developed objects.
• Code migration from Development to QA and Production.
• Automated daily, weekly, and monthly workflows to run different jobs.
• Excellent production support.
• The project produced valuable reports with consolidated information for the business owners.
• End-user training and system documentation.
Environment: Informatica PowerCenter 8.6.1, Informatica Data Explorer 9.0.1, Oracle 11g, SQL
Server 2008, Windows.
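
A minimal sketch of a reporting materialized view of the kind referenced above, in Oracle SQL. The fact and dimension names are hypothetical placeholders, not the actual Client360/Sales360 schema.

-- Hypothetical materialized view that pre-aggregates sales by client and month
-- for reporting; refreshed after each load. Object names are placeholders.
CREATE MATERIALIZED VIEW mv_client_monthly_sales
BUILD IMMEDIATE
REFRESH COMPLETE ON DEMAND
AS
SELECT c.client_id,
       c.division,
       TRUNC(f.sale_dt, 'MM') AS sale_month,
       SUM(f.sale_amt)        AS total_sales,
       COUNT(*)               AS order_cnt
FROM   fact_sales f
JOIN   dim_client c ON c.client_key = f.client_key
GROUP  BY c.client_id, c.division, TRUNC(f.sale_dt, 'MM');

-- Refreshed at the end of the load workflow, e.g.:
-- EXEC DBMS_MVIEW.REFRESH('MV_CLIENT_MONTHLY_SALES', 'C');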
Data Warehouse ETL Developer,
The Hartford, Hartford
(Jul 2008 – Mar 2010)
The primary objective of this project was to provide improved levels of service delivery to customers through effective analysis of data collected from different sources. The secondary objective was to improve the management reporting and analysis process by providing a multidimensional analysis capability to help monitor the key business parameters for the Customer Service Division.
Responsibilities
• Analyzed business requirements and created functional specifications.
• Generated business models and use case analysis.
• Translated functional requirements into technical specifications.
• Identified data quality issues and recommended resolutions.
• Data mapping, data extraction, transformation, and load.
• Assisted the customer development group with extracting and analyzing large, complex data.
• Member of the core team responsible for OLAP data warehouse implementation, decision support, and data issue resolution.
• Custom development on a UNIX server using PL/SQL and UNIX (Korn) shell scripting.
• Tuned PL/SQL and SQL on very large data sets.
• Wrote shell scripts for running batches.
• Designed mappings from flat-file and RDBMS sources to the data warehouse in Oracle.
• Extracted data from source systems to a staging database running on Teradata using utilities such as MultiLoad, TPump, and FastLoad.
• Created a set of reusable transformations and mapplets to create surrogate keys and to filter data coming from various sources (a sequence-based sketch follows this assignment).
• Used unconnected lookups in various mappings.
• The project produced valuable reports with consolidated information for the business owners.
• End-user training and system documentation.
Environment: Informatica PowerCenter 8.1/8.6, Oracle 10g, TOAD, Teradata V2R6, Business
Objects, IBM AIX.
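
A minimal sketch of sequence-based surrogate key assignment, the same idea the reusable transformations above implemented, expressed here in Oracle SQL. dim_customer, stg_customer, and the sequence name are hypothetical placeholders; in the project itself this was done with a reusable Sequence Generator/Expression in Informatica rather than in SQL.

-- Hypothetical surrogate-key assignment while loading a customer dimension.
-- Natural-key rows not yet in the dimension receive a new key from a sequence.
CREATE SEQUENCE dim_customer_seq START WITH 1 INCREMENT BY 1;

INSERT INTO dim_customer (customer_key, customer_no, customer_name, source_system)
SELECT dim_customer_seq.NEXTVAL,
       s.customer_no,
       s.customer_name,
       s.source_system
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_customer d
                   WHERE  d.customer_no = s.customer_no
                   AND    d.source_system = s.source_system);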
Data Warehouse ETL Developer,
ADP, Roseland, New Jersey
(Oct 2007 – Jun 2008)
Gathered data from different sources and developed an integrated data warehouse for generation of the Health of Sales Office reports. Analyzed the data against business needs and requirements. Unit tested the developed ETL code and deployed it into different environments. Generated and evaluated reports using Crystal Reports.
Responsibilities
• Designed and customized data models for the data warehouse, supporting data from multiple sources.
• Translated business requirements into technical specifications.
• Used ETL tools to maintain, design, develop, test, implement, and document data warehouse solutions.
• Extensively worked with Informatica to load data from flat files and Oracle to the target database.
• Gathered, analyzed, and normalized requirements; performed source and target system data analysis (RDBMS, hands-on SQL); identified data quality issues and recommended resolutions.
• Data mapping, data extraction, transformation, and load.
• Involved in the overall architectural design, the strategy of ongoing design, and migration from development to QA and Production.
• Data integration analysis.
• Involved in logical and physical data modeling using Erwin.
• Wrote code to access different instances of different databases.
• Responsible for the design and architecture of the ETL component of the data warehouse.
• Worked with Informatica tools: Source Analyzer, Mapping Designer, mapplets, and transformations.
• Created mapplets, stored procedures, and functions and used them in different mappings.
• Wrote triggers for data manipulation.
• Wrote PL/SQL code and was involved in performance enhancement.
• Used shortcuts to sources and transformations in the Informatica environment.
• Used parameters at the session level to tune the performance of mappings.
• Wrote test cases for data validation.
• Documented technical specifications, user requirements, ETL design specifications, and the mapping inventory.
• Created views from which reports were generated.
• The project produced valuable reports with consolidated information for the business owners.
Environment: Informatica PowerCenter 7.1.3, Oracle 10g, Oracle SQL Developer, Erwin 4.1.4,
Crystal Reports, Windows NT.
Data Warehouse ETL Developer,
MetLife, Hartford
(Jul 2005 – Sep 2007)
The project involved developing an integrated data warehouse system to meet the demands of detailed, targeted customer data and sales analysis. The role encompassed supporting existing applications as well as developing new applications. Analyzed the developed data against business needs and requirements. Wrote test cases and involved the team in testing the final data. Involved on the operations side in the execution of Informatica jobs and Unix shell scripts at regular intervals.
Responsibilities
• Designed and customized data models for the data warehouse, supporting data from multiple sources.
• Modeled the data warehouse data marts using a Star Schema and the warehouse using relational concepts.
• Translated business requirements into technical specifications.
• Used ETL tools to maintain, design, develop, test, implement, and document data warehouse solutions.
• Extensively worked with Informatica to load data from flat files, Oracle, DB2, and MS SQL Server to the target database.
• Gathered, analyzed, and normalized requirements; performed source and target system data analysis (RDBMS, hands-on SQL); identified data quality issues and recommended resolutions.
• Data mapping, data extraction, transformation, and load.
• Strong knowledge of design and development in the Oracle 10g environment.
• Significant experience interfacing with DBAs, business customers, Business Analysts, Data Analysts, developers, and IT Operations staff.
• Data integration analysis.
• Involved in the overall architectural design, the strategy of ongoing design, and migration from development to QA and Production.
• Responsible for the design and architecture of the ETL component of the data warehouse.
• Worked with Informatica tools: Source Analyzer, Mapping Designer, mapplets, and transformations.
• Created the Informatica repository in an Oracle database. Designed mappings from flat-file and RDBMS sources to the data warehouse in Oracle.
• Created a set of reusable transformations to create surrogate keys and to filter data coming from various sources.
• Used most of the core transformations, including Source Qualifier, Aggregator, Filter, Expression, unconnected and connected Lookups, and Update Strategy.
• Managed data warehouse development, implementation, and ongoing support of automated load and validation processes.
• Improved mapping performance using SQL.
• Created mapplets and used them in different mappings.
• Optimized query performance and session performance.
• Customized data by adding calculations, summaries, and functions.
• Created database triggers for data security (a sketch of such an audit trigger follows this assignment).
• Wrote SQL and PL/SQL code and stored procedures.
• Used parameters and variables at the mapping and session levels to tune the performance of mappings.
• Designed and documented the validation rules, error handling, and test strategy of the mappings.
• End-user training and system documentation.
• Tuned PL/SQL and SQL on very large data sets.
• Documented technical specifications and user requirements.
Environment: Informatica PowerCenter 6.2/7.1.3, Oracle 10g/9i, Windows NT, Trillium, IBM AIX 4.3.
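
A minimal sketch of the kind of database trigger used for data security mentioned above, in Oracle PL/SQL. The base table, audit table, and columns are hypothetical placeholders.

-- Hypothetical audit trigger recording who changed sensitive customer data and when.
CREATE OR REPLACE TRIGGER trg_customer_audit
AFTER UPDATE OR DELETE ON dw_customer
FOR EACH ROW
DECLARE
  v_change_type VARCHAR2(10);
BEGIN
  IF UPDATING THEN
    v_change_type := 'UPDATE';
  ELSE
    v_change_type := 'DELETE';
  END IF;

  INSERT INTO dw_customer_audit (customer_key, change_type, changed_by, changed_at)
  VALUES (:OLD.customer_key, v_change_type, USER, SYSDATE);
END;
/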
Data Warehouse ETL Developer,
Travelers Life & Annuity, Hartford
(Oct 2004 - Jun 2005)
The project involved building an Operational Data Store and a Data Mart. Data was sourced from DB2 and flat files. Informatica tools were used to extract source data from flat files and populate the Data Mart on an Oracle database. Building the Data Mart involved designing the architecture of the Extract/Transform/Load environment and developing transformation standards and processes.
Responsibilities
• Implemented data warehouse processes such as data access paths, data extraction/transformation and load (ETL), logical and physical data modeling using Erwin, data mapping loads, and source-to-target reconciliation.
• Worked closely with the business community to assess business needs, define requirements, and implement solutions within the data warehouse.
• Assisted the customer development group with extracting and analyzing large, complex data.
• Member of the core team responsible for OLAP data warehouse implementation, decision support, and data issue resolution.
• Extensive experience maintaining, designing, developing, testing, implementing, and documenting data warehouse solutions.
• Used Unix shell scripts to execute different jobs and perform operations.
• Custom development on a Unix server using PL/SQL, Unix Korn shell scripting, and SQL*Loader; tested and modified the ETL procedures to optimize load performance.
• Implemented performance tuning concepts in Informatica.
• Wrote shell scripts for running batches.
• Extensive experience interfacing with DBAs, business customers, and developers.
• Used the Repository Manager to grant permissions to users and to create new users and repositories.
• Extracted COBOL data using the Normalizer transformation where DB2 was the source.
• Involved in data cleansing, especially converting data formats.
• Strong knowledge of design and development in the Oracle 9i environment.
• Modified the existing batch process, shell scripts, and PL/SQL procedures for effective logging of error messages into the log table (a sketch of this logging pattern follows this assignment).
• Worked with the Oracle 9i environment using multiple instances, parallelism, and partitioning.
• Extracted data from different systems into the repository using Informatica PowerConnect.
• Used data warehouse techniques such as Star schema and Snowflake schema.
• Incorporated a traditional ER model with the Star schema to resolve the extra information needed.
• Worked on flat files coming from mainframe sources, using the Normalizer to load the data into the Oracle database.
• Created different source definitions to extract data from flat files and relational tables for PowerMart.
• Created ETL transformations applying the key business rules and functionality to the source data.
• Created sessions and batches using the Informatica PowerCenter Server.
• Wrote PL/SQL procedures for processing business logic in the database; tuned SQL queries for better performance.
• Interacted with end users in requirements gathering and in designing technical and functional specifications.
• Database extraction and filtering using Unix shell scripts (awk, sed); shell scripts were run through UNIX cron, scheduling batch sessions.
Environment: Windows NT, Oracle 9i, DB2, Informatica 6.2 PowerCenter/ PowerMart,
UNIX (Sun Solaris).
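
A minimal sketch of the error-logging pattern referenced above, in Oracle PL/SQL. The log table, procedure name, and job/step names are hypothetical placeholders; an autonomous transaction is used so the log entry survives a rollback of the failed load step.

-- Hypothetical error-logging procedure called from load procedures and batch scripts.
CREATE OR REPLACE PROCEDURE log_etl_error (
    p_job_name  IN VARCHAR2,
    p_step_name IN VARCHAR2,
    p_error_msg IN VARCHAR2
) AS
  PRAGMA AUTONOMOUS_TRANSACTION;  -- keep the log row even if the load rolls back
BEGIN
  INSERT INTO etl_error_log (job_name, step_name, error_msg, logged_at)
  VALUES (p_job_name, p_step_name, SUBSTR(p_error_msg, 1, 4000), SYSDATE);
  COMMIT;
END log_etl_error;
/

-- Typical usage inside a load procedure (placeholder names):
-- EXCEPTION
--   WHEN OTHERS THEN
--     log_etl_error('DAILY_POLICY_LOAD', 'STAGE_TO_MART', SQLERRM);
--     RAISE;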
EDUCATION
• Bachelor of Technology in Computer Science, Jawaharlal Nehru Technological University. GPA 3.8.
• Master's Degree in Computer Science, Oklahoma City University. GPA 3.8.