Sree
Email: [email protected]
Phno: 512-203-6784
Overview

 Dedicated IT professional with seven years of expertise in software development, involved in all phases of the Software Development Life Cycle with a major focus on Data Warehousing, Database Applications and Business Intelligence.
 Extensively worked with Data Marts; used Star and Snowflake schemas in relational and multidimensional modeling.
 Excellent working knowledge of ETL concepts and of distributed and centralized ETL architectures.
 Excellent technical knowledge of the Ab Initio software development environment and its components.
 Experienced in transforming and loading data into warehouse tables using Ab Initio GDE, and in automating the ETL process through scheduling and exception-handling routines.
 Experience working with very large databases (2+ terabytes) and large transaction volumes.
 Experienced in developing heavily parallel, CPU-bound ETL jobs using Ab Initio in Very Large Database (VLDB) environments.
 Over four years of experience using Oracle 8i/9i, SQL and PL/SQL.
 Extensive knowledge of UNIX shell scripting (Korn shell/Bourne shell).
 Experience in sed (stream editor) and awk scripting.
 Hands-on experience with the BI reporting tool MicroStrategy 7.5.2.
 Experienced in applying data-preprocessing techniques to very large data sets using SPSS, and in implementing data-mining algorithms (ID3, association rules, clustering) on datasets using SAS Enterprise Miner and Shih data miner.
 Experience using Autosys for job scheduling.
 Excellent analytical, interpersonal, project-management and organizational skills.
 Motivated and enthusiastic, with a solid work ethic and a cooperative personality.
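The sed/awk experience above is the kind used in ETL file preparation. A minimal illustrative sketch (the file names and pipe-delimited column layout are hypothetical, not from any project listed here):

```shell
# Strip carriage returns and trailing blanks before loading (sketch only):
# sed -e 's/\r$//' -e 's/[[:space:]]*$//' input.dat > clean.dat

# Sum a pipe-delimited amount column (field 3) with awk:
printf '1|A|10.5\n2|B|4.5\n' | awk -F'|' '{ total += $3 } END { print total }'
```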
Technical Experience:
ETL Tools: Ab Initio (GDE 1.13.17/1.13.3/1.12.7, Co>Op 2.13.17/2.13.3/2.12.7), Informatica
Business Intelligence: MicroStrategy 6.x/7.0
Operating Systems: UNIX (Sun Solaris), Red Hat Linux, Windows 95/NT/2000/XP
RDBMS: Oracle 9i/8i/8.0/7.x, DB2, MySQL, SQL Server 6.5/7.0, MS Access 2000
Wells Fargo, San Francisco - CA
Jan 05 – Present
AB Initio Consultant
Wells Fargo launched a Stored Value program to provide card members of participating co-brand
credit card partners with a value-loaded payment instrument through the transfer of co-brand
funded or earned value. The Rewards Redemption card is one type of stored value product that
allows the transfer of card members' earned rewards value from their co-brand credit card
rewards bucket onto a stored value card. The purpose of launching this stored value card is to
offer a service that gives partner card members a higher perceived value of their credit card
rewards.
Responsibilities:
 Developed a good understanding of end-user requirements and was involved in developing
functional and technical specifications for those requirements.
 Developed source watchers that look for incoming flat files (deltas) from other servers and,
once the required flat file is found, create indicators that signal availability of the file
to downstream processes.
 Developed a UNIX Korn shell script that validates the header and footer of the available flat
file before the file is processed.
 Used Filter-by-Expression to identify co-brand credit card records based on operational
organization code.
 Incorporated data parallelism into graphs using Partition by Key and Partition by
Round-robin; used Partition by Round-robin to avoid skew.
 Used the Join component to identify inserts and updates.
 Parameterized the table names, source system names, etc., and used these parameters to
generate separate .dat, .dml and .xfr files for each table/source-system combination, making
the graphs generic enough to run for any combination without changes.
 Phased the complex graphs to avoid table-lock situations while loading and updating the
tables.
 Performance-tuned Ab Initio graphs using various Ab Initio techniques and best practices,
such as using lookups instead of joins and in-memory sorts wherever possible.
 Involved in monitoring Ab Initio jobs using AB_REPORT options.
 Identified job dependencies and scheduled the jobs accordingly through Tivoli.
 Developed several UNIX wrapper scripts to run Ab Initio graphs.
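The header/footer validation step described above can be sketched in Korn shell. This is a minimal illustration, assuming a hypothetical feed whose first record begins with "HDR" and whose last record begins with "TRL" followed by the detail record count; the tags and function name are not from the actual project:

```shell
#!/bin/ksh
# Sketch: validate a flat file's header, footer, and trailer record count
# before processing. Feed layout (HDR/TRL tags) is hypothetical.
validate_file() {
  file=$1
  header=$(head -1 "$file")
  footer=$(tail -1 "$file")
  case $header in HDR*) ;; *) echo "bad header"; return 1 ;; esac
  case $footer in TRL*) ;; *) echo "bad footer"; return 1 ;; esac
  # Compare the trailer count with the actual number of detail records
  # (total lines minus the header and trailer lines).
  expected=${footer#TRL}
  actual=$(( $(wc -l < "$file") - 2 ))
  [ "$expected" -eq "$actual" ] || { echo "count mismatch"; return 1; }
  echo "ok"
}
```

A downstream indicator file would only be created when `validate_file` returns success.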
Environment: Ab Initio GDE 1.13.17/Co>Op 2.13.17, HP-UX 11i, SQL Server, Oracle 9.2.2,
Tivoli, TestDirector, Harvest, Toad, Windows NT/2000
Providian Financial Corporation - CA
Dec 03 – Dec 04
Ab-Initio Developer
Providian Financial Corporation is a leading provider of lending and deposit products to
customers nationwide. Providian Financial is one of the ten largest issuers of credit cards and a
leader in developing innovative financial products. It provides a wide array of credit-related
services like Instant Credit Card Approval System, 24-hour online account access, Balance
Transfers, Credit Card Activation, Online Customer Servicing, Instant Rewards, Associate
Programs, Personal Shopper for convenient online shopping and more.
Responsibilities:
 Developed Ab Initio graphs for various ETL functions, such as migrating data from
external sources like MVS and Windows NT servers using FTP components, and loading and
storing the data on the UNIX ETL server.
 Responsible for deploying Ab Initio graphs and running them through the Co>Operating
System's mp shell command language.
 Developed various Ab Initio graphs using components like Partition by Key,
Partition by Round-robin, Join, Rollup, Sort, Filter by Expression, Gather, Merge,
Join with DB, etc.
 Involved in implementing techniques for improving recoverability and limiting resource
usage.
 Involved in Ab Initio graph design and performance tuning of the load graph process.
 Replicated operational tables into staging tables, transformed and loaded data into
warehouse tables using Ab Initio GDE, and was responsible for automating the ETL process
through scheduling.
 Used Ab Initio components like Sort, Partition, Rollup and Merge, and created XFRs.
 Used partition components (Partition by Key, by Expression, by Round-robin) to split
large data files into multiple smaller data files.
 Used database components (Input Table, Output Table, etc.) to load/unload data
to/from flat files.
Environment: Ab Initio GDE 1.13.3/Co>Op 2.13.3, UNIX, PL/SQL, Perl, Oracle 8i/9i, SQL
Server, Windows NT/2000.
National City Bank - OH
Feb 03 – Nov 03
Ab Initio Consultant
National City Bank is one of the leading banks in the US. The firm is a leader in investment
banking, financial services for consumers and businesses, financial transaction processing,
asset and wealth management, and private equity. I was an Ab Initio Consultant on a Next
Generation Database enhancement project to expand the depth and breadth of information
available about National City customers across channels, as well as to make that information
more easily accessible to all of the database's users and stakeholders.
Responsibilities:
 Analyzed user requirements and established business rules in order to build the enterprise
data warehouse
 Created a 4-way Multi File System (MFS) to run graphs in parallel
 Designed parallel, partitioned Ab Initio graphs using GDE components for a high-volume
data warehouse
 Developed various Ab Initio graphs using components like Partition by Key,
Partition by Round-robin, Join, Rollup, Sort, Filter by Expression, Gather, Merge,
Join with DB, etc.
 Responsible for data scrubbing (data cleansing)
 Reduced the amount of data moving through flows, which had a tremendous impact on
graph performance
 Used file-management utilities like m_mkfs, m_rmfs, m_touch, m_ls and m_env
 Designed data marts and the data warehouse using Star and Snowflake schemas in
implementing decision-support systems
 Involved in implementing techniques for improving recoverability and limiting resource
usage
 Referenced files with parameters instead of hard-coded paths
 Analyzed the source and target record formats and made the necessary changes
 Involved in Ab Initio graph design and performance tuning of the load graph process
 Stress-tested ETL routines to make sure they don't break under heavy loads
 Implemented the parallel application by replicating the components (datasets and
processing modules) into a number of partitions
 Identified and debugged errors before deploying
Environment: Ab Initio GDE 1.12.7/Co>Op 2.12.7, TOAD, Perl, UNIX shell scripts, SQL,
PL/SQL, Oracle 9.0.1, DB2, Erwin 4.0
Franklin Templeton Investments, CA
Feb 02 – Jan 03
Data Warehousing Analyst
Franklin Templeton Investments, a Fortune-ranked mutual fund company, is a world giant in
mutual funds and investments. The Information Management Group (IMG) owns all database
servers and Informatica servers for the enterprise. In addition, IMG plays a key role in
analyzing, designing, developing and deploying various Oracle and ETL applications based on
business users' interests, and in designing, developing, deploying and maintaining data
warehouse applications.
Responsibilities:
 Preparing application design documents based on business specifications.
 Involved in Data Warehouse/Data Mart Dimensional Data modeling with Dimensions & Fact
Tables using Star Schema.
 Designed the ETL processes using Informatica to load data from Oracle, Flat Files (Fixed
Width), COBOL files and Excel files to staging database and from staging to the target Oracle
Data Warehouse database.
 Performed data cleansing and data manipulations using various Informatica transformations.
Worked on Informatica PowerCenter tools: Source Analyzer, Warehouse Designer, Mapping
Designer, Workflow Manager and Repository Manager administration.
 Created complex mappings and mapplets using various transformations like Joiner,
Expression, Lookup, Stored Procedure, Aggregate, Filter, Update Strategy and Sequence
Generator etc.
 Involved in resolving performance bottlenecks for performance tuning of load process at
various stages such as targets, sources, mappings, sessions, and systems.
 Created indexes on reporting tables and tuned SQL queries for faster report run times.
Wrote PL/SQL stored procedures to access data from Oracle.
 Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and
validations based on design specifications for unit testing, system testing, expected results,
preparing test data and loading for testing, error handling and analysis.
Environment: Oracle 8i, Informatica (Power Center 5.1), Business Objects, Erwin 3.5, PL/SQL,
Unix, Windows 2000.
Slipco Constructions (P) Ltd – India
Oracle Developer
Jan 00 – Dec 01
Slipco Constructions (P) Ltd is a pioneer in slipform engineering, with over two decades of
experience and expertise in the field of slipform construction. In spite of stiff global
competition, SCPL entered the global market in 1996–1997 and executed four cement silos in
Bhutan, Sri Lanka and Malaysia. I was involved in a project whose objective was to develop a
system to keep track of the construction company, its contracts, resources, employee service,
personal and salary details, contract activities, list of vendors, machine data and branch
revenues.
Responsibilities:
 Analyzed the functional and technical specifications
 Designed and developed master tables such as Chart of Accounts, Vendors, Customers, etc.
 Developed the Oracle database through effective use of advanced Oracle SQL technologies
such as SQL*Plus, Procedure Builder and Oracle Forms/Reports
 Generated forms, reports and graphical reports using the Oracle Report Designer and Oracle
Graphics Designer tools
 Created backups and restored and recovered databases
 Created and maintained triggers, packages, functions and procedures
 Calculated and monitored size and space for tables, clusters and indexes
 Tuned SQL statements for better performance
 Granted and revoked privileges
 Performed loading of data into the Oracle tables using SQL*Loader scripts
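The SQL*Loader step above typically pairs a control file with an `sqlldr` invocation from a wrapper script. A hedged sketch, with a hypothetical table and column layout (not the project's actual schema):

```shell
# Sketch: generate a SQL*Loader control file for a comma-delimited feed.
# Table, file and column names below are illustrative only.
cat > vendors.ctl <<'EOF'
LOAD DATA
INFILE 'vendors.dat'
APPEND INTO TABLE vendors
FIELDS TERMINATED BY ','
(vendor_id, vendor_name, city)
EOF

# The load itself would then be run against the database, e.g.:
# sqlldr userid=user/password control=vendors.ctl log=vendors.log
```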
Environment: Oracle 8.1.2, Windows 2000, MS Access, Excel, Reports 2.5, Developer/2000
Vysya Bank – India
Oct 98 – Nov 99
Oracle Developer / QA Analyst
Vysya Bank is one of the leading banks in India, with strong and healthy fundamentals. The
firm is a leader in investment banking, financial services for consumers and businesses, and
financial transaction processing. I was involved in system testing of 'Account Access', an
online banking application developed in Java, HTML and Oracle that enables bank customers to
access their accounts through the Web. The application provides an 'Account Summary'
(balances, last deposits, etc.) for various accounts and the ability to transfer funds
between accounts using 'Fund Transfers'.
Responsibilities:
 Created, monitored and maintained Oracle databases from the specs provided
 Created database objects including tables, indexes, clusters, sequences, roles, and
privileges
 Analyzed the Business Requirements Document (BRD), created the test plan and
prepared detailed test cases for the application under test (AUT)
 Performed Black Box Testing to check the functionality and user requirements
 Executed the test cases manually using Test Director
 Monitored the scenario execution and identified the performance bottlenecks
 Prepared Bug reports
 Automated test scripts for functionality and regression testing using WinRunner
 Tested the application for Broken Links to ensure proper navigation
 Conducted cross-browser testing with IE and Netscape
 Performed testing of GUI consistency against different browsers (running on Windows)
at various screen resolutions
 Developed test procedures and used various check points
 Documented the Daily Defects Status with the help of QA Metrics
Environment: Oracle 8i, manual testing, WinRunner, TestDirector, Internet Explorer 5.0,
Netscape
Education

Bachelor of Science in Electrical and Electronics Engineering