Bhupesh Hanumanula
(848) 213-2342
[email protected]
Summary:
 8+ years of professional experience in Information Technology, including more than three years in Big Data technologies. Hands-on experience building industry-specific Java/J2EE applications and implementing Big Data technologies such as Apache Hadoop.
 Experience in Hadoop Development/Administration built on years of experience in Java
Application Development.
 Experience in the design and development of MapReduce programs using Apache Hadoop for analyzing big data according to requirements.
 Proficient in object-oriented programming (OOP), with knowledge of Hadoop components such as Hive, Pig, and Sqoop.
 Experience in Core Java development (collections, data structures, and string manipulation).
 Proficiency in Big Data technologies such as HDFS, MapReduce, Hive, Pig, Oozie, Kafka, and Spark.
 Good knowledge of data structures and algorithms.
 Good knowledge of Hadoop cluster architecture and cluster monitoring.
 Experience working with NoSQL databases, including HBase, with data access through Hive.
 Good experience with MapReduce and MapReduce design patterns.
 Extensive experience in MVC (Model-View-Controller) architecture and in the design and development of multi-tier enterprise applications for the J2EE platform using Java, Struts, JDBC, tag libraries, Hibernate, Spring, and XML.
 Strong front-end UI development skills using JSP, HTML, JavaScript, jQuery, and CSS.
 Experience with Hadoop Big Data Installation and development.
 Scheduled and monitored job workflows, identified failures with Oozie, and integrated jobs with ZooKeeper.
 Developed web services and inter-process communication applications using a Java SOAP framework, JSON, and REST APIs.
 Experience in handling and writing XML/XSL files with JAXP (SAX, DOM, StAX); see the parsing sketch after this list.
 A team player with strong programming and analytical skills.
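As an illustration of the JAXP (SAX) experience above, here is a minimal parsing sketch. The element name ("record") and the input file name are hypothetical, chosen only for the example.

```java
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

// Minimal JAXP SAX example: stream an XML file and count "record" elements.
// The element name and the file name "records.xml" are hypothetical.
public class RecordCounter extends DefaultHandler {
    private int count = 0;

    @Override
    public void startElement(String uri, String localName,
                             String qName, Attributes attrs) {
        if ("record".equals(qName)) {
            count++;
        }
    }

    public static void main(String[] args) throws Exception {
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        RecordCounter handler = new RecordCounter();
        parser.parse("records.xml", handler);
        System.out.println("records: " + handler.count);
    }
}
```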
Technical Skills:
Big Data Technologies: HDFS, MapReduce, Hive, Pig, Tez, HBase, Sqoop, Cloudera CDH3/CDH4/CDH5, Hadoop Streaming, ZooKeeper, Oozie, Flume, HUE, Impala, and Spark.
NoSQL: HBase, Cassandra, MongoDB.
Programming Languages: Java, Pig, HQL, Spring MVC, Hibernate.
Scripting Languages: Shell, Pig Latin.
Cloud Computing: OpenStack, Amazon AWS.
Web Technologies: HTML, JavaScript, XML, Servlets, JSP.
Database Platforms: MySQL, Oracle 11g/10g/9i, SQL Server 2012/2008.
Operating Systems: Windows, Red Hat, Ubuntu, Mac OS X.
IDEs: Eclipse, NetBeans.
Servers: Apache Tomcat, WebLogic.
Software Applications: JUnit, TOAD SQL Client, MySQL Workbench, WinSCP, PuTTY, MS Office.
Professional Experience:
Nielsen, New York City, NY
Apr 2016 - Present
Sr. Hadoop Developer
Project Title: Harmonization
Description: We analyze users' digital data collected from the internet. When a user watches a video or browses an application, the user's data is analyzed, and recommendations with supporting statistics are provided to organizations.
Responsibilities:
 Designed and developed Hadoop MapReduce jobs and Oozie workflows.
 Pulled log files into HDFS using Flume.
 Analyzed clickstream data from web server logs collected by Flume.
 Handled importing of data from various databases using Sqoop.
 Transformed data according to business requirements using Hive (HQL), MapReduce, and Pig, and loaded the results into HDFS.
 Involved in different phases of the project, from requirements through development and deployment.
 Worked with Java/Scala and XML developers and systems engineers.
 Designed the user interface for users to interact with the system using jQuery, JavaScript, HTML, CSS, JSON, JSP, JSP tag libraries, and Spring tag libraries.
 Introduced effective ways of communicating and streamlined the use of JIRA and Confluence to maximize productivity and close Dev-QA communication gaps.
 Created a data pipeline of MapReduce programs using chained mappers.
 Implemented complex MapReduce programs that perform map-side joins using the distributed cache in Java; see the sketch after this list.
 Involved in creating Design documents.
 Implemented partitioning and bucketing in Hive for more efficient data access.
 Involved in analyzing and improving the performance of MapReduce jobs and Hive queries.
 Involved in the complete lifecycle of various Hadoop implementation tasks, gaining practical experience in writing MapReduce programs.
 Developed Spark applications using Scala to ease future Hadoop transitions.
 Used Kafka for processing logs on the cluster as a proof of concept.
 Involved in writing persistence objects using Hibernate.
 Created RESTful web services to expose the data to transactional systems.
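A minimal sketch of the map-side join mentioned above, assuming the org.apache.hadoop.mapreduce API and a small lookup file shipped with job.addCacheFile(...). The tab-separated layout (userId, region) and the field positions are hypothetical.

```java
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.net.URI;
import java.util.HashMap;
import java.util.Map;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Map-side join: a small lookup table from the distributed cache is loaded
// into memory in setup(), so map() can join records without a reduce phase.
public class MapSideJoinMapper
        extends Mapper<LongWritable, Text, Text, Text> {

    private final Map<String, String> regionByUser = new HashMap<>();

    @Override
    protected void setup(Context context) throws IOException {
        // Files added via job.addCacheFile(...) are localized (and normally
        // symlinked by base name) into the task's working directory.
        URI[] cacheFiles = context.getCacheFiles();
        String localName = new File(cacheFiles[0].getPath()).getName();
        try (BufferedReader reader = new BufferedReader(new FileReader(localName))) {
            String line;
            while ((line = reader.readLine()) != null) {
                String[] parts = line.split("\t");   // userId \t region
                regionByUser.put(parts[0], parts[1]);
            }
        }
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String[] fields = value.toString().split("\t"); // userId \t event
        String region = regionByUser.getOrDefault(fields[0], "UNKNOWN");
        context.write(new Text(fields[0]), new Text(region + "\t" + fields[1]));
    }
}
```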
Environment: Hadoop, Cloudera Distribution, Sqoop, Flume, Java, Kafka, MapReduce, Oozie, HDFS, Hive, Pig, HBase, MySQL, SQL, Eclipse, Spark, and Scala.
GFK, Manhattan, NY
Jan 2015 - Mar 2016
Hadoop Developer
Project Title: GFK ATLAS
Description: The GFK Group is an international market research organization providing services in three sectors: Custom Research, Retail and Technology, and Media. The company's headquarters are in Nuremberg, Germany. GFK offers information and consulting services to companies in the consumer goods and pharmaceutical industries, retail, media, and the service sector, covering product policy, logistics and sales, and marketing and advertising.
Responsibilities:
 Planned, installed, and configured distributed Hadoop clusters.
 Used Sqoop to ingest data from MySQL and other sources into HDFS on a regular basis.
 Configured Hadoop tools such as Hive, Pig, HBase, ZooKeeper, Flume, Impala, and Sqoop.
 Built a relational view of the data using HCatalog.
 Ingested data into HBase tables from MySQL, Pig, and Hive using Sqoop.
 Wrote batch operations across multiple rows for DDL (Data Definition Language) and DML (Data Manipulation Language) using client API calls for improved performance.
 Integrated MapReduce with HBase, with HBase serving as both data sink and data source.
 Grouped and filtered data using Hive (HQL) queries and Pig Latin scripts.
 Worked with provisioning and deployment tools such as Puppet.
 Automated the cloud deployments using AWS Cloud Formation Templates.
 Queried both managed and external Hive tables using Impala.
 Implemented partitioning and bucketing in Hive for more efficient querying of data.
 Created workflows in Oozie along with managing/coordinating the jobs and combining multiple
jobs sequentially into one unit of work.
 Set up lifecycle policies to back up data from AWS S3 to AWS Glacier.
 Worked with the Amazon IAM console to create custom users and groups.
 Worked with various AWS EC2 and S3 CLI tools.
 Managed server configuration using Puppet.
 Designed and created both managed and external Hive tables, depending on the requirement.
 Wrote custom UDFs in Hive; see the sketch after this list.
 Used AWS to reduce costs for the department and eliminate unwarranted resources.
 Automated Linux package installation and Linux service administration using Puppet, as well as product installation and configuration.
 Used Pig's SVN repository of user-contributed functions.
 Integrated Hive tables with visualization tools like Tableau and Microsoft Excel.
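A minimal sketch of a custom Hive UDF of the kind mentioned above, using the classic org.apache.hadoop.hive.ql.exec.UDF API of the CDH4 era. The function name, jar name, and the trim/upper-case logic are hypothetical.

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Normalizes a string column: trims whitespace and upper-cases it.
// Register in Hive with (jar and function names are hypothetical):
//   ADD JAR my-udfs.jar;
//   CREATE TEMPORARY FUNCTION clean_upper AS 'CleanUpperUdf';
public final class CleanUpperUdf extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null;        // pass NULLs through unchanged
        }
        return new Text(input.toString().trim().toUpperCase());
    }
}
```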
Environment: Cloudera Distribution CDH4, Puppet, Hadoop/YARN, Linux, Hive, AWS, Pig, Impala, Sqoop, ZooKeeper, Spark (Scala), and Python.
Overstock, Midvale, UT
Dec 2013 - Dec 2014
Java Developer
Project Title: Media Approval Application (MAA)
Description: The MAA application automates the uploading and approval of images to be loaded onto the product pages of Overstock.com for the Photo Edit team. The application not only approves the images but also validates and squares them.
Responsibilities:
 Created the front-end UI for end users and for testing.
 Performed back-end development using Core Java and J2EE components.
 Translated the ICD documents before writing code.
 Used the Struts MVC framework for application design.
 Designed and developed the web UI using JSP, Spring Web Flow, Apache Tiles, CSS style sheets, HTML, and JavaScript.
 Applied various design patterns such as MVC, Data Transfer Object (DTO), Business Delegate, Service Locator, Session Facade, and Data Access Object (DAO), along with JPA.
 Enhanced the domain/hosting order creation, billing and invoicing, and online payment systems using ActiveMQ; designed a DAO factory on Hibernate with the Spring ORM framework and JDBC batch processing.
 Used XML, XPath, and a SAX parser in online payment processing and in completing resellers' orders.
 Coded the application using Jakarta Struts and the Spring web-tier framework (MVC).
 Developed JSPs, Action Forms, Action classes, controllers, and reusable custom tag components.
 Designed and developed Hibernate components using a DAO factory and defined complex mappings for persistent classes using many-to-many, many-to-one, and one-to-many relations.
 Developed an external interface to send and receive messages; configured JMS Queues/Topics and Message-Driven Beans as JMS message consumers; see the sketch after this list.
 Involved in the development of web services using CXF, Spring, Hibernate, and WSO2 ESB.
 Established the development environment for the entire team using Jenkins, JIRA, and SVN.
 Created JUnit tests for local development.
 Developed Java application components and Java APIs to interact with Oracle BPM Workflow processes.
 Developed numerous UI (user interface) screens using Velocity, HTML5, CSS, JavaScript, jQuery, jqGrid, and JSON.
 Used the Spring Framework for dependency injection, alongside J2EE and Hibernate.
 Designed and developed the DAL Java API layer using MyBatis, Spring, and Oracle PL/SQL procedures.
 Extensively used Hibernate in the data access layer to access and update information in the database.
 Contributed to component designs within the architecture.
 Set up Maven scripts to compile projects.
 Responsible for building .war files in Jenkins for deployment on the test servers.
 Created the branch and tag folders in SVN for the team during development, testing, and deployment.
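A minimal sketch of a Message-Driven Bean consuming from a JMS queue, as described above. The queue name "jms/OrderQueue" and the payload handling are hypothetical, and activation-config details vary by container.

```java
import javax.ejb.ActivationConfigProperty;
import javax.ejb.MessageDriven;
import javax.jms.JMSException;
import javax.jms.Message;
import javax.jms.MessageListener;
import javax.jms.TextMessage;

// The container delivers each queued message to onMessage().
@MessageDriven(activationConfig = {
    @ActivationConfigProperty(propertyName = "destinationType",
                              propertyValue = "javax.jms.Queue"),
    @ActivationConfigProperty(propertyName = "destinationLookup",
                              propertyValue = "jms/OrderQueue")   // hypothetical
})
public class OrderMessageBean implements MessageListener {

    @Override
    public void onMessage(Message message) {
        try {
            if (message instanceof TextMessage) {
                String body = ((TextMessage) message).getText();
                // Hand the payload to the business layer here.
                System.out.println("Received order message: " + body);
            }
        } catch (JMSException e) {
            // Rethrowing triggers redelivery under container-managed transactions.
            throw new RuntimeException("Failed to read JMS message", e);
        }
    }
}
```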
Environment: JDBC, SQL, Maven, SVN, Oracle 11g, Web Services, Spring Framework, Hibernate, Java 6.0, JUnit, TortoiseSVN, Windows 7, JSPs, HTML, XSD, XML, JIRA, and Jenkins (continuous integration).
Fulcrum Global Technologies, Chicago, IL
Mar 2012 - Nov 2013
Java Developer
Fulcrum Global Technologies is a hybrid management and IT consulting firm focused on global implementations and deployments, upgrades, and project rescues of enterprise solutions. As part of building a dashboard for social metrics, a proof of concept was developed: data was sourced from a variety of unstructured sources, such as comments on the website, Twitter, and other review sites, and was loaded into HDFS using MapReduce. Relational tables were built in HBase for use in the reporting system (Tableau) for further analysis, and multiple dashboards were created for executives and for the action team.
Responsibilities:
 Implemented JUnit test cases for all developed modules to ensure complete code coverage.
 Used JSP, JavaScript, jQuery, AJAX, Struts, CSS, and HTML as data- and presentation-layer technologies.
 Followed the Agile Scrum software development methodology.
 Developed JSP and HTML pages using CSS as part of the presentation layer.
 Developed business-layer components using Spring, Struts, JDBC, and Hibernate, and the GUI using jQuery, AJAX, JavaScript, JSON, XML, XSLT, and XHTML.
 Designed and developed solutions that are highly reliable, scalable, and aligned with business-defined services.
 Conducted SQL performance analysis on Oracle database tables and improved performance through SQL tuning.
 Technically involved in the analysis, design, and development of various server-side components, such as DAOs for the persistence layer and Action classes and JSP/Servlets for the user-interface layer.
 Developed and consumed web services to read the data from other systems using SOAP and
REST.
 Created applications and connection pools, and deployed JSPs, Servlets, and EJBs in WebSphere.
 Participated in the entire System Development Life Cycle of the project.
 Generated SQL queries to test the data from Database.
 Developed, reviewed, and verified test cases and test procedures for integration, system, regression, performance, and user acceptance testing.
 Developed web applications using Spring MVC, SOA, jQuery, and HTML.
 Integrated Spring with Hibernate and delegated persistence operations using HibernateTemplate; see the sketch after this list.
 Worked extensively with SoapUI to mock back-end web service calls.
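A minimal sketch of delegating persistence operations to Spring's HibernateTemplate, as mentioned above. The Customer entity, its mapping, and the queries are hypothetical.

```java
import java.util.List;

import org.springframework.orm.hibernate3.HibernateTemplate;

// Hypothetical persistent entity (Hibernate mapping file not shown).
class Customer {
    Long id;
    String city;
}

// DAO that delegates all persistence calls to an injected HibernateTemplate.
public class CustomerDao {

    private HibernateTemplate hibernateTemplate;

    // Wired via Spring setter injection.
    public void setHibernateTemplate(HibernateTemplate hibernateTemplate) {
        this.hibernateTemplate = hibernateTemplate;
    }

    public void save(Customer customer) {
        hibernateTemplate.save(customer);
    }

    public Customer findById(Long id) {
        return hibernateTemplate.get(Customer.class, id);
    }

    @SuppressWarnings("unchecked")
    public List<Customer> findByCity(String city) {
        return (List<Customer>) hibernateTemplate.find(
                "from Customer c where c.city = ?", city);
    }
}
```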
Environment: Java, Spring, JDBC, Hibernate, Oracle, PL/SQL, SQL, Struts, JUnit, Servlets, jQuery, Web Services, SOAP and REST, AJAX, CSS, XML, JavaScript, JSON, XSLT, XHTML, Maven, TOAD, Solaris, and Windows XP/2003.
Value Labs, Hyderabad, India
Jan 2011 - Feb 2012
Java Developer
Value Labs is an India-based global IT services and consulting company that provides custom information technology and business consulting services. Its corporate office is located in Hyderabad, India.
Responsibilities:
 Involved in developing processes, using Agile methodology, for the collection of consumption, demand, time-of-use, and interval (load profile) data, and for interfaces with billing systems and the Oracle Lodestar MDM system.
 Designed and developed web components (JSPs and Servlets); the deployment descriptor is used to initialize resources such as Servlets and tag libraries.
 Used Struts custom tags, such as HTML, Logic, and Bean tags, depending on the requirement.
 Used the Struts Validation framework to validate data captured in UI forms.
 Used XSLT to translate XML files into the HTML that contains promotions; see the sketch after this list.
 Involved in the development and deployment of Session Beans and Entity Beans; the inventory information is stored in an Oracle database.
 Used Message-Driven Beans in collaboration with JMS for inventory management.
 Involved in customer liaison, requirements analysis, functional and technical design, development, maintenance, and support of the AMI solution.
 Played a key role in the critical success of a system integration project for PECO, a US Northeastern utility.
 Used ANT scripts to build WAR and EAR files and deployed them on WebLogic.
 Participated in data cleansing, data integrity, and data quality implementation activities for meter data received from 1.5 million PECO users.
 Conducted training programs for the field force and customer service teams.
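A minimal sketch of the XSLT translation described above, using the standard javax.xml.transform (JAXP/TrAX) API. The stylesheet and file names are hypothetical.

```java
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

// Applies promotions.xsl to promotions.xml and writes promotions.html.
// All three file names are hypothetical.
public class PromotionsTransformer {
    public static void main(String[] args) throws Exception {
        TransformerFactory factory = TransformerFactory.newInstance();
        Transformer transformer =
                factory.newTransformer(new StreamSource("promotions.xsl"));
        transformer.transform(new StreamSource("promotions.xml"),
                              new StreamResult("promotions.html"));
    }
}
```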
Environment: Java, J2EE, Web Services, JSP, Servlets, Java Beans, JSTL, Struts 1.2, HTML, XML, JMS,
Apache ANT, Apache Axis Web Services, Oracle 9i.
Tech Mahindra, Hyderabad, India
Jun 2008 - Dec 2010
Java Developer
Tech Mahindra Limited is an Indian multinational provider of information technology (IT), networking
technology solutions and Business Process Outsourcing (BPO) to the telecommunications industry.
Anand Mahindra is the founder of Tech Mahindra, which is headquartered in Pune, India.
Responsibilities:
 Analyzed all the test cases based on the requirements gathered and documented for unit testing
as well as for integration testing.
 Designed the user interface required for the portal, with all the components for plan selection.
 Designed RESTful web services to populate the details of the individual plans available for customers to pick.
 Programmed functionality for all user-interface components interacting with the MySQL Server database.
 Developed various controller classes and business logic using Spring libraries, which interact with the middle tier to perform business operations.
 Responsible for developing custom tools per client needs.
 Tested the application by programming JUnit test cases for both unit and integration testing, and performed bug tracking for the entire application; see the sketch after this list.
 Performed bug tracking using the JIRA tool.
 Worked toward weekly targets to deliver solutions for the product.
 Implemented JUnit test cases to ensure code quality.
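A minimal JUnit 4 sketch of the kind of unit test described above. The PlanCalculator class under test and its monthlyCost method are hypothetical.

```java
import static org.junit.Assert.assertEquals;

import org.junit.Before;
import org.junit.Test;

// Hypothetical class under test.
class PlanCalculator {
    double monthlyCost(double base, double discount) {
        return base * (1.0 - discount);
    }
}

public class PlanCalculatorTest {

    private PlanCalculator calculator;

    @Before
    public void setUp() {
        calculator = new PlanCalculator();
    }

    @Test
    public void monthlyCostAppliesDiscount() {
        // A 100.0 base plan with a 10% discount should cost 90.0.
        assertEquals(90.0, calculator.monthlyCost(100.0, 0.10), 0.0001);
    }
}
```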
Environment: Core Java, JSP, MySQL, SOAP, JUnit, Eclipse, HTML, JavaScript, XML.
Education:
Bachelor's in Computer Science, Visveswaraiah Technological University, India.
Certifications:
Cloudera Certified Developer for Apache Hadoop.