Professional Summary



I'm a Data Engineering/Business Intelligence professional with 9 years of experience in Data Engineering, Business Intelligence and Data Science solutions. Currently working as a BI Engineer at Amazon in the Retail business. Previously worked as a Senior Consultant (BIE) at Deloitte in Analytics & Cognitive (ACN), SDE at SAS R&D India, Data Engineer at Citigroup and Associate Consultant (DE) at Capgemini.

I hold expertise in Data Analysis, ETL/ELT pipelines, Reporting & Analytics, Business Reviews and Automation Frameworks with SAS, SQL and Python, and I have completed a PG Diploma in Machine Learning & AI from IIIT-Bangalore.

I have always embraced challenges: in regular BAU projects I have introduced new features, extended existing ones and improved the efficiency and robustness of the product. Apart from regular work, I take up mentoring and coding challenges in SAS, Python and Stats on HackerRank and SASensei. I am a firm believer in self-driven learning and development, to which my certifications, trainings and self-projects are a testimony.

#SAS #Macros #SQL #Python #Pandas #PySpark #QuickSight #Tableau #DataAnalysis #DataPreparation #DataViz #AutomationFramework #StatisticalTechniques #AWS #S3 #Airflow #Git #SAS_DI #DataFlux #SAS_IRM #JupyterNB #Unix #Jira #MS-Office Suite #ProblemSolver #ML_Enthusiast.



Experience

Amazon, India

Business Intelligence Engineer, Jun 2022 - Present
  • Led the BI workstream from pilot to launch phase for new features and on-demand customizations for XX vendor cohorts across marketplaces (XX core features + YY customizations for ZZ vendor programs).
  • Designed and developed a data pipeline for Excel report generation along with a QuickSight dashboard for internal & external customers (CSAT score: 4.3).
  • Developed a validation framework to increase the efficiency of PDR launches (saving XX HCs per launch).
  • Led and mentored juniors to handle re-launches, backed by documentation & an SOP framework.
  • SQL, Python, Pandas, QuickSight, S3, Excel, Cradle, ETLM, Anaconda.

Deloitte, India

Senior Consultant (BIE), Aug 2021 - Jun 2022
  • Worked as a BIE with a banking client to build a new Risk Data Model and generate regulatory reports and dashboards for retail banking.
  • Unified Basel & MI reporting for Middle East regions.
  • Analyzed different source systems to build the consumption layer for reporting.
  • Developed an automated validation/test suite.
  • SQL, Python, Tableau, SAS, Hive, Jupyter, Anaconda.

SAS Research & Development, India

Senior Software Engineer (DE), Aug 2019 - Jul 2021
  • Implemented new features for the Common Reporting (COREP) framework, covering credit risk, capital adequacy ratios and own funds modules, and the Financial Reporting (FINREP) framework to incorporate EBA taxonomies for European Union banks in the SAS Regulatory Reporting solution.
  • Launched 6 new modules of EBA Reporting in the SAS plug-n-play product.
  • Implemented the Minimum Loss Coverage (IRB shortfall amount) for NPEs under the capital requirement calculation for Risk-Weighted Assets (RWA).
  • Performed credit default analysis to understand the driving features of loan defaulters.
  • Developed a pipeline to track data validation failures & code quality checks (QC) with SAS, Python & batch scripts.
  • Designed and developed an automated validation framework to improve the dev-test pipeline.
  • Rebuilt the solution on top of the SAS Risk Stratum platform.
  • Hands-on with SAS IRM, SAS RFW, EG, Studio, DI, SAS Visual Analytics, Risk Stratum, SMC, Python and batch scripting.
  • SAS, SQL, Macros, Python, Visual Analytics, EDA, IRM, RFW, Git, Risk Stratum, ETL, Analysis.

Citigroup, India

Data Engineer, Mar 2019 - Aug 2019
  • Analysed and processed credit portfolios & transactions to generate insights through regular & ad-hoc reports.
  • These reports enabled cross-selling, targeted campaigns & credit portfolio health maintenance.
  • Performed cohort analysis for credit card customers on metrics like Retention, Churn, Reactivation and % change.
  • Designed & developed an automated framework for daily, weekly and monthly execution of portfolio reports.
  • Worked on critical initiatives to deliver data-driven analytics solutions.
  • SAS, SQL, Macros, Python, Shell, ETL, DB, Data Analysis.

Capgemini, India

Associate Consultant (DE), Jul 2016 - Mar 2019
  • Strong foundation in SAS Base, Macros, Functions and SAS/SQL.
  • Developed modular, dynamic, data-driven and efficient SAS programs through macros & functions.
  • Developed SAS DI & DataFlux jobs for integration, transformation, standardization and loading of data.
  • Performed exploratory and metadata analysis to understand incoming data from the ODS.
  • Managed data quality through DataFlux Studio:
    • Data profiling and creation of jobs for data validation and monitoring using business rules and tasks.
    • De-duplication and clustering of customer data, followed by standardization and match-code generation.
  • Developed SAS code using user-defined functions, macros and procs to implement business logic.
  • Enabled automated processes to increase the consistency and efficiency of SAS programs.
  • Built VA dashboards to visualize the customer portfolio across life and non-life insurance categories by region.
  • Experienced in requirement gathering, analysis, planning, design, coding, documentation, unit testing and peer review.
  • SAS, SAS/SQL, Macros, SAS DI, DataFlux, Tableau, ETL.

Skill Sets

What Can I Do For You?

My passion for digging business insights out of raw data gives me an edge in completing tasks swiftly. I'm accountable, highly focused on delivering results, and tend to go the extra mile to improve the process and the quality of the solution. Being a quick learner, with experience working both in collaborative teams and independently, has enabled successful project outcomes. I adapt readily to change: the world is continuously evolving, and so should we.



I would bring expertise across these eight aspects of Data Engineering & Science, helping ensure that the products, solutions and services you offer remain robust, on par with industry standards and market-leading.

Data Collection and Ingestion

Key Focus: Acquiring data from diverse sources into a centralized system.
Core Activities:
  • Designing and implementing batch and real-time data pipelines.
  • Handling structured, semi-structured and unstructured data formats.
  • Ensuring reliable, low-latency data ingestion using tools like Apache Kafka, Flume or AWS Kinesis.
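As a flavour of real-time ingestion, here is a minimal Python sketch that pushes JSON events into an AWS Kinesis stream via boto3. The stream name, region and event fields are hypothetical placeholders, and AWS credentials are assumed to be configured.

```python
import json
import boto3  # AWS SDK for Python; assumes credentials are already configured

# Hypothetical stream name and region, for illustration only
STREAM_NAME = "clickstream-events"
kinesis = boto3.client("kinesis", region_name="us-east-1")

def ingest_event(event: dict) -> None:
    """Push a single JSON event into the Kinesis stream."""
    kinesis.put_record(
        StreamName=STREAM_NAME,
        Data=json.dumps(event).encode("utf-8"),
        # Partition key spreads events across shards; customer_id is an assumed field
        PartitionKey=str(event.get("customer_id", "unknown")),
    )

ingest_event({"customer_id": 42, "action": "page_view", "ts": "2024-01-01T00:00:00Z"})
```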

Data Transformation and Processing

Data preparation forms the core of any BI and Data Science product. I have been exposed to its different phases: fetching, integration, transformation, cleansing, quality enhancement & standardization, loading & archiving. Data is integrated from different DBs, raw files & other applications to get a unified view of the business; because data is inconsistent across sources, prep forms an integral part. The data is then loaded into a central repository in a DB to be used subsequently for different analytical use cases. I have implemented these with SAS (including Macros), SQL, Python, DI Studio (a self-service ETL tool), DataFlux Studio (a DQ tool), and VA & Tableau (visualization tools). With correct & consistent data, model performance increases. Put simply, data preparation is the process of taking raw data and getting it ready for ingestion into an analytics platform.
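As a minimal illustration of these phases in Python with pandas, the sketch below integrates two hypothetical source files, cleanses and standardizes them, runs a basic quality check, and loads the result. All file names and columns are assumptions.

```python
import pandas as pd

# Hypothetical source files; in practice these come from DBs, raw files and applications
customers = pd.read_csv("customers.csv")
transactions = pd.read_csv("transactions.csv")

# Integration: unify sources on a common key to get one view of the business
df = transactions.merge(customers, on="customer_id", how="left")

# Cleansing & standardization: fix types, trim text, drop unusable rows and duplicates
df["txn_date"] = pd.to_datetime(df["txn_date"], errors="coerce")
df["country"] = df["country"].str.strip().str.upper()
df = df.dropna(subset=["txn_date"]).drop_duplicates()

# Quality check before loading to the central repository
assert df["amount"].ge(0).all(), "negative transaction amounts found"

# Load: persist the prepared data for downstream analytics (requires pyarrow)
df.to_parquet("prepared/transactions.parquet", index=False)
```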

Business Intelligence Solutions

So far, I have been part of building an analytics platform, an MIS and campaign reporting solution, and regulatory risk management and reporting solutions, mainly focused on the BFSI industry. I implemented an e-DWH and dashboarding solution from scratch using SAS, SQL, DI Studio and Visual Analytics, which helped generate actionable insights for my client. The MIS and campaign reporting covered segmentation, retention, churn and similar metrics; these analyses enabled Product Managers to roll out offers and campaigns. The regulatory risk solution assesses banking health by calculating required capital and analyzing credit defaulters to understand the driving features of default. The regulatory reporting product enables financial institutions to submit their financial and common reporting to the authorities. These solutions have laid a strong foundation in the BFSI domain and credit risk management.

Automation Framework(s)

In the present world, we are removing manual intervention from the technological processes that support business decisions. To that end, I developed a reporting framework which automates MIS reporting for a credit card portfolio and campaign management. The process starts with fetching transactional data, followed by data preparation, trend analysis and report generation in multiple formats. Its main features are OS independence, password-protected content exchange, dynamic report creation and automated e-mails. At SAS R&D, I built a robust dev-test pipeline to generate and reconcile regulatory reports across different configurations; it interfaced with different applications and servers (IRM, RFW, Metadata) using SAS, SQL, Python & batch scripting. This gave a holistic view of the features impacted by ongoing development.
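The "generate report, then e-mail it automatically" step of such a framework can be sketched in Python as below. The report contents, e-mail addresses and SMTP host are all hypothetical placeholders.

```python
import smtplib
from email.message import EmailMessage
import pandas as pd

# Hypothetical report data; a real run would pull this from the prepared data layer
report = pd.DataFrame({"segment": ["gold", "silver"], "spend": [120000, 45000]})
report.to_excel("portfolio_report.xlsx", index=False)  # requires openpyxl

# Build the e-mail with the Excel report attached
msg = EmailMessage()
msg["Subject"] = "Daily Credit Card Portfolio Report"
msg["From"] = "reports@example.com"        # placeholder sender
msg["To"] = "stakeholders@example.com"     # placeholder recipients
msg.set_content("Please find attached today's portfolio report.")

with open("portfolio_report.xlsx", "rb") as f:
    msg.add_attachment(
        f.read(),
        maintype="application",
        subtype="vnd.openxmlformats-officedocument.spreadsheetml.sheet",
        filename="portfolio_report.xlsx",
    )

with smtplib.SMTP("smtp.example.com") as server:  # hypothetical SMTP host
    server.send_message(msg)
```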

Data Storage & Management

Key Focus: Storing data in systems optimized for performance, scalability and cost-efficiency.
Core Activities:
  • Setting up and managing data lakes (e.g., S3, Azure Data Lake) and data warehouses (e.g., Snowflake, Redshift, BigQuery).
  • Designing data models and schemas for efficient storage and retrieval.
  • Implementing strategies for data partitioning, indexing and compression to improve storage and query performance.
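As a small example of partitioning and compression in practice, the sketch below writes a pandas DataFrame to partitioned, compressed Parquet in an S3 data lake. The bucket path and columns are assumptions; s3fs and pyarrow are assumed to be installed.

```python
import pandas as pd

# Hypothetical dataset; partitioning by date keeps scans cheap for date-bounded queries
df = pd.DataFrame({
    "txn_date": ["2024-01-01", "2024-01-01", "2024-01-02"],
    "amount": [10.0, 25.5, 7.25],
})

# Partitioned, compressed columnar storage; the bucket is a placeholder
df.to_parquet(
    "s3://my-data-lake/transactions/",  # S3 paths require s3fs
    partition_cols=["txn_date"],        # creates txn_date=.../ subdirectories
    compression="snappy",
    engine="pyarrow",
)
```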

Data Accessibility and Governance

Key Focus: Making data available for analysis while ensuring security and compliance.
Core Activities:
  • Enabling seamless access to data through APIs, BI tools or direct queries.
  • Implementing role-based access controls (RBAC) and encryption.
  • Ensuring compliance with data privacy regulations (e.g., GDPR, CCPA).
  • Monitoring data pipelines to maintain reliability and address bottlenecks.
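To make the RBAC idea concrete, here is a minimal sketch of a role check wrapped around a data-access function. The users, roles and function are hypothetical; in production the role mapping would come from an identity provider.

```python
from functools import wraps

# Hypothetical role mapping, for illustration only
USER_ROLES = {"alice": {"analyst"}, "bob": {"admin", "analyst"}}

def require_role(role: str):
    """Allow the wrapped data-access function only for users holding `role`."""
    def decorator(func):
        @wraps(func)
        def wrapper(user, *args, **kwargs):
            if role not in USER_ROLES.get(user, set()):
                raise PermissionError(f"{user} lacks role '{role}'")
            return func(user, *args, **kwargs)
        return wrapper
    return decorator

@require_role("admin")
def read_pii_table(user):
    return "...sensitive rows..."

print(read_pii_table("bob"))  # allowed: bob holds the admin role
# read_pii_table("alice")     # raises PermissionError
```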

Operational Excellence

Key Focus: Improving the ongoing DE process in terms of efficiency, scalability, infrastructure maintenance, customer support and documentation.
Core Activities:
  • Building generic utilities for repetitive tasks such as cluster cleanup and tackling bottlenecks like long-running queries.
  • Writing well-commented code to explain the business logic.
  • Regularly broadcasting schema updates, business logic updates and data quality issues to downstream users.
  • Regularly updating the documentation repo for all re-designs, new programs, proofs-of-concept, etc.
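One such utility, flagging long-running queries for broadcast to their owners, might look like the sketch below. The query log and threshold are hypothetical; real systems expose this data via system tables or logs.

```python
import pandas as pd

# Hypothetical query log; real systems expose this via system tables or logs
query_log = pd.DataFrame({
    "query_id": [1, 2, 3],
    "user": ["etl_job", "analyst", "etl_job"],
    "runtime_sec": [45, 3600, 12],
})

THRESHOLD_SEC = 900  # 15 minutes; tune per workload

def flag_long_running(log: pd.DataFrame, threshold: int = THRESHOLD_SEC) -> pd.DataFrame:
    """Return queries exceeding the runtime threshold, slowest first."""
    return log[log["runtime_sec"] > threshold].sort_values("runtime_sec", ascending=False)

print(flag_long_running(query_log))
```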

Data Science

I have an understanding of customer analytics, using behavioural and demographic information to run customer segmentation and campaigns. I have worked with techniques like linear & logistic regression, random forests and clustering to solve prediction and classification use cases such as sales prediction, loan defaulters, churn prediction and customer segmentation, for the purpose of cross-selling, loyalty rewards programs and exclusive campaigns. During data prep, hundreds of datasets and libraries are analysed for their metadata through automation.
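A minimal churn-classification sketch with scikit-learn is shown below, using synthetic data in place of real behavioural and demographic features; everything here is illustrative.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic, imbalanced stand-in for customer features (real data comes from the prep layer)
X, y = make_classification(n_samples=1000, n_features=10, weights=[0.8], random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Logistic regression: one of the techniques mentioned above for churn prediction
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
probs = model.predict_proba(X_test)[:, 1]  # churn probability per customer
print(f"AUC: {roc_auc_score(y_test, probs):.3f}")
```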



Skills

#  | Skill                     | Stars (out of 5) | Components                                             | Category
01 | SAS                       | 4.5              | Base, Macros, SQL                                      | Programming Language
02 | SQL                       | 4.5              | DML, DDL                                               | Programming Language
03 | Python                    | 4.0              | DS, Branching, Looping, Functions                      | Programming Language
04 | Data Preparation          | 4.5              | ETL, DWH                                               | Techniques/Tools
05 | Data Visualization        | 4.0              | QuickSight, Excel, VA Dashboards                       | Techniques/Tools
06 | Data Analysis             | 4.0              | Statistical, Data Quality, Metadata Analyses           | Techniques
07 | Automation                | 4.0              | Framework creation with SAS, Unix Shell and Python     | Process
08 | Data Science Frameworks   | 3.0              | NumPy, Pandas, Matplotlib                              | Techniques/Tools
09 | Machine Learning          | 2.5              | Prediction, Classification                             | Algorithms
10 | SAS Products              | 4.0              | SAS Studio, EG, DI, DataFlux, VA, IRM, RFW, SMC, VIYA  | Tools
11 | Unix Shell                | 2.5              | Unix commands, Scripts                                 | OS
12 | EBA Reporting Solution    | 3.5              | Risk Reporting                                         | Solution
13 | Data Warehouse Solution   | 3.5              | Integration, ETL, Visualization                        | Solution
14 | Regulatory Risk Solution  | 3.0              | Regulatory Capital Calculator                          | Solution


Education

  1. International Institute of Information Technology (IIIT-B), Bengaluru

    Deemed University, Post Graduate Diploma (PGD), Machine Learning & AI: 2020-2021
  2. PES University (formerly PES-IT), Bengaluru

    VTU, Bachelor of Engineering (BE), Information Science and Engineering, First Class: 2012-2016
  3. Ekalavya Educational Complex, Patna

    CBSE, Higher Secondary, Science and Mathematics, First Class with Distinction: 2011
  4. St. Karen's High School, Patna

    ICSE, Secondary, First Class with Distinction: 2009


Certifications