Independent Consultant
Financial Services
Information Technology/IT
Insurance
Pharmaceuticals
Australia
Canada
Germany
Japan
United Kingdom
United States
English
Gujarati
Hindi
I am a graduate student in Information Systems at Pace University, specializing in Big Data Engineering. I am interested in consulting and in helping businesses build their data infrastructure.
I have three years of experience in Big Data Engineering, Software Engineering, and Market Research across industries including Finance, Pharmaceuticals, Health Insurance (CRM), and IT startups.
I've led projects developing database infrastructure and orchestrating the Hadoop ecosystem.
I am looking for contract work where I can help early-stage companies build their data foundations.
I worked as a Business Analyst and Data Engineer on a team of six, planning and building the infrastructure of CRM software for an insurance company.
Built prototypes of modules, including process models and ERDs, and incorporated user feedback.
Translated system user needs and executed dimensionality reduction based on factor analysis and correlation, using SPSS.
Assisted the ETL team in building processes to support data-transformation quality and in resolving unique dependencies.
Assisted the Big Data team in integrating the data warehouse with HDFS and analyzed data using Hive (see the sketch after this list).
Synthesized insights and metrics into recommendations that drove critical customer activities in building the insurance company's CRM dashboard, informed by the needs of various system users.
Assisted the ETL team in designing logical data-flow packages and determining storage needs.
Handled data normalization and governance issues involving volume, velocity, variety, and slowly changing dimensions.
Assisted the warehouse team in extracting, parsing, managing, and analyzing data from diverse upstream sources.
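A minimal sketch of the kind of Hive analysis mentioned above, assuming a PyHive connection from Python; the host, database, and table names ("warehouse-host", crm.policy_events) are hypothetical placeholders, not details of the actual engagement:

    # Query a Hive table stored on HDFS from Python via PyHive.
    # "warehouse-host" and "crm.policy_events" are hypothetical placeholders.
    from pyhive import hive

    conn = hive.Connection(host="warehouse-host", port=10000, username="analyst")
    cursor = conn.cursor()

    # Example aggregation: count CRM policy events per channel,
    # the kind of metric that fed the CRM dashboard described above.
    cursor.execute("""
        SELECT channel, COUNT(*) AS events
        FROM crm.policy_events
        GROUP BY channel
        ORDER BY events DESC
    """)

    for channel, events in cursor.fetchall():
        print(channel, events)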
Create functional, technical, and system requirements documents.
Develop user test cases and identify opportunities to improve software quality by developing standard operating procedures.
Develop the process model and decompose it into DFDs at the required granularity.
Investigate data-infrastructure needs, prepare the approval document, and present it to stakeholders.
Design ERDs and identify the required database tools based on scalability and reliability.
Design, develop, construct, test, and maintain database, cloud, and Hadoop distributed file system (HDFS) architectures.
Assist in ad-hoc query testing against Kafka and other data sources.
Assist in designing, developing, and testing ETL pipelines that integrate source data using Kafka.
Embed logging code snippets in the website to analyze web performance and user behavior and to suggest changes.
Integrate Google Analytics data and web/app data with BigQuery using automated pipelines (a sketch follows this list).
Assist in Kafka architecture planning by developing use cases for process automation and application onboarding.
Plan Hadoop infrastructure and deploy clusters to realize processing advantages for OLAP workloads.
Identify KPIs, generate reports and dashboards for stakeholders, and assist the business team in decision making.
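A minimal sketch of the BigQuery integration described above, assuming the google-cloud-bigquery client and newline-delimited JSON exports staged in Cloud Storage; the project, dataset, table, and bucket names are hypothetical placeholders, not the actual pipeline:

    # Load exported web/app event files from Cloud Storage into BigQuery.
    # Project, dataset, table, and bucket names are hypothetical placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-analytics-project")

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,                   # infer the schema from the files
        write_disposition="WRITE_APPEND",  # append each daily export
    )

    # Kick off the load job and wait for it to finish.
    job = client.load_table_from_uri(
        "gs://example-export-bucket/ga_sessions_*.json",
        "example-analytics-project.web_analytics.sessions",
        job_config=job_config,
    )
    job.result()

    table = client.get_table("example-analytics-project.web_analytics.sessions")
    print("Loaded rows:", table.num_rows)

In practice a scheduler (for example, a daily cron or orchestration job) would run a script like this after each export lands, which is what "automated pipelines" refers to above.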
Master's in Information Systems.
Database Management Systems | SDLC | Data Mining & Visualization | Database Programming | Python | Distributed Computing
Analytics with Google Analytics.