Get C2C/W2 Jobs & hotlist update

Prakhar Chaudhary – Data Scientist/Data Engineer –  13+ years Exp – Our own H1B – Seattle, WA – Willing to go onsite from Day 1

Consultant's Details:
Consultant Name: Prakhar Chaudhary
Work Visa: Our own H1B
Current Location: Seattle, WA
Relocation: Willing to go onsite from Day 1

Employer Details:
Employer Name: Nextgen Technologies Inc
Contact Person: Kushal Desai
Email: kushal.desai@nextgentechinc.com
Phone: +1 (413) 424-0484
 
Note: Please call after 09:00 AM PST
  
 

Prakhar Chaudhary's Resume

 

Summary:

  • Dynamic Data Scientist/Data Analyst with 13+ years’ experience, a master’s in Business Intelligence, and deep experience analyzing commercial data using reporting tools such as Tableau, Python, SAS, and SQL across fintech companies.
  • Well-versed in deriving viable solutions to complex business problems through big data analysis and management.
  • Solid understanding of Power BI desktop, Power Query, and DAX formulas.
  • Proficient in containerization and orchestration using Docker and Kubernetes for scalable deployment of ML applications.
  • Implemented comprehensive monitoring and logging using Prometheus, Grafana, ELK Stack, and Azure Monitor to ensure model and infrastructure health.
  • Designed and implemented MLOps solutions on AWS, GCP, and Azure, leveraging cloud-native services for scalable and resilient deployments.
  • Utilized tools such as Apache Airflow, Luigi, and Kubeflow Pipelines to automate end-to-end ML workflows, from data ingestion to model deployment.
  • Developed comprehensive monitoring and alerting systems using tools like Prometheus, Grafana, and CloudWatch to ensure high availability and performance.
  • Implemented automated model drift detection systems to monitor and alert model performance degradation over time.
  • Collaborative work with cross-functional teams and utilization of diverse technologies (Python, Scala, TensorFlow, PyTorch).
  • Utilized Python’s Flask framework to build REST APIs on top of a data lake (BigQuery, Cloud SQL).
  • Achieved Continuous Integration & Continuous Deployment (CI/CD) for applications using Git and Azure DevOps.
  • Experience with test-driven development (TDD), Agile methodologies, and Scrum processes.
  • Hands-on solving problems that bring significant business value by building predictive and forecasting models utilizing structured and unstructured data.
  • Extensive experience managing and engaging stakeholders at all levels including communicating technical concepts to non-technical audiences.
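The automated model drift detection mentioned above is commonly implemented by comparing feature distributions between a training baseline and live traffic. A minimal pure-Python sketch using the Population Stability Index (PSI); the data, bucket count, and 0.2 threshold are illustrative conventions, not details from any specific project:

```python
import math
from collections import Counter

def psi(expected, actual, bins=10):
    """Population Stability Index between a baseline sample and a live sample.
    Values above roughly 0.2 are commonly treated as significant drift."""
    lo, hi = min(expected), max(expected)
    width = (hi - lo) / bins or 1.0

    def bucket(values):
        # Histogram both samples over the baseline's range.
        counts = Counter(min(int((v - lo) / width), bins - 1) for v in values)
        total = len(values)
        # Smooth empty buckets to avoid log(0).
        return [max(counts.get(i, 0) / total, 1e-6) for i in range(bins)]

    e, a = bucket(expected), bucket(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

baseline = [i / 100 for i in range(100)]        # uniform training-time sample
shifted = [0.5 + i / 200 for i in range(100)]   # live sample shifted right
drift = psi(baseline, shifted)
```

In a monitoring setup, a PSI above the threshold would raise an alert (e.g. via Prometheus/Grafana) and could trigger the retraining workflows described above.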

Areas of Expertise include:

Data Analysis | Data Science | SQL | Tableau | Data Visualization | Python | Stakeholder Engagement | Agile Scrum Master | Agile Product Owner | Lean | CI/CD | TDD | Gap Analysis | Process Mapping | Data Modelling | Process Improvement | Problem-solving | Communication & Influencing

 

EDUCATION

  • Master’s in Information Technology Management | University of Texas, Dallas, USA, 2015
  • B.Tech. Information Technology | Narsee Monjee Institute of Management Studies, Mumbai, India 2010

PROFESSIONAL EXPERIENCE:

Intuit (Mountain View)                                                                                                         Aug 23 – Present

Sr Data Scientist/Business Data Analyst 4

  • Worked on multiple analytics projects, including cohort analysis, GNS analysis, and mobile analysis, for the QBLive team.
  • Maintain and update the Intuit Assist (IAQB) Tableau dashboard.
  • Ran the OIPRO A/B test experiment from scratch and built a Tableau dashboard to view its performance and present insights at the WinRoom Experiment review.
  • Developed automated workflows for model retraining and deployment based on data drift and performance metrics using Apache Airflow and Kubeflow Pipelines.
  • Integrated LLM-powered chatbots into eBay's messaging system to provide real-time responses to customer queries, improving response times and user satisfaction.
  • Designed and implemented robust machine learning pipelines using Kubernetes (K8s)/AKS with Argo Workflows orchestration, ensuring scalable and efficient end-to-end ML processes.
  • Managed data pipelines for ML workflows on Kubernetes (K8s)/AKS using Argo Workflows, ensuring efficient data movement and transformation between pipeline stages.
  • Built and maintained end-to-end machine learning pipelines, from data ingestion to model deployment, using tools like Apache Airflow, MLflow, and Kubeflow.
  • Designed and implemented CI/CD pipelines for machine learning models using Jenkins, GitLab CI, and Azure DevOps, ensuring rapid and reliable model deployment.
  • Containerized machine learning applications using Docker and deployed them on Kubernetes clusters, ensuring high availability and scalability.
  • Implemented monitoring and logging solutions using Prometheus, Grafana, ELK Stack, and Azure Monitor to track model performance and infrastructure health.
  • Utilized MLflow and DVC for model versioning, tracking, and reproducibility, ensuring consistent and reliable model deployments.
  • Developed Spark code using Scala and Spark-SQL for faster processing and testing, integrating MLOps practices for efficient development workflows.
  • Performed data cleaning and feature selection using the MLlib package in PySpark, working with deep learning frameworks such as Caffe with MLOps considerations.
  • Integrated CI/CD pipelines with Argo Workflows and AKS to automate the deployment of updated machine learning models, ensuring continuous delivery and integration.
  • Created an algorithm that predicts the type of object in a typical house using deep learning; used OpenCV for image analysis and Keras and TensorFlow for implementing artificial neural networks (ANNs).
  • Developed doctor report cards for real-time insights into performance over the years, using Apache Kafka for data ingestion and Tableau integrated with Hadoop/Spark to create the reports.
  • Created a self-serve tool to view DIWM trailer performance and automated all the related processes.
  • Made numerous enhancements to the GSU SOT dashboard.
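A/B test readouts like the OIPRO experiment above typically reduce to a two-proportion z-test on conversion rates. A hedged sketch in plain Python; the sample sizes and conversion counts are made up for illustration:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-statistic for the difference between two conversion rates,
    using the pooled-proportion standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Illustrative numbers only: control converts 400/10,000 (4.0%),
# treatment converts 460/10,000 (4.6%).
z = two_proportion_z(400, 10_000, 460, 10_000)
significant = abs(z) > 1.96  # two-sided test at alpha = 0.05
```

A dashboard built on top of such a readout would surface the lift, the z-score, and whether the experiment has reached significance.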

Apple (Sunnyvale)                                                                                                         Jun 21 – Aug 23

Senior Data Analyst

  • Implemented a BERT NER NLP model on AppleCare raw unstructured data to identify customers' personal health information and redact it.
  • Also created an API to find PHI and customer passwords in raw customer data.

Achievements/Tasks

  • Collaborated with data engineers and operation team to implement ETL process, wrote and optimized SQL queries to perform data extraction to fit the analytical requirements.
  • Built NLP models including BERT, and XGBoost to find personal health information.
  • Performed univariate and multivariate analysis on the data to identify any underlying pattern in the data and associations between the variables.
  • Performed data imputation using Scikit-learn package in Python.
  • Analyzed customer data and market trends to identify new growth opportunities.
  • Utilized data visualization tools and statistical analysis to create comprehensive reports and presentations, providing valuable insights to the executive team and supporting data-driven decision-making.
  • Participated in feature engineering tasks such as feature intersection generation, feature normalization, and label encoding with scikit-learn preprocessing.
  • Used Python 3.X (NumPy, SciPy, pandas, scikit-learn, seaborn) to develop various models and algorithms for analytic purposes.
  • Created and managed reports, dashboards, and visualizations using Tableau.
  • Built multiple Splunk dashboards for API usage and set up usage notifications on Slack and email using Slack webhooks.
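The redaction step of a PHI pipeline like the one described above can be sketched in a few lines. This toy version uses regex patterns where the production system used a BERT NER model to tag spans; the patterns and example text are purely illustrative:

```python
import re

# Toy patterns standing in for NER output; a real system would use a
# BERT NER model to tag PHI spans instead of regexes.
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}-\d{3}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def redact(text):
    """Replace each detected span with a [LABEL] placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

msg = "Patient SSN 123-45-6789, reach me at jane.doe@example.com."
redacted = redact(msg)  # "Patient SSN [SSN], reach me at [EMAIL]."
```

Exposing `redact` behind an API endpoint mirrors the Apple bullet above: raw text in, placeholder-substituted text out.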

GSK (India)                                                                                                         Oct 20 – Jun 21

Data Analytics Manager

  • Owned and managed the algorithm team at GSK. Implemented the Next Best Action (NBA) project in 30 countries to drive pharma growth.

Achievements/Tasks

  • Optimized the current algorithm use case to provide better recommendations, achieving 1.5% overall revenue growth.
  • Implemented the NBA project in 30 new global markets.
  • Helped execute and analyze data pipelines for the algorithm.

INTUIT (Sydney)                                                                                                         Jan 19 – Feb 20

Senior Business Data Analyst

  • Owned and executed all web reporting for the Australia business with heavy clickstream data usage. Developed an A/B test reporting back-end tool in Tableau for all AU web tests. Automated reporting using PySpark and Python.

Achievements/Tasks

  • Spearheaded a big data processing project. Drastically expanded predictive analytics and behaviour analysis capabilities. Produced substantial profitable results for Intuit.
  • Analyzed and interpreted data from different sources, including Excel, SQL Server, and other databases, and transformed raw data into meaningful insights that enable informed decision-making.
  • Conducted a data regression analysis of the relationship between product prices and industry trends, achieving a 20% more accurate prediction of performance than previous years.
  • Used predictive analytics such as machine learning and data mining techniques to forecast company sales of new products with an 80% accuracy rate.
  • Helped execute and analyze AU IPD ML tips test. Contributed to a 5% lift in retention.
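The regression analysis of product prices versus performance described above reduces, in its simplest form, to ordinary least squares on a single predictor. A minimal pure-Python sketch; the price and sales figures are made up for illustration:

```python
def fit_line(xs, ys):
    """Ordinary least squares fit for y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

# Made-up data: units sold at five price points.
prices = [10, 12, 14, 16, 18]
units = [95, 88, 82, 75, 69]
slope, intercept = fit_line(prices, units)
forecast = slope * 20 + intercept  # predicted units at a $20 price point
```

In practice such a fit would be done with scikit-learn or statsmodels, with more predictors and holdout validation to measure the accuracy gains claimed above.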

INTUIT (Mountain View)                                                                                                         Oct 17 – Jan 19

Data Scientist, Technical Analytics

  • Partnered with marketing, finance, analytics, and cross-functional teams to interpret large volumes of data and address key business questions, from hypothesis to execution, aligned with strategy and tactics that led to actionable, measurable insights.
  • Advocated for the exploration of interesting data anomalies or patterns that may provide more explanatory details about customer behaviors or predictive value to the business by writing SQL queries on multiple databases.

Achievements/Tasks

  • Set up end-to-end analytics requirements for new product launches, QB Detect and Defend, which resulted in $200k revenue.
  • Collaborated with different teams, including data analysts, business analysts, and stakeholders, to create Power BI reports and dashboards that align with business requirements.
  • Provided training and support to other team members on the use of Power BI.
  • Developed a marketing funnel to describe the product purchase cycle and determine customer leakage.
  • Created and automated A/B Testing to drive insights around marketing and product experiments, gathering metrics and providing inputs to drive business decisions. $50m revenue growth based on results.
  • Created multiple Tableau Dashboards that provided a self-service tool to our stakeholders.
  • Helped create a Master Segmentation model (regression analysis) used to target Desktop and QBO customers.
  • Automated a manual report using shell scripts and moved data from Splunk to Vertica using the REST API.
  • Created an insight report from Omniture to tell the story of how Web sales/campaigns are performing.

FOCUSKPI (Mountain View)                                                                                                        Aug 15 – Oct 17

Database Marketing Analyst

  • Partnered with marketing, analytics, and cross-functional teams to interpret large volumes of data, address key business questions, from hypothesis to execution, aligned with strategy and tactics that lead to actionable, measurable insights.
  • Provided deep analytics, A/B testing support, and database analytics.

SPLASH MEDIA LLC                                                                                                         Dec 14 – May 15

Data Scientist Intern

  • Provided in-depth analysis of information from multiple social platforms. Prepared monthly reports for the Facebook Business and Consumer pages for all countries. Extracted data from tools like Spredfast, Facebook Insights, Google Analytics, and Brandwatch to generate reports.

Achievements/Tasks

  • Created two Tableau dashboards for clients to view summarized data.
  • Created an ML model to find the right product in the B2C space.
  • Developed a robust digital analytics report framework for clients spread across three industrial sectors.

HCL TECHNOLOGIES                                                                                                          Jul 10 – May 13

Senior Analyst

  • Ran multiple marketing campaigns for B2C.
  • Created and extracted tables from SAS and Oracle using SAS/ACCESS and SAS/SQL for modeling purposes. Generated results for regression and correlation studies and analysis of variance (ANOVA). Mentored new employees and conducted extensive database training.

Achievements/Tasks

  • Improved efficiency by 10% through implementing Six Sigma methodology.

Note: Please call between 09:00 AM and 06:00 PM PST

Kushal Desai

1735 N 1st St., Suite 102 | San Jose, CA 95112

NextGen Technologies Inc

Email: kushal.desai@nextgentechinc.com | Website: www.nextgentechinc.com | +1 (413) 424-0484

 

 
