Jupinder Parmar

Incoming Deep Learning Applied Scientist, NVIDIA

jparmar2066 [AT] gmail.com


I recently graduated with a B.S. in Mathematics and an M.S. in Computer Science from Stanford University, where I concentrated my studies in probability and machine learning. I was fortunate to be advised by Amir Dembo for my undergraduate degree and Christopher Ré for my graduate degree. At Stanford I explored a variety of areas, but with each my goal was to learn how to leverage my understanding of random phenomena and artificial intelligence to develop exciting ideas and products that will revolutionize our world. I believe that these two fields can not only guide us to a deeper understanding of the problems that challenge our lives but also give way to robust, innovative solutions.

After taking a short break for the Summer, I'll be joining NVIDIA as a Deep Learning Applied Scientist on the Conversational AI team in Fall 2022. I couldn't be more excited about this opportunity and believe it'll be an amazing experience with which to launch my career.

// Currently, I'm:

Traveling around Europe and Australia for the Summer of 2022 to take a much-needed break after graduation.

Working with Tatsu Hashimoto on improving the group robustness of machine learning models. Current models achieve high average performance but can incur high error on certain groups of rare and atypical examples. We are researching new avenues to improve the worst-group performance of models and hope to build architectures that disregard spurious correlations by learning the causal features of a task.

// In the past, I've:

Worked as a teaching assistant for CS 234: Reinforcement Learning, where I had the wonderful opportunity to help introduce the exciting field of reinforcement learning to students. Teaching has always been a passion of mine, and it was gratifying to aid in the educational journey of others.

Researched in Chris Ré's Lab at Stanford, where I worked with Khaled Saab. We worked on making use of passively observed human signals, such as one's gaze, to learn better representations of tasks and help train more robust, generalizable deep learning models.

Worked as a Machine Learning Engineer for OccamzRazor. OccamzRazor is developing a biomedical knowledge graph that aims to advance understanding of drug efficacy and provide novel treatments for diseases. My main focus was expanding their natural language processing pipeline by researching and engineering new architectures for biomedical information extraction. I additionally spent time exploring how to make graph learning algorithms more interpretable.

Implemented architectures described in various machine learning research papers from scratch in PyTorch. I wanted to dive deeper into the content I was learning in my classes and found that the best way to do so was to get hands-on experience with the intricacies of each model. Some of my favorite architectures that I implemented were ResNet and a Neural Image Caption Generation Model.

Worked as a quantitative trading intern at IMC Trading on the Small Index Options Desk. I developed new trading strategies to improve performance on US Options Auctions by leveraging information regarding market sentiment.

Worked as a software engineering intern at SAS on the Enterprise Computing team. I aided in the team's shift to Kubernetes and developed a machine learning model to predict when potential workloads would cause system failures.

Researched with David Chin at UMass Amherst, where we investigated the intersection of quantum and classical machine learning. In particular, we looked at whether quantum models could help us better assess the quality of care provided by hospitals.

Worked with David Doty at UC Davis, where we built a domain-specific language that allowed researchers to implement complex theoretical models in the field of algorithmic self-assembly.

// Research Projects:

Observational Supervision: Using passively observed human signals to improve medical imaging models.

Group Robustness: Improving model performance on atypical subgroups of data.

Biomedical Information Extraction: Uncovering the biomedical data currently siloed within millions of pieces of literature.

Publications and Written Pieces

The Importance of Background Information for Out of Distribution Generalization PDF

Jupinder Parmar, Khaled Saab, Brian Pogatchnik, Daniel Rubin, Christopher Ré

Workshop on Spurious Correlations, Invariance, and Stability at the International Conference on Machine Learning (ICML 2022)

Observational Supervision for Medical Image Classification using Gaze Data PDF

Khaled Saab, Sarah Hooper, Nimit Sohoni, Jupinder Parmar, Brian Pogatchnik, Sen Wu, Jared Dunnmon, Hongyang Zhang, Daniel Rubin, Christopher Ré

Medical Image Computing and Computer Assisted Intervention (MICCAI 2021)

Biomedical Information Extraction For Disease Gene Prioritization PDF

Jupinder Parmar, William Koehler, Martin Bringmann, Katharina Sophia Volz, Berk Kapicioglu

Knowledge Representation and Reasoning Meets Machine Learning Workshop at the Conference on Neural Information Processing Systems (NeurIPS 2020)

A Formulation of a Matrix Sparsity Approach for the Quantum Ordered Search Algorithm PDF

Jupinder Parmar, Saarim Rahman, Jesse Thiara

International Journal of Quantum Information (2017)

The Gap: Where Machine Learning Education Falls Short PDF

Jupinder Parmar

The Gradient


  • OccamzRazor September 2021 - June 2022
    Machine Learning Engineer
    Working on NLP
  • IMC Trading Summer 2021
    Quantitative Trading Intern
Developed Trading Strategies for U.S. ETF Option Auctions
  • Stanford University 2020 - 2022
    M.S. in Computer Science. Advised by Christopher Ré.
  • OccamzRazor March 2020 - June 2021
    Machine Learning Engineer
    Worked on NLP and Graph Learning
  • Stanford AI Lab November 2019 - Present
    Working on Representation Learning and Domain Generalization
  • SAS Institute Summer 2019
    Software Engineer Intern
    Used Machine Learning to Predict System Failures
  • Stanford University 2018 - 2022
    B.S. in Mathematics. Advised by Amir Dembo.


This website draws design inspiration from Tatsunori Hashimoto and Jason Zhao.