Publications by Pranshu Malviya
Pranshu Malviya
Research Areas: Lifelong Learning, Causal Inference, Optimization
Activity
- PhD Student: Jan 2022 - present
- Master's Student: Jan 2021 - Dec 2021
Master's thesis
- TAG: Task-based Accumulated Gradients for Lifelong Learning
Pranshu Malviya, supervised by Balaraman Ravindran and Sarath Chandar
Indian Institute of Technology Madras, January 2022.
[thesis]
Preprints
- Exploring the Plasticity of Neural Networks for NLP Tasks in Continual Learning
Maryam Hashemzadeh, Pranshu Malviya*, Darshan Patil*, and Sarath Chandar
Conference on Lifelong Learning Agents (CoLLAs) Workshop Track, 2024.
#DL, #NLP
- Predicting the Impact of Model Expansion through the Minima Manifold: A Loss Landscape Perspective
Pranshu Malviya, Jerry Huang, Quentin Fournier, and Sarath Chandar
arXiv preprint, 2024.
#DL
[arXiv]
- Feature diversity in self-supervised learning
Pranshu Malviya* and Arjun Vaithilingam Sudhakar*
Conference on Lifelong Learning Agents (CoLLAs) Workshop Track, 2022.
#DL
[arXiv]
- An Introduction to Lifelong Supervised Learning
Shagun Sodhani, Mojtaba Faramarzi, Sanket Vaibhav Mehta, Pranshu Malviya, Mohamed Abdelsalam, Janarthanan Rajendran, and Sarath Chandar
arXiv preprint, 2022.
#DL
[arXiv]
Conference and Journal Papers
2024
- Lookbehind-SAM: k steps back, 1 step forward
Gonçalo Mordido, Pranshu Malviya, Aristide Baratin, and Sarath Chandar
International Conference on Machine Learning (ICML), 2024.
#DL
[pmlr], [arXiv], [code], [YouTube]
- Promoting Exploration in Memory-Augmented Adam using Critical Momenta
Pranshu Malviya, Gonçalo Mordido, Aristide Baratin, Reza Babanezhad Harikandeh, Jerry Huang, Simon Lacoste-Julien, Razvan Pascanu, and Sarath Chandar
Transactions on Machine Learning Research (TMLR), 2024.
#DL
[openreview], [arXiv]
2022
- TAG: Task-based Accumulated Gradients for Lifelong Learning
Pranshu Malviya, Balaraman Ravindran, and Sarath Chandar
Conference on Lifelong Learning Agents (CoLLAs), 2022.
[Workshop on Theory and Foundation of Continual Learning, ICML 2021]
#DL
[pmlr], [arXiv], [code]