On the Diminishing Returns of Width for Continual Learning

Etash Guha
Vihan Lakshman
International Conference on Machine Learning (ICML), 2024; ICLR 2024 Workshop on Bridging the Gap Between Practice and Theory in Deep Learning

Abstract

Conformal prediction (CP) for regression can be challenging, especially when the output distribution is heteroscedastic, multimodal, or skewed. Some of these issues can be addressed by estimating a distribution over the output, but in practice such approaches can be sensitive to estimation error and yield unstable intervals. Here, we circumvent these challenges by converting regression into a classification problem and then using CP for classification to obtain CP sets for regression. To preserve the ordering of the continuous output space, we design a new loss function and present the necessary modifications to CP classification techniques. Empirical results on many benchmarks show that this simple approach gives surprisingly good results on many practical problems.
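The recipe described in the abstract (discretize the output range, train a classifier over the bins, then run conformal prediction for classification) can be illustrated with a minimal split-conformal sketch. The equal-width bins, the 1 - softmax score, and all function names below are assumptions made for illustration; the paper's ordering-preserving loss and its further modifications to CP classification are not shown.

import numpy as np

# Illustrative sketch of regression-as-classification conformal prediction.
# Assumptions (not from the paper): equal-width bins over the training targets,
# a classifier that outputs per-bin probabilities, and the standard
# split-conformal score 1 - p(true bin). The ordering-aware loss is omitted.

def make_bins(y_train, n_bins=50):
    # Discretize the continuous target range into equal-width bins.
    edges = np.linspace(y_train.min(), y_train.max(), n_bins + 1)
    centers = (edges[:-1] + edges[1:]) / 2
    return edges, centers

def calibrate(probs_cal, y_cal, edges, alpha=0.1):
    # Compute the conformal threshold on a held-out calibration split.
    bins = np.clip(np.digitize(y_cal, edges) - 1, 0, len(edges) - 2)
    scores = 1.0 - probs_cal[np.arange(len(y_cal)), bins]
    n = len(scores)
    level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    return np.quantile(scores, level, method="higher")

def predict_sets(probs_test, centers, q):
    # Keep every bin center whose score 1 - p(bin) falls below the threshold;
    # the retained centers form the prediction set in the output space.
    keep = (1.0 - probs_test) <= q
    return [centers[row] for row in keep]

Under exchangeability, the resulting sets cover the true output with probability at least 1 - alpha, the usual split-conformal guarantee; how to turn the retained bin centers into intervals (or unions of intervals for multimodal outputs) is left as a design choice in this sketch.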
