Out-of-distribution detection with likelihoods assigned by deep generative models using multimodal prior distributions

Ryo Kamoi, Kei Kobayashi

Research output: Contribution to journal › Conference article › peer-review

Abstract

Modern machine learning systems can exhibit undesirable and unpredictable behavior in response to out-of-distribution inputs, so out-of-distribution detection is an active subfield of safe AI. Probability density estimation is a popular approach to out-of-distribution detection for low-dimensional data. For high-dimensional data, however, recent work has reported that deep generative models can assign higher likelihoods to out-of-distribution data than to their training data. We propose a new method for detecting out-of-distribution inputs using deep generative models with multimodal prior distributions. Our experimental results show that our models trained on Fashion-MNIST successfully assign lower likelihoods to MNIST and function as out-of-distribution detectors.
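The likelihood-thresholding idea underlying the abstract can be illustrated in miniature. The sketch below is not the paper's method (which uses deep generative models with multimodal priors on image data); it substitutes a univariate Gaussian density as a stand-in generative model and flags inputs whose log-likelihood falls below a quantile of the training scores. All function names and the `alpha` quantile are illustrative assumptions.

```python
import math
import statistics

def fit_gaussian(samples):
    # Stand-in density model: estimate mean and std of in-distribution data.
    mu = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
    return mu, sigma

def log_likelihood(x, mu, sigma):
    # Log-density of a univariate Gaussian at x.
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

def ood_detector(train, alpha=0.05):
    # Threshold at the alpha-quantile of training log-likelihoods:
    # inputs scoring below it are flagged as out-of-distribution.
    mu, sigma = fit_gaussian(train)
    scores = sorted(log_likelihood(x, mu, sigma) for x in train)
    threshold = scores[int(alpha * len(scores))]
    return lambda x: log_likelihood(x, mu, sigma) < threshold

# In-distribution data clustered around 0; a distant point should be flagged.
train = [0.1 * i - 1.0 for i in range(21)]  # values in [-1.0, 1.0]
is_ood = ood_detector(train)
print(is_ood(0.0))   # → False (near the training data)
print(is_ood(10.0))  # → True (far from the training data)
```

The paper's observation is precisely that this recipe can fail for high-dimensional data, where a deep generative model may score out-of-distribution inputs *higher* than training data; the multimodal prior is proposed to mitigate that failure.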

Original language: English
Pages (from-to): 113-116
Number of pages: 4
Journal: CEUR Workshop Proceedings
Volume: 2560
Publication status: Published - 2020
Event: 2020 Workshop on Artificial Intelligence Safety, SafeAI 2020 - New York, United States
Duration: 2020 Feb 7 → …

ASJC Scopus subject areas

  • Computer Science (all)

