International Journal of Computer & Software Engineering Volume 6 (2021), Article ID 6:IJCSE-165, 7 pages
https://doi.org/10.15344/2456-4451/2021/165
Research Article
Special Issue: Computational Analysis and Modeling
Improving Accuracy of Out-of-Distribution Detection and In-Distribution Classification by Incorporating JSD Consistency Loss

Kaiyu Suzuki and Tomofumi Matsuzawa*

Department of Information Sciences, Tokyo University of Science, Chiba, Japan
*Corresponding author: Prof. Tomofumi Matsuzawa, Department of Information Sciences, Tokyo University of Science, Chiba 278-8510, Japan; E-mail: t-matsu@is.noda.tus.ac.jp
Received: 07 June 2021; Accepted: 23 June 2021; Published: 25 June 2021
Citation: Suzuki K, Matsuzawa T (2021) Improving Accuracy of Out-of-Distribution Detection and In-Distribution Classification by Incorporating JSD Consistency Loss. Int J Comput Softw Eng 6: 165. doi: https://doi.org/10.15344/2456-4451/2021/165

Abstract

Out-of-distribution (OOD) detection, the identification of samples not drawn from the training distribution, is essential for improving the reliability of deep learning. Recent methods based on unsupervised representation learning achieve high OOD detection accuracy; however, their in-distribution (IN-D) classification accuracy is reduced. This degradation is caused by the cross-entropy loss, which trains the network to predict shifting transformations (such as rotation angles) for OOD detection. This objective conflicts with the consistency principle of representation learning, namely that differently augmented views of the same sample should share the same representation. To avoid this conflict, we add a Jensen–Shannon divergence (JSD) consistency loss. To demonstrate its effectiveness for both OOD detection and IN-D classification, we apply it to contrasting shifted instances (CSI), a method built on state-of-the-art representation learning. Our experiments demonstrate that the JSD consistency loss outperforms existing methods in both OOD detection and IN-D classification on unlabeled multi-class datasets.
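As a concrete illustration, the JSD consistency term can be computed between the predictive distributions of two augmented views of the same batch. The following is a minimal PyTorch sketch under our reading of the abstract; the function name jsd_consistency_loss and the arguments logits_a and logits_b are illustrative, not taken from the paper's code.

import torch
import torch.nn.functional as F

def jsd_consistency_loss(logits_a, logits_b):
    # Predictive distributions of two differently augmented
    # views of the same samples.
    p = F.softmax(logits_a, dim=1)
    q = F.softmax(logits_b, dim=1)
    m = 0.5 * (p + q)  # mixture distribution
    # JSD(p, q) = 0.5 * KL(p || m) + 0.5 * KL(q || m);
    # F.kl_div expects log-probabilities as its first argument.
    kl_pm = F.kl_div(m.log(), p, reduction="batchmean")
    kl_qm = F.kl_div(m.log(), q, reduction="batchmean")
    return 0.5 * (kl_pm + kl_qm)

In training, this term would be weighted and added to the shifting-transformation cross-entropy, e.g. loss = ce_loss + lambda_jsd * jsd_consistency_loss(logits_a, logits_b), where lambda_jsd is a hypothetical balancing hyperparameter.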