Residential College | false
Status | Published
An Adaptive Deep Metric Learning Loss Function for Class-Imbalance Learning via Intraclass Diversity and Interclass Distillation
Du, Jie1; Zhang, Xiaoci1; Liu, Peng2; Vong, Chi Man2; Wang, Tianfu1
2023
Source Publication | IEEE Transactions on Neural Networks and Learning Systems |
ISSN | 2162-237X |
Pages | 1-15 |
Abstract | Deep metric learning (DML) has been widely applied in various tasks (e.g., medical diagnosis and face recognition) because it extracts discriminant features by reducing data overlap. In practice, however, these tasks also suffer from two class-imbalance learning (CIL) problems, data scarcity and data density, which cause misclassification. Existing DML losses rarely consider these two issues, while CIL losses cannot reduce data overlap or data density. Designing a loss function that mitigates all three issues simultaneously is a great challenge; this is the objective of the intraclass diversity and interclass distillation (IDID) loss with adaptive weight proposed in this article. IDID-loss generates diverse features within classes regardless of the class sample size (alleviating data scarcity and data density) and simultaneously preserves the semantic correlations between classes using a learnable similarity when pushing different classes away from each other (reducing overlap). In summary, IDID-loss offers three advantages: 1) it mitigates all three issues simultaneously, which DML and CIL losses cannot; 2) it generates more diverse and discriminant feature representations with higher generalization ability than DML losses; and 3) it yields a larger improvement on classes affected by data scarcity and density at a smaller cost in easy-class accuracy than CIL losses. Experimental results on seven public real-world datasets show that IDID-loss achieves the best G-mean, F1-score, and accuracy compared with both state-of-the-art (SOTA) DML and CIL losses. In addition, it avoids the time-consuming fine-tuning of the loss function's hyperparameters.
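The abstract describes the mechanism only at a high level. For orientation, below is a minimal PyTorch sketch of how an intraclass-diversity term and an interclass-distillation term of the kind described might be combined with a standard discriminative DML objective. This is not the authors' implementation: the prototype-based interclass similarity, the fixed `diversity_weight` (the paper's weight is adaptive), the temperature, and all names are assumptions made purely for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class IDIDLossSketch(nn.Module):
    """Illustrative sketch of a DML loss combining a discriminative term,
    an intraclass-diversity penalty, and an interclass-distillation term.
    NOT the authors' method; the exact term forms are assumptions."""

    def __init__(self, num_classes: int, feat_dim: int, diversity_weight: float = 0.1):
        super().__init__()
        # Learnable class prototypes; their pairwise similarity stands in
        # for the paper's learnable interclass similarity (an assumption).
        self.prototypes = nn.Parameter(torch.randn(num_classes, feat_dim))
        # Fixed weight here for simplicity; the paper's weight is adaptive.
        self.diversity_weight = diversity_weight

    def forward(self, feats: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        feats = F.normalize(feats, dim=1)
        protos = F.normalize(self.prototypes, dim=1)
        logits = feats @ protos.t()  # cosine similarity of each sample to each class

        # Discriminative term: pull samples toward their own class prototype,
        # push them away from the other prototypes.
        ce = F.cross_entropy(logits, labels)

        # Intraclass diversity: discourage same-class features from collapsing
        # onto a single point by penalizing their mean pairwise similarity,
        # independently of how many samples the class has.
        sim = feats @ feats.t()
        same = labels.unsqueeze(0) == labels.unsqueeze(1)
        same.fill_diagonal_(False)
        diversity = sim[same].mean() if same.any() else sim.new_zeros(())

        # Interclass distillation: soften the push between semantically close
        # classes by matching the logits to the prototype-similarity structure.
        with torch.no_grad():
            soft = F.softmax((protos @ protos.t()) / 0.1, dim=1)[labels]
        distill = F.kl_div(F.log_softmax(logits / 0.1, dim=1), soft, reduction="batchmean")

        return ce + self.diversity_weight * diversity + distill


if __name__ == "__main__":
    # Toy usage: feats would come from an embedding backbone in practice.
    loss_fn = IDIDLossSketch(num_classes=10, feat_dim=128)
    feats = torch.randn(32, 128)
    labels = torch.randint(0, 10, (32,))
    loss = loss_fn(feats, labels)
    loss.backward()
```

The diversity penalty acts uniformly across classes, which is one plausible reading of "regardless of the class sample size"; the distillation target preserves interclass correlations so that semantically related classes are not pushed arbitrarily far apart.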
Keyword | Class-imbalance learning (CIL) ; Correlation ; Deep metric learning (DML) ; Diverse and discriminant feature ; Face recognition ; Feature extraction ; Learning systems ; Loss function with adaptive weights ; Semantic correlations between classes ; Semantics ; Task analysis ; Ultrasonic imaging
DOI | 10.1109/TNNLS.2023.3286484 |
Indexed By | SCIE |
Language | English
WOS Research Area | Computer Science ; Engineering |
WOS Subject | Computer Science, Artificial Intelligence ; Computer Science, Hardware & Architecture ; Computer Science, Theory & Methods ; Engineering, Electrical & Electronic |
WOS ID | WOS:001025538200001 |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Scopus ID | 2-s2.0-85163501117 |
Document Type | Journal article |
Collection | University of Macau |
Affiliation | 1. Guangdong Key Laboratory for Biomedical Measurements and Ultrasound Imaging, School of Biomedical Engineering, Medical School, National-Regional Key Technology Engineering Laboratory for Medical Ultrasound, Shenzhen University, Shenzhen, China 2. Department of Computer and Information Science, University of Macau, Macau SAR, China
Recommended Citation GB/T 7714 | Du, Jie, Zhang, Xiaoci, Liu, Peng, et al. An Adaptive Deep Metric Learning Loss Function for Class-Imbalance Learning via Intraclass Diversity and Interclass Distillation[J]. IEEE Transactions on Neural Networks and Learning Systems, 2023: 1-15.
APA | Du, Jie, Zhang, Xiaoci, Liu, Peng, Vong, Chi Man, & Wang, Tianfu (2023). An Adaptive Deep Metric Learning Loss Function for Class-Imbalance Learning via Intraclass Diversity and Interclass Distillation. IEEE Transactions on Neural Networks and Learning Systems, 1-15.
MLA | Du, Jie, et al. "An Adaptive Deep Metric Learning Loss Function for Class-Imbalance Learning via Intraclass Diversity and Interclass Distillation." IEEE Transactions on Neural Networks and Learning Systems (2023): 1-15.
Files in This Item: | There are no files associated with this item. |