Singh, A.[Abhishek],
Pokharel, R.[Rosha],
Principe, J.C.[Jose C.],
The C-loss function for pattern classification,
PR(47), No. 1, 2014, pp. 441-453.
Elsevier DOI
1310
Correntropy-based loss for neural network classification (see the sketch below).
BibRef
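The C-loss above is a bounded, correntropy-induced alternative to the square or hinge loss. A minimal sketch, assuming the standard Gaussian-kernel (Welsch-type) form on the error e = 1 - y*score with labels in {-1, +1}; the kernel width sigma and the normalization constant are illustrative choices, not necessarily the paper's exact ones:

```python
import torch

def c_loss(scores, targets, sigma=0.5):
    """Correntropy-induced loss (sketch): bounded, saturating for large errors."""
    e = 1.0 - targets * scores                          # classification error, targets in {-1, +1}
    kernel = torch.exp(-e ** 2 / (2.0 * sigma ** 2))    # Gaussian kernel evaluated on the error
    beta = 1.0 / (1.0 - torch.exp(torch.tensor(-1.0 / (2.0 * sigma ** 2))))  # scales the loss to 1 at e = 1
    return (beta * (1.0 - kernel)).mean()
```

Because the loss saturates, large-error (outlier) samples contribute only a bounded gradient, which is the usual robustness argument for correntropy-based criteria.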
Liao, Z.B.[Zhi-Bin],
Carneiro, G.[Gustavo],
A deep convolutional neural network module that promotes competition
of multiple-size filters,
PR(71), No. 1, 2017, pp. 94-105.
Elsevier DOI
1707
BibRef
Earlier:
The use of deep learning features in a hierarchical classifier
learned with the minimization of a non-greedy loss function that
delays gratification,
ICIP15(4540-4544)
IEEE DOI
1512
Deep learning
BibRef
Agarwal, N.[Nakul],
Balasubramanian, V.N.[Vineeth N.],
Jawahar, C.V.,
Improving multiclass classification by deep networks using DAGSVM and
Triplet Loss,
PRL(112), 2018, pp. 184-190.
Elsevier DOI
1809
Multiclass classification, Deep networks, DAGSVM, Triplet loss
BibRef
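For reference, the triplet loss that this and several later entries build on is the standard margin formulation sketched below; the paper's coupling with DAGSVM is not shown, and the margin value is illustrative:

```python
import torch.nn.functional as F

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Standard triplet margin loss: pull anchor toward positive, push it away from negative."""
    d_ap = F.pairwise_distance(anchor, positive)   # (batch,) anchor-positive distances
    d_an = F.pairwise_distance(anchor, negative)   # (batch,) anchor-negative distances
    return F.relu(d_ap - d_an + margin).mean()
```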
Bazi, Y.[Yakoub],
Rahhal, M.M.A.[Mohamad M. Al],
Alhichri, H.[Haikel],
Alajlan, N.[Naif],
Simple Yet Effective Fine-Tuning of Deep CNNs Using an Auxiliary
Classification Loss for Remote Sensing Scene Classification,
RS(11), No. 24, 2019, pp. xx-yy.
DOI Link
1912
BibRef
Yuan, Q.Y.[Qun-Yong],
Xiao, N.F.[Nan-Feng],
Experimental exploration on loss surface of deep neural network,
IJIST(30), No. 4, 2020, pp. 860-873.
DOI Link
2011
The loss function of a deep neural network is high-dimensional,
nonconvex and complex.
Loss surface of deep neural network,
Hessian matrix, deep neural network, ensemble learning
BibRef
Yan, Y.,
Hao, H.,
Xu, B.,
Zhao, J.,
Shen, F.,
Image Clustering via Deep Embedded Dimensionality Reduction and
Probability-Based Triplet Loss,
IP(29), 2020, pp. 5652-5661.
IEEE DOI
2005
Dimensionality reduction, Feature extraction, Loss measurement,
Clustering algorithms, Unsupervised learning, Manifolds
BibRef
Li, C.J.[Cui-Jin],
Qu, Z.[Zhong],
Wang, S.Y.[Sheng-Ye],
Liu, L.[Ling],
A method of cross-layer fusion multi-object detection and recognition
based on improved faster R-CNN model in complex traffic environment,
PRL(145), 2021, pp. 127-134.
Elsevier DOI
2104
Multi-object detection, Multi-object recognition, Faster R-CNN,
Weighted balanced multi-class cross entropy loss function
BibRef
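The weighted balanced multi-class cross-entropy keyword above refers to per-class weighting of the usual cross-entropy. A generic sketch using inverse-frequency weights, which are an assumption for illustration rather than the paper's exact balancing scheme:

```python
import torch
import torch.nn.functional as F

def balanced_cross_entropy(logits, labels, class_counts):
    """Cross-entropy with per-class weights derived from class frequencies."""
    counts = class_counts.float()
    weights = counts.sum() / (len(counts) * counts)    # inverse-frequency weighting
    return F.cross_entropy(logits, labels, weight=weights)
```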
Seo, H.,
Bassenne, M.,
Xing, L.,
Closing the Gap Between Deep Neural Network Modeling and Biomedical
Decision-Making Metrics in Segmentation via Adaptive Loss Functions,
MedImg(40), No. 2, February 2021, pp. 585-593.
IEEE DOI
2102
Training, Neural networks, Measurement, Adaptation models,
Decision making, Deep learning, Harmonic analysis, Segmentation
BibRef
Martínez-Cortés, T.[Tomás],
González-Díaz, I.[Iván],
Díaz-de-María, F.[Fernando],
Training deep retrieval models with noisy datasets:
Bag exponential loss,
PR(112), 2021, pp. 107811.
Elsevier DOI
2102
Image retrieval, Noise, Multiple instance learning, Loss functions
BibRef
Zadeh, S.G.[Shekoufeh Gorgi],
Schmid, M.[Matthias],
Bias in Cross-Entropy-Based Training of Deep Survival Networks,
PAMI(43), No. 9, September 2021, pp. 3126-3137.
IEEE DOI
2108
Training, Hazards, Mathematical model, Entropy, Power measurement,
Indexes, Neural networks, Cross-entropy loss,
negative log-likelihood loss
BibRef
Tian, Y.[Ye],
Dong, Y.X.[Yu-Xin],
Yin, G.S.[Gui-Sheng],
Early Labeled and Small Loss Selection Semi-Supervised Learning
Method for Remote Sensing Image Scene Classification,
RS(13), No. 20, 2021, pp. xx-yy.
DOI Link
2110
BibRef
Deng, W.,
Zheng, L.,
Sun, Y.,
Jiao, J.,
Rethinking Triplet Loss for Domain Adaptation,
CirSysVideo(31), No. 1, January 2021, pp. 29-37.
IEEE DOI
2101
Semantics, Feature extraction, Measurement, Adaptation models,
Data models, Image color analysis, Sun, Domain adaptation,
semantic alignment
BibRef
Lyu, S.W.[Si-Wei],
Fan, Y.B.[Yan-Bo],
Ying, Y.M.[Yi-Ming],
Hu, B.G.[Bao-Gang],
Average Top-k Aggregate Loss for Supervised Learning,
PAMI(44), No. 1, January 2022, pp. 76-86.
IEEE DOI
2112
Aggregates, Training, Training data, Supervised learning,
Data models, Loss measurement, Task analysis, Aggregate loss,
learning theory
BibRef
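The average top-k (ATk) aggregate loss averages only the k largest per-sample losses, interpolating between the mean loss (k = n) and the maximum loss (k = 1). A minimal sketch:

```python
import torch

def average_top_k(per_sample_losses, k):
    """ATk aggregate loss: mean of the k largest individual losses in the batch."""
    top_k, _ = torch.topk(per_sample_losses, k)
    return top_k.mean()
```

Concentrating the aggregate on the hardest k samples is what drives the robustness and learning-theory trade-offs studied in the paper.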
Cao, Y.Z.[Yu-Zhou],
Liu, S.Q.[Shu-Qi],
Xu, Y.T.[Yi-Tian],
Multi-complementary and unlabeled learning for arbitrary losses and
models,
PR(124), 2022, pp. 108447.
Elsevier DOI
2203
Multi-complementary, Unlabeled learning,
Empirical risk minimization, Unbiased estimator, Classification
BibRef
Zhang, Z.X.[Zhao-Xiang],
Luo, C.C.[Chuan-Chen],
Wu, H.P.[Hai-Ping],
Chen, Y.T.[Yun-Tao],
Wang, N.Y.[Nai-Yan],
Song, C.F.[Chun-Feng],
From Individual to Whole:
Reducing Intra-class Variance by Feature Aggregation,
IJCV(130), No. 3, March 2022, pp. 800-819.
Springer DOI
2203
Learned model: different viewpoints can correspond to different models.
BibRef
Murasaki, K.[Kazuhiko],
Ando, S.[Shingo],
Shimamura, J.[Jun],
Semi-Supervised Representation Learning via Triplet Loss Based on
Explicit Class Ratio of Unlabeled Data,
IEICE(E105-D), No. 4, April 2022, pp. 778-784.
WWW Link.
2204
BibRef
Wu, S.K.[Sheng-Kai],
Yang, J.R.[Jin-Rong],
Wang, X.G.[Xing-Gang],
Li, X.P.[Xiao-Ping],
IoU-Balanced loss functions for single-stage object detection,
PRL(156), 2022, pp. 96-103.
Elsevier DOI
2205
IoU-Balanced classification loss,
IoU-Balanced localization loss, Object detection, Example mining
BibRef
Mehta, D.[Dhagash],
Chen, T.R.[Tian-Ran],
Tang, T.T.[Ting-Ting],
Hauenstein, J.D.[Jonathan D.],
The Loss Surface of Deep Linear Networks Viewed Through the Algebraic
Geometry Lens,
PAMI(44), No. 9, September 2022, pp. 5664-5680.
IEEE DOI
2208
Geometry, Mathematical model, Deep learning, Analytical models,
Upper bound, Neurons, Task analysis, Deep linear network,
numerical algebraic geometry
BibRef
Zhang, Q.[Qiang],
Yang, J.B.[Ji-Bin],
Zhang, X.W.[Xiong-Wei],
Cao, T.Y.[Tie-Yong],
SO-softmax loss for discriminable embedding learning in CNNs,
PR(131), 2022, pp. 108877.
Elsevier DOI
2208
Convolutional neural networks, Cosine similarity,
Cross entropy loss, Quadratic transformation, Softmax
BibRef
Oner, D.[Doruk],
Kozinski, M.[Mateusz],
Citraro, L.[Leonardo],
Dadap, N.C.[Nathan C.],
Konings, A.G.[Alexandra G.],
Fua, P.[Pascal],
Promoting Connectivity of Network-Like Structures by Enforcing Region
Separation,
PAMI(44), No. 9, September 2022, pp. 5401-5413.
IEEE DOI
2208
Deep networks on network-like structures.
Roads, Irrigation, Training, Image reconstruction,
Image segmentation, Annotations, Topology, connectivity
BibRef
Yao, Q.M.[Quan-Ming],
Yang, H.[Hansi],
Hu, E.L.[En-Liang],
Kwok, J.T.[James T.],
Efficient Low-Rank Semidefinite Programming With Robust Loss
Functions,
PAMI(44), No. 10, October 2022, pp. 6153-6168.
IEEE DOI
2209
Optimization, Convex functions, Convergence, Robustness,
Machine learning algorithms, Sparse matrices, Symmetric matrices,
alternating direction method of multipliers
BibRef
Marchetti, F.,
Guastavino, S.,
Piana, M.,
Campi, C.,
Score-Oriented Loss (SOL) functions,
PR(132), 2022, pp. 108913.
Elsevier DOI
2209
Supervised machine learning, Binary classification,
Loss functions, Skill scores
BibRef
Clough, J.R.[James R.],
Byrne, N.[Nicholas],
Oksuz, I.[Ilkay],
Zimmer, V.A.[Veronika A.],
Schnabel, J.A.[Julia A.],
King, A.P.[Andrew P.],
A Topological Loss Function for Deep-Learning Based Image
Segmentation Using Persistent Homology,
PAMI(44), No. 12, December 2022, pp. 8766-8778.
IEEE DOI
2212
Image segmentation, Topology, Shape, Training, Loss measurement,
Neural networks, Network topology, Segmentation,
convolutional neural networks
BibRef
Charte, D.[David],
Charte, F.[Francisco],
Herrera, F.[Francisco],
Reducing Data Complexity Using Autoencoders With Class-Informed Loss
Functions,
PAMI(44), No. 12, December 2022, pp. 9549-9560.
IEEE DOI
2212
Complexity theory, Feature extraction, Measurement, Shape,
Support vector machines, Data models, Transforms, Autoencoders,
data complexity
BibRef
Elezi, I.[Ismail],
Seidenschwarz, J.[Jenny],
Wagner, L.[Laurin],
Vascon, S.[Sebastiano],
Torcinovich, A.[Alessandro],
Pelillo, M.[Marcello],
Leal-Taixé, L.[Laura],
The Group Loss++: A Deeper Look Into Group Loss for Deep Metric
Learning,
PAMI(45), No. 2, February 2023, pp. 2505-2518.
IEEE DOI
2301
Measurement, Convolutional neural networks, Task analysis,
Neural networks, Image retrieval, Training,
image clustering
BibRef
Liu, G.Q.[Guo-Qi],
Bai, L.[Lu],
Li, J.L.[Jun-Lin],
Li, X.S.[Xu-Sheng],
Ru, L.Y.[Lin-Yuan],
Chang, B.F.[Bao-Fang],
Dynamically adaptive adjustment loss function biased towards
few-class learning,
IET-IPR(17), No. 2, 2023, pp. 627-635.
DOI Link
2302
BibRef
Tang, H.[Hao],
Zhao, G.S.[Guo-Shuai],
Wu, Y.X.[Yu-Xia],
Qian, X.M.[Xue-Ming],
Multisample-Based Contrastive Loss for Top-K Recommendation,
MultMed(25), 2023, pp. 339-351.
IEEE DOI
2302
Business process re-engineering, Training, Task analysis, Faces,
Entropy, Convolution, Measurement, Contrastive loss,
graph convolution network
BibRef
Patterson, A.[Andrew],
Liao, V.[Victor],
White, M.[Martha],
Robust Losses for Learning Value Functions,
PAMI(45), No. 5, May 2023, pp. 6157-6167.
IEEE DOI
2304
Approximation algorithms, Optimization, Function approximation,
Prediction algorithms, Visualization, Tuning
BibRef
Wang, L.[Le],
Zhou, M.[Mo],
Niu, Z.X.[Zhen-Xing],
Zhang, Q.L.[Qi-Lin],
Zheng, N.N.[Nan-Ning],
Adaptive Ladder Loss for Learning Coherent Visual-Semantic Embedding,
MultMed(25), 2023, pp. 1133-1147.
IEEE DOI
2305
Measurement, Visualization, Semantics, Training, Loss measurement,
User experience, Extraterrestrial measurements,
coherent score
BibRef
Choi, S.M.[Seung-Min],
Lee, S.I.[Seung-Ik],
Lee, J.Y.[Jae-Yeong],
Kweon, I.S.[In So],
Semantic-guided de-attention with sharpened triplet marginal loss for
visual place recognition,
PR(141), 2023, pp. 109645.
Elsevier DOI
2306
Visual place recognition, Image retrieval,
Triplet marginal loss, Attention, De-attention, Semantic segmentation
BibRef
Zhou, X.[Xiong],
Liu, X.M.[Xian-Ming],
Zhai, D.[Deming],
Jiang, J.J.[Jun-Jun],
Ji, X.Y.[Xiang-Yang],
Asymmetric Loss Functions for Noise-Tolerant Learning:
Theory and Applications,
PAMI(45), No. 7, July 2023, pp. 8094-8109.
IEEE DOI
2306
Noise measurement, Task analysis, Training, Loss measurement,
Image denoising, Noise robustness, Classification,
regression
BibRef
Zhang, K.[Kai],
Song, C.Y.[Cheng-Yun],
Qiu, L.P.[Lian-Peng],
Self-paced deep clustering with learning loss,
PRL(171), 2023, pp. 8-14.
Elsevier DOI
2306
Deep clustering, Self-paced learning, Loss prediction
BibRef
Hu, S.[Shu],
Wang, X.[Xin],
Lyu, S.W.[Si-Wei],
Rank-Based Decomposable Losses in Machine Learning: A Survey,
PAMI(45), No. 11, November 2023, pp. 13599-13620.
IEEE DOI
2310
BibRef
Raymond, C.[Christian],
Chen, Q.[Qi],
Xue, B.[Bing],
Zhang, M.J.[Meng-Jie],
Learning Symbolic Model-Agnostic Loss Functions via Meta-Learning,
PAMI(45), No. 11, November 2023, pp. 13699-13714.
IEEE DOI
2310
BibRef
Luo, M.J.[Meng-Jiang],
Min, W.Q.[Wei-Qing],
Wang, Z.L.[Zhi-Ling],
Song, J.J.[Jia-Jun],
Jiang, S.Q.[Shu-Qiang],
Ingredient Prediction via Context Learning Network with
Class-Adaptive Asymmetric Loss,
IP(32), 2023, pp. 5509-5523.
IEEE DOI Code:
HTML Version.
2310
BibRef
Wang, G.Z.[Guang-Zhi],
Guo, Y.Y.[Yang-Yang],
Xu, Z.W.[Zi-Wei],
Wong, Y.K.[Yong-Kang],
Kankanhalli, M.S.[Mohan S.],
Semantic-Aware Triplet Loss for Image Classification,
MultMed(25), 2023, pp. 4563-4572.
IEEE DOI
2310
BibRef
Jiang, S.W.[Shen-Wang],
Li, J.A.[Jian-An],
Zhang, J.Z.[Ji-Zhou],
Wang, Y.[Ying],
Xu, T.F.[Ting-Fa],
Dynamic Loss for Robust Learning,
PAMI(45), No. 12, December 2023, pp. 14420-14434.
IEEE DOI
2311
BibRef
Wang, H.J.[Hua-Jun],
Shao, Y.H.[Yuan-Hai],
Fast generalized ramp loss support vector machine for pattern
classification,
PR(146), 2024, pp. 109987.
Elsevier DOI
2311
Generalized ramp loss, SVM, P-stationary point,
proximal operator, Working set, support vectors
BibRef
Zhang, S.Q.[Shuang-Qing],
Li, C.L.[Cheng-Long],
Jia, Z.[Zhen],
Liu, L.[Lei],
Zhang, Z.[Zhang],
Wang, L.[Liang],
Diag-IoU Loss for Object Detection,
CirSysVideo(33), No. 12, December 2023, pp. 7671-7683.
IEEE DOI
2312
BibRef
Wu, T.T.[Ting-Ting],
Ding, X.[Xiao],
Zhang, H.[Hao],
Gao, J.L.[Jing-Long],
Tang, M.J.[Min-Ji],
Du, L.[Li],
Qin, B.[Bing],
Liu, T.[Ting],
DiscrimLoss: A Universal Loss for Hard Samples and Incorrect Samples
Discrimination,
MultMed(26), 2024, pp. 1957-1968.
IEEE DOI
2402
Training, Estimation, Task analysis, Switches, Predictive models,
Noise measurement, Data models, Machine learning, deep learning,
robust methods
BibRef
Li, X.[Xue],
Yu, J.[Jiong],
Jiang, S.C.[Shao-Chen],
Lu, H.C.[Hong-Chun],
Li, Z.Y.[Zi-Yang],
MSViT: Training Multiscale Vision Transformers for Image Retrieval,
MultMed(26), 2024, pp. 2809-2823.
IEEE DOI
2402
Transformers, Image retrieval, Feature extraction, Task analysis,
Computational modeling, Codes, Training,
triplet loss function
BibRef
Wang, X.S.[Xiao-Shun],
Li, Y.H.[Yun-Han],
Zhang, X.L.[Xiang-Liang],
Improved triplet loss for domain adaptation,
IET-CV(18), No. 1, 2024, pp. 84-96.
DOI Link
2403
image classification
BibRef
Guo, K.Y.[Kai-Yu],
Lovell, B.C.[Brian C.],
Domain-aware triplet loss in domain generalization,
CVIU(243), 2024, pp. 103979.
Elsevier DOI Code:
WWW Link.
2405
Domain generalization, Contrastive learning, Domain dispersion
BibRef
Li, J.X.[Ji-Xing],
Guo, X.Z.[Xiao-Zhou],
Dai, B.[Benzhe],
Gong, G.L.[Guo-Liang],
Jin, M.[Min],
Chen, G.[Gang],
Mao, W.Y.[Wen-Yu],
Lu, H.X.[Hua-Xiang],
ACQ: Improving generative data-free quantization via attention
correction,
PR(152), 2024, pp. 110444.
Elsevier DOI
2405
Generative data-free quantization, Homogenous attention,
Attention center matching, Adversarial loss, Consistency penalty
BibRef
Akhtar, M.[Mushir],
Tanveer, M.,
Arshad, M.[Mohd.],
RoBoSS: A Robust, Bounded, Sparse, and Smooth Loss Function for
Supervised Learning,
PAMI(47), No. 1, January 2025, pp. 149-160.
IEEE DOI
2412
Support vector machines, Fasteners, Robustness, Training,
Supervised learning, Optimization, Biological system modeling,
support vector machine (SVM)
BibRef
Ribeiro, E.S.[Eduardo S.],
Araújo, L.R.G.[Lourenço R.G.],
Chaves, G.T.L.[Gabriel T.L.],
Braga, A.P.[Antônio P.],
Distance-based loss function for deep feature space learning of
convolutional neural networks,
CVIU(249), 2024, pp. 104184.
Elsevier DOI
2412
Loss function, Convolutional neural network,
Distances matrices, Feature extraction
BibRef
Hai, Z.Y.[Zhao-Yang],
Pan, L.Y.[Li-Yuan],
Liu, X.B.[Xia-Bi],
Han, M.Q.[Meng-Qiao],
L2T-DFM: Learning to Teach with Dynamic Fused Metric,
PR(159), 2025, pp. 111124.
Elsevier DOI
2412
Learning to teach, Dynamic loss function, Optimization
BibRef
Croitoru, F.A.[Florinel-Alin],
Ristea, N.C.[Nicolae-Catalin],
Ionescu, R.T.[Radu Tudor],
Sebe, N.[Nicu],
Learning Rate Curriculum,
IJCV(133), No. 1, January 2025, pp. 291-314.
Springer DOI
2501
Code:
WWW Link.
BibRef
Croitoru, F.A.[Florinel-Alin],
Grigore, D.N.[Diana-Nicoleta],
Ionescu, R.T.[Radu Tudor],
Discriminability-enforcing loss to improve representation learning,
ECV22(2597-2601)
IEEE DOI
2210
Training, Representation learning, Impurities, Neural networks,
Transformers, Entropy
BibRef
Zhang, Z.M.[Zi-Ming],
Shao, Y.P.[Yu-Ping],
Zhang, Y.Q.[Yi-Qing],
Lin, F.Z.[Fang-Zhou],
Zhang, H.C.[Hai-Chong],
Rundensteiner, E.[Elke],
Deep Loss Convexification for Learning Iterative Models,
PAMI(47), No. 3, March 2025, pp. 1501-1513.
IEEE DOI
2502
Iterative methods, Point cloud compression, Training, Optimization,
Deep learning, Convergence, Shape, Surveys, Fasteners, Convexification,
multimodal image alignment
BibRef
Kimura, M.[Masanari],
Naganuma, H.[Hiroki],
Geometric insights into focal loss: Reducing curvature for enhanced
model calibration,
PRL(189), 2025, pp. 195-200.
Elsevier DOI
2503
Model calibration, Focal loss, Loss surface
BibRef
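The focal loss analyzed above down-weights well-classified examples through a modulating factor (1 - p_t)^gamma. A standard binary-classification sketch with illustrative gamma and alpha values:

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, targets, gamma=2.0, alpha=0.25):
    """Binary focal loss: -alpha_t * (1 - p_t)^gamma * log(p_t); targets are floats in {0, 1}."""
    p = torch.sigmoid(logits)
    p_t = p * targets + (1 - p) * (1 - targets)               # probability of the true class
    alpha_t = alpha * targets + (1 - alpha) * (1 - targets)   # class-dependent weighting
    ce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    return (alpha_t * (1 - p_t) ** gamma * ce).mean()
```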
Aketi, S.A.[Sai Aparna],
Roy, K.[Kaushik],
Cross-feature Contrastive Loss for Decentralized Deep Learning on
Heterogeneous Data,
WACV24(12-21)
IEEE DOI
2404
Training, Deep learning, Network topology, Computational modeling,
Distributed databases, Computer architecture, Algorithms
BibRef
Wang, Z.[Zijia],
Yang, W.B.[Wen-Bin],
Liu, Z.S.[Zhi-Song],
Ni, J.C.[Jia-Cheng],
Chen, Q.[Qiang],
Jia, Z.[Zhen],
The Oil and Water Separation Phenomenon Inspired Loss for Feature
Learning,
ICIP23(1510-1514)
IEEE DOI
2312
BibRef
Fang, F.[Fen],
Hoang, N.M.[Nhat M.],
Xu, Q.L.[Qian-Li],
Lim, J.H.[Joo-Hwee],
Data Augmentation Using Corner CutMix and an Auxiliary
Self-Supervised Loss,
ICIP23(830-834)
IEEE DOI
2312
BibRef
Aburaed, N.[Nour],
Alkhatib, M.Q.[Mohammed Q.],
Marshall, S.[Stephen],
Zabalza, J.[Jaime],
Al Ahmad, H.[Hussain],
Bayesian Hybrid Loss for Hyperspectral SISR Using 3D Wide Residual
CNN,
ICIP23(2115-2119)
IEEE DOI Code:
WWW Link.
2312
BibRef
Uyanik, K.[Korcan],
Yeganli, S.F.[S. Faegheh],
Bajic, I.V.[Ivan V.],
Grad-FEC:
Unequal Loss Protection of Deep Features in Collaborative Intelligence,
ICIP23(3140-3144)
IEEE DOI
2312
BibRef
Yu, Y.[Yunrui],
Xu, C.Z.[Cheng-Zhong],
Efficient Loss Function by Minimizing the Detrimental Effect of
Floating-Point Errors on Gradient-Based Attacks,
CVPR23(4056-4066)
IEEE DOI
2309
BibRef
Bueno-Benito, E.B.[Elena Belén],
Vecino, B.T.[Biel Tura],
Dimiccoli, M.[Mariella],
Leveraging triplet loss for unsupervised action segmentation,
L3D-IVU23(4922-4930)
IEEE DOI
2309
BibRef
Puttagunta, R.S.[Raghunath Sai],
Li, Z.[Zhu],
Bhattacharyya, S.[Shuvra],
York, G.[George],
Appearance Label Balanced Triplet Loss for Multi-modal Aerial View
Object Classification,
PBVS23(534-542)
IEEE DOI
2309
BibRef
Yang, R.[Rui],
Vo, D.M.[Duc Minh],
Nakayama, H.[Hideki],
Indirect Adversarial Losses via an Intermediate Distribution for
Training GANs,
WACV23(4641-4650)
IEEE DOI
2302
Measurement, Training, Generative adversarial networks, Generators,
Adversarial machine learning, Convergence
BibRef
Patel, Y.[Yash],
Tolias, G.[Giorgos],
Matas, J.G.[Jirí G.],
Recall@k Surrogate Loss with Large Batches and Similarity Mixup,
CVPR22(7492-7501)
IEEE DOI
2210
Measurement, Training, Visualization, Memory management,
Image retrieval, Graphics processing units, Benchmark testing,
Representation learning
BibRef
Li, H.[Hao],
Fu, T.W.[Tian-Wen],
Dai, J.F.[Ji-Feng],
Li, H.S.[Hong-Sheng],
Huang, G.[Gao],
Zhu, X.Z.[Xi-Zhou],
AutoLoss-Zero: Searching Loss Functions from Scratch for Generic
Tasks,
CVPR22(999-1008)
IEEE DOI
2210
Protocols, Codes, Evolutionary computation,
Extraterrestrial measurements,
Scene analysis and understanding
BibRef
Hebbalaguppe, R.[Ramya],
Prakash, J.[Jatin],
Madan, N.[Neelabh],
Arora, C.[Chetan],
A Stitch in Time Saves Nine: A Train-Time Regularizing Loss for
Improved Neural Network Calibration,
CVPR22(16060-16069)
IEEE DOI
2210
Training, Image segmentation, Neural networks, Semantics,
Natural languages, Picture archiving and communication systems,
privacy and ethics in vision
BibRef
Han, D.[Dasol],
Yoo, J.W.[Jae-Wook],
Oh, D.[Dokwan],
SeeThroughNet: Resurrection of Auxiliary Loss by Preserving Class
Probability Information,
CVPR22(4453-4462)
IEEE DOI
2210
Deep learning, Shape, Semantics, Transfer learning, Neural networks,
Object detection, Benchmark testing, Segmentation,
Scene analysis and understanding
BibRef
Chen, Y.M.[Yi-Ming],
Deligiannis, N.[Nikos],
Locally Accumulated Adam For Distributed Training With Sparse Updates,
ICIP23(2395-2399)
IEEE DOI
2312
BibRef
Abrahamyan, L.[Lusine],
Ziatchin, V.[Valentin],
Chen, Y.M.[Yi-Ming],
Deligiannis, N.[Nikos],
Bias Loss for Mobile Neural Networks,
ICCV21(6536-6546)
IEEE DOI
2203
Training, Computational modeling, Neural networks,
Benchmark testing, Data models,
Scene analysis and understanding
BibRef
Scott, T.R.[Tyler R.],
Gallagher, A.C.[Andrew C.],
Mozer, M.C.[Michael C.],
von Mises-Fisher Loss: An Exploration of Embedding Geometries for
Supervised Learning,
ICCV21(10592-10602)
IEEE DOI
2203
Geometry, Training, Systematics, Transfer learning,
Supervised learning, Stochastic processes, Predictive models,
Recognition and classification
BibRef
Warburg, F.[Frederik],
Jørgensen, M.[Martin],
Civera, J.[Javier],
Hauberg, S.[Søren],
Bayesian Triplet Loss: Uncertainty Quantification in Image Retrieval,
ICCV21(12138-12148)
IEEE DOI
2203
Uncertainty, Computational modeling, Image retrieval,
Stochastic processes, Bayes methods, Computational efficiency,
BibRef
Ranasinghe, K.[Kanchana],
Naseer, M.[Muzammal],
Hayat, M.[Munawar],
Khan, S.[Salman],
Khan, F.S.[Fahad Shahbaz],
Orthogonal Projection Loss,
ICCV21(12313-12323)
IEEE DOI
2203
Deep learning, Image recognition, Neural networks, Force,
Linear programming, Robustness,
Transfer/Low-shot/Semi/Unsupervised Learning
BibRef
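A sketch of an orthogonality-promoting loss in the spirit of the entry above: same-class features are pulled toward unit cosine similarity while cross-class features are pushed toward orthogonality (zero similarity). The exact weighting in the paper may differ:

```python
import torch
import torch.nn.functional as F

def orthogonal_projection_loss(features, labels):
    """Encourage intra-class alignment and inter-class orthogonality of L2-normalized features.

    Assumes the batch contains at least two classes and at least one same-class pair.
    """
    f = F.normalize(features, dim=1)
    sim = f @ f.t()                                     # pairwise cosine similarities
    same = labels.unsqueeze(0) == labels.unsqueeze(1)   # same-class mask
    eye = torch.eye(len(labels), dtype=torch.bool, device=labels.device)
    intra = sim[same & ~eye].mean()                     # mean same-class similarity (excluding self-pairs)
    inter = sim[~same].mean()                           # mean cross-class similarity
    return (1.0 - intra) + inter.abs()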
Samuel, D.[Dvir],
Chechik, G.[Gal],
Distributional Robustness Loss for Long-tail Learning,
ICCV21(9475-9484)
IEEE DOI
2203
Training, Head, Upper bound, Computational modeling,
Benchmark testing, Feature extraction,
Representation learning
BibRef
Mullapudi, R.T.[Ravi Teja],
Poms, F.[Fait],
Mark, W.R.[William R.],
Ramanan, D.[Deva],
Fatahalian, K.[Kayvon],
Learning Rare Category Classifiers on a Tight Labeling Budget,
ICCV21(8403-8412)
IEEE DOI
2203
Training, Representation learning, Adaptation models, Buildings,
Propagation losses, Data models, Labeling,
Efficient training and inference methods
BibRef
Yu, N.[Ning],
Liu, G.L.[Gui-Lin],
Dundar, A.[Aysegul],
Tao, A.[Andrew],
Catanzaro, B.[Bryan],
Davis, L.S.[Larry S.],
Fritz, M.[Mario],
Dual Contrastive Loss and Attention for GANs,
ICCV21(6711-6722)
IEEE DOI
2203
Image synthesis, Benchmark testing,
Generative adversarial networks, Generators, Image and video synthesis
BibRef
Yuan, Z.N.[Zhuo-Ning],
Yan, Y.[Yan],
Sonka, M.[Milan],
Yang, T.B.[Tian-Bao],
Large-scale Robust Deep AUC Maximization: A New Surrogate Loss and
Empirical Studies on Medical Image Classification,
ICCV21(3020-3029)
IEEE DOI
2203
WWW Link.
Dams, Stochastic processes, Benchmark testing, Skin, Task analysis,
Optimization, X-ray imaging, Optimization and learning methods,
Recognition and classification
BibRef
Wang, C.F.[Chao-Fei],
Xiao, J.[Jiayu],
Han, Y.Z.[Yi-Zeng],
Yang, Q.[Qisen],
Song, S.[Shiji],
Huang, G.[Gao],
Towards Learning Spatially Discriminative Feature Representations,
ICCV21(1306-1315)
IEEE DOI
2203
CAM-loss: constrains the embedded feature maps with the class activation maps.
Visualization, Transfer learning, Drives,
Feature extraction, Cams, Recognition and classification,
Transfer/Low-shot/Semi/Unsupervised Learning
BibRef
Ridnik, T.[Tal],
Ben-Baruch, E.[Emanuel],
Zamir, N.[Nadav],
Noy, A.[Asaf],
Friedman, I.[Itamar],
Protter, M.[Matan],
Zelnik-Manor, L.[Lihi],
Asymmetric Loss For Multi-Label Classification,
ICCV21(82-91)
IEEE DOI
2203
Training, Adaptive systems, Object detection, Benchmark testing,
Complexity theory, Task analysis, Recognition and classification,
Scene analysis and understanding
BibRef
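The asymmetric loss above applies different focusing to positive and negative labels in multi-label classification, plus a probability shift that discards very easy negatives. A hedged sketch with illustrative hyperparameter values:

```python
import torch

def asymmetric_loss(logits, targets, gamma_pos=0.0, gamma_neg=4.0, clip=0.05, eps=1e-8):
    """Asymmetric focusing for multi-label classification (sketch); targets are floats in {0, 1}."""
    p = torch.sigmoid(logits)
    p_neg = (p - clip).clamp(min=0)                    # shifted probability for negatives
    loss_pos = targets * (1 - p) ** gamma_pos * torch.log(p.clamp(min=eps))
    loss_neg = (1 - targets) * p_neg ** gamma_neg * torch.log((1 - p_neg).clamp(min=eps))
    return -(loss_pos + loss_neg).mean()
```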
Peeples, J.[Joshua],
McCurley, C.H.[Connor H.],
Walker, S.[Sarah],
Stewart, D.[Dylan],
Zare, A.[Alina],
Learnable Adaptive Cosine Estimator (LACE) for Image Classification,
WACV22(3757-3767)
IEEE DOI
2202
Training, Parameter estimation, Computational modeling,
Artificial neural networks, Transforms, Stability analysis,
Deep Learning Object Detection/Recognition/Categorization
BibRef
Ho, K.[Kalun],
Keuper, J.[Janis],
Pfreundt, F.J.[Franz-Josef],
Keuper, M.[Margret],
Learning Embeddings for Image Clustering:
An Empirical Study of Triplet Loss Approaches,
ICPR21(87-94)
IEEE DOI
2105
Correlation, Noise measurement,
Convolutional neural networks, Image classification
BibRef
Wang, S.[Song],
Guo, X.[Xin],
Tie, Y.[Yun],
Qi, L.[Lin],
Guan, L.[Ling],
Discriminative Patch Descriptor Learning With Focal Triplet Loss
Function,
ICIP21(3567-3571)
IEEE DOI
2201
Training, Image processing, Image matching, Task analysis, Standards,
triplet loss function, focal triplet loss,
visual-semantic embedding learning
BibRef
Liu, Y.F.[Yi-Fan],
Chen, H.[Hao],
Chen, Y.[Yu],
Yin, W.[Wei],
Shen, C.H.[Chun-Hua],
Generic Perceptual Loss for Modeling Structured Output Dependencies,
CVPR21(5420-5428)
IEEE DOI
2111
Training, Image segmentation, Image synthesis,
Semantics, Superresolution, Estimation
BibRef
Yang, M.X.[Mou-Xing],
Li, Y.F.[Yun-Fan],
Huang, Z.Y.[Zhen-Yu],
Liu, Z.T.[Zi-Tao],
Hu, P.[Peng],
Peng, X.[Xi],
Partially View-aligned Representation Learning with Noise-robust
Contrastive Loss,
CVPR21(1134-1143)
IEEE DOI
2111
Robustness, Noise robustness, Spatiotemporal phenomena,
Image restoration, Noise measurement, Object tracking, Object recognition
BibRef
Wang, F.[Feng],
Liu, H.P.[Hua-Ping],
Understanding the Behaviour of Contrastive Loss,
CVPR21(2495-2504)
IEEE DOI
2111
Temperature distribution,
Computational modeling, Semantics, Temperature control,
Task analysis
BibRef
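For context, the contrastive (InfoNCE-style) loss whose behaviour the paper studies, including the temperature that controls how strongly the penalty concentrates on hard negatives; a standard sketch:

```python
import torch
import torch.nn.functional as F

def info_nce(queries, keys, temperature=0.07):
    """InfoNCE contrastive loss: the i-th key is the positive for the i-th query."""
    q = F.normalize(queries, dim=1)
    k = F.normalize(keys, dim=1)
    logits = q @ k.t() / temperature                   # (batch, batch) cosine similarities / tau
    labels = torch.arange(len(q), device=q.device)     # positives sit on the diagonal
    return F.cross_entropy(logits, labels)
```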
Draxler, F.[Felix],
Schwarz, J.[Jonathan],
Schnörr, C.[Christoph],
Köthe, U.[Ullrich],
Characterizing the Role of a Single Coupling Layer in Affine
Normalizing Flows,
GCPR20(1-14).
Springer DOI
2110
Award, GCPR, HM.
BibRef
Kobayashi, T.[Takumi],
Group Softmax Loss with Discriminative Feature Grouping,
WACV21(2614-2623)
IEEE DOI
2106
Training, Supervised learning,
Neural networks, Training data, Loss measurement
BibRef
Chan, C.H.[Chi-Ho],
Kittler, J.V.[Josef V.],
Angular Sparsemax for Face Recognition,
ICPR21(10473-10479)
IEEE DOI
2105
Loss function for training deep networks.
Additives, Databases, Face recognition,
Optimized production technology, Probability distribution,
Convolutional neural networks
BibRef
Bechtle, S.[Sarah],
Molchanov, A.[Artem],
Chebotar, Y.[Yevgen],
Grefenstette, E.[Edward],
Righetti, L.[Ludovic],
Sukhatme, G.[Gaurav],
Meier, F.[Franziska],
Meta Learning via Learned Loss,
ICPR21(4161-4168)
IEEE DOI
2105
Choosing the loss function in learning.
Training, Shape, Transfer learning, Pipelines,
Reinforcement learning, Tools, meta learning, deep learning
BibRef
Liu, L.L.[Lan-Lan],
Wang, M.Z.[Ming-Zhe],
Deng, J.[Jia],
A Unified Framework of Surrogate Loss by Refactoring and Interpolation,
ECCV20(III:278-293).
Springer DOI
2012
BibRef
Zhu, Z.,
Wang, H.,
Deep Adversarial Active Learning With Model Uncertainty For Image
Classification,
ICIP20(1711-1715)
IEEE DOI
2011
Task analysis, Uncertainty, Training, Predictive models, Data models,
Labeling, Loss measurement, Active learning, Adversarial learning,
Image classification
BibRef
Wang, Q.,
Zhang, L.,
Wu, B.,
Ren, D.,
Li, P.,
Zuo, W.,
Hu, Q.,
What Deep CNNs Benefit From Global Covariance Pooling:
An Optimization Perspective,
CVPR20(10768-10777)
IEEE DOI
2008
Optimization, Training, Task analysis, Convergence, Robustness,
Loss measurement, Stability analysis
BibRef
Cacheux, Y.L.,
Borgne, H.L.,
Crucianu, M.,
Modeling Inter and Intra-Class Relations in the Triplet Loss for
Zero-Shot Learning,
ICCV19(10332-10341)
IEEE DOI
2004
image representation, learning (artificial intelligence),
vectors, class prototypes, implicit assumptions,
Covariance matrices
BibRef
Yang, Z.B.[Zhi-Bo],
Bastan, M.[Muhammet],
Zhu, X.L.[Xin-Liang],
Gray, D.[Doug],
Samaras, D.[Dimitris],
Hierarchical Proxy-based Loss for Deep Metric Learning,
WACV22(449-458)
IEEE DOI
2202
Measurement, Training, Image retrieval,
Clustering algorithms, Data models, Complexity theory,
Object Detection/Recognition/Categorization
BibRef
Zhao, X.,
Qi, H.,
Luo, R.,
Davis, L.,
A Weakly Supervised Adaptive Triplet Loss for Deep Metric Learning,
Fashion19(3177-3180)
IEEE DOI
2004
image retrieval, neural nets, search problems, supervised learning,
deep metric learning, distance metric learning,
adaptive triplet loss
BibRef
Qian, Q.,
Shang, L.,
Sun, B.,
Hu, J.,
Li, H.,
Jin, R.,
SoftTriple Loss: Deep Metric Learning Without Triplet Sampling,
ICCV19(6449-6457)
IEEE DOI
2004
learning (artificial intelligence), neural nets, optimisation,
pattern classification, sampling methods, SoftTriple loss,
Data models
BibRef
Yu, B.,
Tao, D.,
Deep Metric Learning With Tuplet Margin Loss,
ICCV19(6489-6498)
IEEE DOI
2004
learning (artificial intelligence),
deep metric learning datasets, deep metric learning methods, Bars
BibRef
Do, T.T.[Thanh-Toan],
Tran, T.[Toan],
Reid, I.D.[Ian D.],
Kumar, V.[Vijay],
Hoang, T.[Tuan],
Carneiro, G.[Gustavo],
A Theoretically Sound Upper Bound on the Triplet Loss for Improving the
Efficiency of Deep Distance Metric Learning,
CVPR19(10396-10405).
IEEE DOI
2002
BibRef
Yu, B.S.[Bao-Sheng],
Liu, T.L.[Tong-Liang],
Gong, M.M.[Ming-Ming],
Ding, C.X.[Chang-Xing],
Tao, D.C.[Da-Cheng],
Correcting the Triplet Selection Bias for Triplet Loss,
ECCV18(VI: 71-86).
Springer DOI
1810
Metric learning technique.
BibRef
Ge, W.F.[Wei-Feng],
Huang, W.L.[Wei-Lin],
Dong, D.[Dengke],
Scott, M.R.[Matthew R.],
Deep Metric Learning with Hierarchical Triplet Loss,
ECCV18(VI: 272-288).
Springer DOI
1810
BibRef
Wan, W.T.[Wei-Tao],
Zhong, Y.Y.[Yuan-Yi],
Li, T.P.[Tian-Peng],
Chen, J.S.[Jian-Sheng],
Rethinking Feature Distribution for Loss Functions in Image
Classification,
CVPR18(9117-9126)
IEEE DOI
1812
Training, Feature extraction, Probability distribution,
Neural networks, Task analysis, Euclidean distance, Loss measurement
BibRef
Qi, C.,
Su, F.,
Contrastive-center loss for deep neural networks,
ICIP17(2851-2855)
IEEE DOI
1803
Face recognition, Feature extraction, Neural networks,
Task analysis, Testing, Training, Visualization, Auxiliary loss,
Image classification and face recognition
BibRef
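A sketch in the spirit of the contrastive-center loss above: each feature is pulled toward its own class center and pushed away from the other centers via a distance ratio. The class centers are assumed to be learnable parameters and delta is an illustrative constant that prevents division by zero; the paper's exact normalization may differ:

```python
import torch

def contrastive_center_loss(features, labels, centers, delta=1.0):
    """Ratio of the squared distance to the own-class center over the summed distances to the others."""
    d = torch.cdist(features, centers) ** 2              # (batch, num_classes) squared distances
    d_own = d.gather(1, labels.unsqueeze(1)).squeeze(1)  # distance to the true-class center
    d_other = d.sum(dim=1) - d_own                       # summed distance to all other centers
    return 0.5 * (d_own / (d_other + delta)).mean()
```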
Sajjadi, M.,
Javanmardi, M.,
Tasdizen, T.,
Mutual exclusivity loss for semi-supervised deep learning,
ICIP16(1908-1912)
IEEE DOI
1610
Entropy
BibRef
Yoo, D.G.[Dong-Geun],
Kweon, I.S.[In So],
Learning Loss for Active Learning,
CVPR19(93-102).
IEEE DOI
2002
BibRef
Chapter on Pattern Recognition, Clustering, Statistics, Grammars, Learning, Neural Nets, Genetic Algorithms continues in
Siamese Networks.