14.5.8.6.13 Loss Functions, Deep Learning, Neural Networks

Chapter Contents
Deep Nets. Neural Networks. Loss Functions.
See also Deep Learning, Deep Nets.
See also Edge Detectors Based on Learning, Neural Nets, etc.

Singh, A.[Abhishek], Pokharel, R.[Rosha], Principe, J.C.[Jose C.],
The C-loss function for pattern classification,
PR(47), No. 1, 2014, pp. 441-453.
Elsevier DOI 1310
Correntropy. For neural network classification. BibRef
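
The correntropy-induced C-loss has a simple closed form. Below is a minimal PyTorch sketch, assuming a Gaussian kernel bandwidth sigma and a plain (label minus prediction) error convention; both are illustrative choices, not the paper's exact settings.

    import math
    import torch

    def c_loss(y_true, y_pred, sigma=0.5):
        # Correntropy-induced (C-) loss on the per-sample error e = y_true - y_pred.
        # It saturates for large errors, which is what makes it more robust to
        # outliers than the squared loss. sigma is an assumed, tunable bandwidth.
        e = y_true - y_pred
        beta = 1.0 / (1.0 - math.exp(-1.0 / (2.0 * sigma ** 2)))  # normalizer so loss(e=1) == 1
        return beta * (1.0 - torch.exp(-(e ** 2) / (2.0 * sigma ** 2)))

    # Toy usage: the penalty grows far more slowly than squared error for large residuals.
    print(c_loss(torch.tensor([1.0, 1.0]), torch.tensor([0.9, -3.0])))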

Liao, Z.B.[Zhi-Bin], Carneiro, G.[Gustavo],
A deep convolutional neural network module that promotes competition of multiple-size filters,
PR(71), No. 1, 2017, pp. 94-105.
Elsevier DOI 1707
BibRef
Earlier:
The use of deep learning features in a hierarchical classifier learned with the minimization of a non-greedy loss function that delays gratification,
ICIP15(4540-4544)
IEEE DOI 1512
Deep learning BibRef

Bazi, Y.[Yakoub], Rahhal, M.M.A.[Mohamad M. Al], Alhichri, H.[Haikel], Alajlan, N.[Naif],
Simple Yet Effective Fine-Tuning of Deep CNNs Using an Auxiliary Classification Loss for Remote Sensing Scene Classification,
RS(11), No. 24, 2019, pp. xx-yy.
DOI Link 1912
BibRef

Yuan, Q.Y.[Qun-Yong], Xiao, N.F.[Nan-Feng],
Experimental exploration on loss surface of deep neural network,
IJIST(30), No. 4, 2020, pp. 860-873.
DOI Link 2011
The loss function of a deep neural network is high-dimensional, nonconvex, and complex. Loss surface of deep neural network, Hessian matrix, deep neural network, ensemble learning BibRef

Li, C.J.[Cui-Jin], Qu, Z.[Zhong], Wang, S.Y.[Sheng-Ye], Liu, L.[Ling],
A method of cross-layer fusion multi-object detection and recognition based on improved faster R-CNN model in complex traffic environment,
PRL(145), 2021, pp. 127-134.
Elsevier DOI 2104
Multi-object detection, Multi-object recognition, Faster R-CNN, Weighted balanced multi-class cross entropy loss function BibRef
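
A "weighted balanced" multi-class cross-entropy can be sketched generically as below; the inverse-frequency weighting is an assumed, common balancing scheme and not necessarily the paper's exact formulation.

    import torch
    import torch.nn.functional as F

    def weighted_balanced_ce(logits, targets):
        # Class weights inversely proportional to class frequency in the batch,
        # so that rare object classes contribute as much to the loss as common ones.
        num_classes = logits.shape[1]
        counts = torch.bincount(targets, minlength=num_classes).float()
        weights = counts.sum() / (num_classes * counts.clamp(min=1.0))
        return F.cross_entropy(logits, targets, weight=weights)

    # Toy usage with 8 samples and 4 classes.
    logits = torch.randn(8, 4)
    targets = torch.randint(0, 4, (8,))
    print(weighted_balanced_ce(logits, targets))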

Seo, H., Bassenne, M., Xing, L.,
Closing the Gap Between Deep Neural Network Modeling and Biomedical Decision-Making Metrics in Segmentation via Adaptive Loss Functions,
MedImg(40), No. 2, February 2021, pp. 585-593.
IEEE DOI 2102
Training, Neural networks, Measurement, Adaptation models, Decision making, Deep learning, Harmonic analysis, Segmentation BibRef

Martínez-Cortés, T.[Tomás], González-Díaz, I.[Iván], Díaz-de-María, F.[Fernando],
Training deep retrieval models with noisy datasets: Bag exponential loss,
PR(112), 2021, pp. 107811.
Elsevier DOI 2102
Image retrieval, Noise, Multiple instance learning, Loss functions BibRef

Zadeh, S.G.[Shekoufeh Gorgi], Schmid, M.[Matthias],
Bias in Cross-Entropy-Based Training of Deep Survival Networks,
PAMI(43), No. 9, September 2021, pp. 3126-3137.
IEEE DOI 2108
Training, Hazards, Mathematical model, Entropy, Power measurement, Indexes, Neural networks, Cross-entropy loss, negative log-likelihood loss BibRef

Kang, J.[Jian], Fernandez-Beltran, R.[Ruben], Duan, P.[Puhong], Kang, X.D.[Xu-Dong], Plaza, A.J.[Antonio J.],
Robust Normalized Softmax Loss for Deep Metric Learning-Based Characterization of Remote Sensing Images With Label Noise,
GeoRS(59), No. 10, October 2021, pp. 8798-8811.
IEEE DOI 2109
Measurement, Semantics, Annotations, Feature extraction, Prototypes, Noise measurement, Visualization, Deep metric learning, remote sensing (RS) BibRef
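
The normalized-softmax (cosine softmax) loss underlying this line of deep metric learning work can be sketched as follows; the temperature value is an assumption, and the label-noise robustness the paper adds on top is not reproduced here.

    import torch
    import torch.nn.functional as F

    def normalized_softmax_loss(embeddings, labels, prototypes, temperature=0.05):
        # L2-normalize embeddings and class prototypes, score by cosine similarity,
        # then apply a temperature-scaled cross-entropy over the prototypes.
        z = F.normalize(embeddings, dim=1)
        w = F.normalize(prototypes, dim=1)
        logits = z @ w.t() / temperature
        return F.cross_entropy(logits, labels)

    # Toy usage: 16 samples, 64-dim embeddings, 10 learnable class prototypes.
    emb = torch.randn(16, 64)
    proto = torch.randn(10, 64, requires_grad=True)
    normalized_softmax_loss(emb, torch.randint(0, 10, (16,)), proto).backward()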

Tian, Y.[Ye], Dong, Y.X.[Yu-Xin], Yin, G.S.[Gui-Sheng],
Early Labeled and Small Loss Selection Semi-Supervised Learning Method for Remote Sensing Image Scene Classification,
RS(13), No. 20, 2021, pp. xx-yy.
DOI Link 2110
BibRef


Draxler, F.[Felix], Schwarz, J.[Jonathan], Schnörr, C.[Christoph], Köthe, U.[Ullrich],
Characterizing the Role of a Single Coupling Layer in Affine Normalizing Flows,
GCPR20(1-14).
Springer DOI 2110
Award, GCPR, HM. BibRef

Schwarz, J.[Jonathan], Draxler, F.[Felix], Köthe, U.[Ullrich], Schnörr, C.[Christoph],
Riemannian SOS-Polynomial Normalizing Flows,
GCPR20(218-231).
Springer DOI 2110
BibRef

Kobayashi, T.[Takumi],
Group Softmax Loss with Discriminative Feature Grouping,
WACV21(2614-2623)
IEEE DOI 2106
Training, Supervised learning, Neural networks, Training data, Loss measurement BibRef

Chan, C.H.[Chi-Ho], Kittler, J.V.[Josef V.],
Angular Sparsemax for Face Recognition,
ICPR21(10473-10479)
IEEE DOI 2105
Loss function for deep network training. Additives, Databases, Face recognition, Optimized production technology, Probability distribution, Convolutional neural networks BibRef

Bechtle, S.[Sarah], Molchanov, A.[Artem], Chebotar, Y.[Yevgen], Grefenstette, E.[Edward], Righetti, L.[Ludovic], Sukhatme, G.[Gaurav], Meier, F.[Franziska],
Meta Learning via Learned Loss,
ICPR21(4161-4168)
IEEE DOI 2105
Choosing the loss function in learning. Training, Shape, Transfer learning, Pipelines, Reinforcement learning, Tools, meta learning, deep learning BibRef

Liu, L.L.[Lan-Lan], Wang, M.Z.[Ming-Zhe], Deng, J.[Jia],
A Unified Framework of Surrogate Loss by Refactoring and Interpolation,
ECCV20(III:278-293).
Springer DOI 2012
BibRef

Zhu, Z., Wang, H.,
Deep Adversarial Active Learning With Model Uncertainty For Image Classification,
ICIP20(1711-1715)
IEEE DOI 2011
Task analysis, Uncertainty, Training, Predictive models, Data models, Labeling, Loss measurement, Active learning, Adversarial learning, Image classification BibRef

Wang, Q., Zhang, L., Wu, B., Ren, D., Li, P., Zuo, W., Hu, Q.,
What Deep CNNs Benefit From Global Covariance Pooling: An Optimization Perspective,
CVPR20(10768-10777)
IEEE DOI 2008
Optimization, Training, Task analysis, Convergence, Robustness, Loss measurement, Stability analysis BibRef

Wan, W.T.[Wei-Tao], Zhong, Y.Y.[Yuan-Yi], Li, T.P.[Tian-Peng], Chen, J.S.[Jian-Sheng],
Rethinking Feature Distribution for Loss Functions in Image Classification,
CVPR18(9117-9126)
IEEE DOI 1812
Training, Feature extraction, Probability distribution, Neural networks, Task analysis, Euclidean distance, Loss measurement BibRef

Qi, C., Su, F.,
Contrastive-center loss for deep neural networks,
ICIP17(2851-2855)
IEEE DOI 1803
Face recognition, Feature extraction, Neural networks, Task analysis, Testing, Training, Visualization, Auxiliary loss, Image classification and face recognition BibRef
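
A hedged sketch of the contrastive-center idea: pull each feature toward its own class center while pushing it away from the remaining centers. The center initialization and the constant delta below are illustrative choices.

    import torch

    def contrastive_center_loss(features, labels, centers, delta=1.0):
        # Squared distances from each feature to every class center.
        d = torch.cdist(features, centers).pow(2)
        intra = d.gather(1, labels.view(-1, 1)).squeeze(1)  # distance to own center
        inter = d.sum(dim=1) - intra                         # distances to all other centers
        return 0.5 * (intra / (inter + delta)).mean()

    # Toy usage: `centers` would normally be a learnable parameter of the network.
    feat = torch.randn(8, 32)
    centers = torch.randn(5, 32, requires_grad=True)
    labels = torch.randint(0, 5, (8,))
    contrastive_center_loss(feat, labels, centers).backward()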

Sajjadi, M., Javanmardi, M., Tasdizen, T.,
Mutual exclusivity loss for semi-supervised deep learning,
ICIP16(1908-1912)
IEEE DOI 1610
Entropy BibRef
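
A commonly cited mutual-exclusivity style term for unlabeled samples is sketched below; it is minimized when the softmax output is one-hot, which is the behaviour such semi-supervised losses encourage. Treat the exact form (and the eps constant) as assumptions rather than the authors' reference code.

    import torch

    def mutual_exclusivity_loss(probs, eps=1e-8):
        # probs: softmax outputs of shape (batch, classes).
        # For each class j, compute p_j * prod_{k != j} (1 - p_k) in log space
        # and reward predictions in which exactly one class is active.
        log_one_minus = torch.log(1.0 - probs + eps)
        total_log = log_one_minus.sum(dim=1, keepdim=True)
        term = probs * torch.exp(total_log - log_one_minus)
        return -term.sum(dim=1).mean()

    # Toy usage: sharper (more one-hot) predictions give a lower loss.
    sharp = torch.softmax(5.0 * torch.randn(4, 3), dim=1)
    flat = torch.full((4, 3), 1.0 / 3.0)
    print(mutual_exclusivity_loss(sharp), mutual_exclusivity_loss(flat))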

Yoo, D.G.[Dong-Geun], Kweon, I.S.[In So],
Learning Loss for Active Learning,
CVPR19(93-102).
IEEE DOI 2002
BibRef
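
The core of the loss-prediction idea is a pairwise ranking objective on predicted versus actual per-sample losses; a minimal sketch follows, where the pair construction and margin value are illustrative assumptions.

    import torch

    def loss_prediction_loss(pred_loss, target_loss, margin=1.0):
        # Split the batch into pairs and ask the loss-prediction module only to
        # rank which sample of each pair has the larger true loss.
        assert pred_loss.shape[0] % 2 == 0, "expects an even batch size"
        half = pred_loss.shape[0] // 2
        p_i, p_j = pred_loss[:half], pred_loss[half:]
        t_i, t_j = target_loss[:half], target_loss[half:]
        sign = torch.sign(t_i - t_j)
        return torch.clamp(margin - sign * (p_i - p_j), min=0).mean()

    # Toy usage: predicted vs. actual per-sample losses for a batch of 8.
    print(loss_prediction_loss(torch.rand(8), torch.rand(8)))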

Chapter on Pattern Recognition, Clustering, Statistics, Grammars, Learning, Neural Nets, Genetic Algorithms continues in
Siamese Networks.


Last update: Oct 20, 2021 at 09:45:26