Chen, G.Z.[Guan-Zhou],
Zhang, X.D.[Xiao-Dong],
Tan, X.L.[Xiao-Liang],
Cheng, Y.F.[Yu-Feng],
Dai, F.[Fan],
Zhu, K.[Kun],
Gong, Y.F.[Yuan-Fu],
Wang, Q.[Qing],
Training Small Networks for Scene Classification of Remote Sensing
Images via Knowledge Distillation,
RS(10), No. 5, 2018, pp. xx-yy.
DOI Link
1806
BibRef
Wu, X.[Xiang],
He, R.[Ran],
Hu, Y.[Yibo],
Sun, Z.N.[Zhe-Nan],
Learning an Evolutionary Embedding via Massive Knowledge Distillation,
IJCV(128), No. 8-9, September 2020, pp. 2089-2106.
Springer DOI
2008
Transferring knowledge from a large, powerful teacher network to a
small, compact student network.
BibRef
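For readers unfamiliar with the teacher-student transfer described in the
annotation above, a minimal sketch of soft-label knowledge distillation in
PyTorch (Hinton-style temperature-scaled soft targets; the temperature,
weighting, and model names are illustrative assumptions, not taken from the
cited paper):

    # Minimal knowledge-distillation loss sketch (PyTorch).
    # T (temperature) and alpha (soft/hard weighting) are placeholder values.
    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
        # Soft targets: KL divergence between temperature-scaled distributions,
        # rescaled by T*T so gradients keep a comparable magnitude.
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=1),
            F.softmax(teacher_logits / T, dim=1),
            reduction="batchmean",
        ) * (T * T)
        # Hard targets: ordinary cross-entropy with the ground-truth labels.
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1.0 - alpha) * hard

    # Usage: the teacher runs in eval mode without gradients; only the student trains.
    # with torch.no_grad():
    #     teacher_logits = teacher(images)
    # loss = distillation_loss(student(images), teacher_logits, labels)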
Zaras, A.[Adamantios],
Passalis, N.[Nikolaos],
Tefas, A.[Anastasios],
Improving knowledge distillation using unified ensembles of
specialized teachers,
PRL(146), 2021, pp. 215-221.
Elsevier DOI
2105
68T99, Knowledge distillation, Knowledge transfer,
Specialized teachers, Unified ensemble, Unified specialized teachers ensemble
BibRef
Bae, J.H.[Ji-Hoon],
Yeo, D.[Doyeob],
Yim, J.[Junho],
Kim, N.S.[Nae-Soo],
Pyo, C.S.[Cheol-Sig],
Kim, J.[Junmo],
Densely Distilled Flow-Based Knowledge Transfer in Teacher-Student
Framework for Image Classification,
IP(29), 2020, pp. 5698-5710.
IEEE DOI
2005
BibRef
Earlier: A2, A1, A5, A3, A4, A6:
Sequential Knowledge Transfer in Teacher-Student Framework Using
Densely Distilled Flow-Based Information,
ICIP18(674-678)
IEEE DOI
1809
Knowledge transfer, Training, Computational modeling, Data mining,
Optimization, Image classification, Computer architecture,
residual network.
Training, Data mining, Optimization, Image classification,
Knowledge transfer, Computational modeling, Reliability,
BibRef
Mazumder, P.[Pratik],
Singh, P.[Pravendra],
Namboodiri, V.P.[Vinay P.],
GIFSL: Grafting based improved few-shot learning,
IVC(104), 2020, pp. 104006.
Elsevier DOI
2012
Few-shot learning, Grafting, Self-supervision, Distillation,
Deep learning, Object recognition
BibRef
Li, X.W.[Xue-Wei],
Li, S.Y.[Song-Yuan],
Omar, B.[Bourahla],
Wu, F.[Fei],
Li, X.[Xi],
ResKD: Residual-Guided Knowledge Distillation,
IP(30), 2021, pp. 4735-4746.
IEEE DOI
2105
BibRef
Nguyen-Meidine, L.T.[Le Thanh],
Belal, A.[Atif],
Kiran, M.[Madhu],
Dolz, J.[Jose],
Blais-Morin, L.A.[Louis-Antoine],
Granger, E.[Eric],
Knowledge distillation methods for efficient unsupervised adaptation
across multiple domains,
IVC(108), 2021, pp. 104096.
Elsevier DOI
2104
BibRef
And:
Unsupervised Multi-Target Domain Adaptation Through Knowledge
Distillation,
WACV21(1338-1346)
IEEE DOI
2106
Deep learning, Convolutional NNs, Knowledge distillation,
Unsupervised domain adaptation, CNN acceleration and compression.
Adaptation models, Computational modeling,
Benchmark testing, Real-time systems
BibRef
Zhang, H.R.[Hao-Ran],
Hu, Z.Z.[Zhen-Zhen],
Qin, W.[Wei],
Xu, M.L.[Ming-Liang],
Wang, M.[Meng],
Adversarial co-distillation learning for image recognition,
PR(111), 2021, pp. 107659.
Elsevier DOI
2012
Knowledge distillation, Data augmentation,
Generative adversarial nets, Divergent examples, Image classification
BibRef
Gou, J.P.[Jian-Ping],
Yu, B.S.[Bao-Sheng],
Maybank, S.J.[Stephen J.],
Tao, D.C.[Da-Cheng],
Knowledge Distillation: A Survey,
IJCV(129), No. 6, June 2021, pp. 1789-1819.
Springer DOI
2106
Survey, Knowledge Distillation.
BibRef
Deng, Y.J.[Yong-Jian],
Chen, H.[Hao],
Chen, H.Y.[Hui-Ying],
Li, Y.F.[You-Fu],
Learning From Images:
A Distillation Learning Framework for Event Cameras,
IP(30), 2021, pp. 4919-4931.
IEEE DOI
2106
Task analysis, Feature extraction, Cameras, Data models,
Streaming media, Trajectory, Power demand, Event-based vision,
optical flow prediction
BibRef
Liu, Y.[Yang],
Wang, K.[Keze],
Li, G.B.[Guan-Bin],
Lin, L.[Liang],
Semantics-Aware Adaptive Knowledge Distillation for Sensor-to-Vision
Action Recognition,
IP(30), 2021, pp. 5573-5588.
IEEE DOI
2106
Videos, Knowledge engineering, Wearable sensors, Adaptation models,
Sensors, Semantics, Image synthesis, Action recognition,
transfer learning
BibRef
Feng, Z.X.[Zhan-Xiang],
Lai, J.H.[Jian-Huang],
Xie, X.H.[Xiao-Hua],
Resolution-Aware Knowledge Distillation for Efficient Inference,
IP(30), 2021, pp. 6985-6996.
IEEE DOI
2108
Knowledge engineering, Feature extraction, Image resolution,
Computational modeling, Computational complexity, Image coding,
adversarial learning
BibRef
Liu, Y.Y.[Yu-Yang],
Cong, Y.[Yang],
Sun, G.[Gan],
Zhang, T.[Tao],
Dong, J.H.[Jia-Hua],
Liu, H.S.[Hong-Sen],
L3DOC: Lifelong 3D Object Classification,
IP(30), 2021, pp. 7486-7498.
IEEE DOI
2109
Task analysis, Solid modeling,
Data models, Knowledge engineering, Shape, Robots,
task-relevant knowledge distillation
BibRef
Bhardwaj, A.[Ayush],
Pimpale, S.[Sakshee],
Kumar, S.[Saurabh],
Banerjee, B.[Biplab],
Empowering Knowledge Distillation via Open Set Recognition for Robust
3D Point Cloud Classification,
PRL(151), 2021, pp. 172-179.
Elsevier DOI
2110
Knowledge Distillation, Open Set Recognition,
3D Object Recognition, Point Cloud Classification
BibRef
Shao, B.[Baitan],
Chen, Y.[Ying],
Multi-granularity for knowledge distillation,
IVC(115), 2021, pp. 104286.
Elsevier DOI
2110
Knowledge distillation, Model compression,
Multi-granularity distillation mechanism, Stable excitation scheme
BibRef
Zhang, L.[Libo],
Du, D.W.[Da-Wei],
Li, C.C.[Cong-Cong],
Wu, Y.J.[Yan-Jun],
Luo, T.J.[Tie-Jian],
Iterative Knowledge Distillation for Automatic Check-Out,
MultMed(23), 2021, pp. 4158-4170.
IEEE DOI
2112
Testing, Training, Adaptation models, Reliability,
Feature extraction, Training data, Task analysis,
iterative knowledge distillation
BibRef
Qin, D.[Dian],
Bu, J.J.[Jia-Jun],
Liu, Z.[Zhe],
Shen, X.[Xin],
Zhou, S.[Sheng],
Gu, J.J.[Jing-Jun],
Wang, Z.H.[Zhi-Hua],
Wu, L.[Lei],
Dai, H.F.[Hui-Fen],
Efficient Medical Image Segmentation Based on Knowledge Distillation,
MedImg(40), No. 12, December 2021, pp. 3820-3831.
IEEE DOI
2112
Image segmentation, Biomedical imaging, Semantics,
Knowledge engineering, Feature extraction, Tumors,
transfer learning
BibRef
Tian, L.[Ling],
Wang, Z.C.[Zhi-Chao],
He, B.[Bokun],
He, C.[Chu],
Wang, D.W.[Ding-Wen],
Li, D.[Deshi],
Knowledge Distillation of Grassmann Manifold Network for Remote
Sensing Scene Classification,
RS(13), No. 22, 2021, pp. xx-yy.
DOI Link
2112
BibRef
Yue, J.[Jun],
Fang, L.[Leyuan],
Rahmani, H.[Hossein],
Ghamisi, P.[Pedram],
Self-Supervised Learning With Adaptive Distillation for Hyperspectral
Image Classification,
GeoRS(60), 2022, pp. 1-13.
IEEE DOI
2112
Feature extraction, Training, Adaptive systems, Mirrors,
Knowledge engineering, Hyperspectral imaging, Spectral analysis,
spatial-spectral feature extraction
BibRef
Chen, J.Z.[Jing-Zhou],
Wang, S.H.[Shi-Hao],
Chen, L.[Ling],
Cai, H.B.[Hai-Bin],
Qian, Y.T.[Yun-Tao],
Incremental Detection of Remote Sensing Objects With Feature Pyramid
and Knowledge Distillation,
GeoRS(60), 2022, pp. 1-13.
IEEE DOI
2112
Feature extraction, Remote sensing, Training, Object detection,
Adaptation models, Proposals, Detectors, Deep learning, remote sensing
BibRef
Chen, H.Y.[Hong-Yuan],
Pei, Y.T.[Yan-Ting],
Zhao, H.W.[Hong-Wei],
Huang, Y.P.[Ya-Ping],
Super-resolution guided knowledge distillation for low-resolution
image classification,
PRL(155), 2022, pp. 62-68.
Elsevier DOI
2203
Low-resolution image classification, Super-resolution, Knowledge distillation
BibRef
Wang, S.L.[Shu-Ling],
Hu, M.[Mu],
Li, B.[Bin],
Gong, X.J.[Xiao-Jin],
Self-Paced Knowledge Distillation for Real-Time Image Guided Depth
Completion,
SPLetters(29), 2022, pp. 867-871.
IEEE DOI
2204
Knowledge engineering, Predictive models, Training, Task analysis,
Real-time systems, Color, Loss measurement, self-paced learning
BibRef
Zhang, K.[Kangkai],
Zhang, C.H.[Chun-Hui],
Li, S.[Shikun],
Zeng, D.[Dan],
Ge, S.M.[Shi-Ming],
Student Network Learning via Evolutionary Knowledge Distillation,
CirSysVideo(32), No. 4, April 2022, pp. 2251-2263.
IEEE DOI
2204
Training, Knowledge representation, Knowledge transfer,
Predictive models, Germanium, Data models, Data mining, deep learning
BibRef
Wang, L.[Lin],
Yoon, K.J.[Kuk-Jin],
Knowledge Distillation and Student-Teacher Learning for Visual
Intelligence: A Review and New Outlooks,
PAMI(44), No. 6, June 2022, pp. 3048-3068.
IEEE DOI
2205
Training, Measurement, Computational modeling, Visualization,
Task analysis, Knowledge transfer, Speech recognition,
visual intelligence
BibRef
Wang, L.[Lin],
Chae, Y.J.[Yu-Jeong],
Yoon, S.H.[Sung-Hoon],
Kim, T.K.[Tae-Kyun],
Yoon, K.J.[Kuk-Jin],
EvDistill: Asynchronous Events to End-task Learning via Bidirectional
Reconstruction-guided Cross-modal Knowledge Distillation,
CVPR21(608-619)
IEEE DOI
2111
Training, Knowledge engineering, Semantics,
Dynamic range, Cameras, Data models
BibRef
Li, T.H.[Tian-Hao],
Wang, L.M.[Li-Min],
Wu, G.S.[Gang-Shan],
Self Supervision to Distillation for Long-Tailed Visual Recognition,
ICCV21(610-619)
IEEE DOI
2203
Training, Representation learning, Deep learning, Visualization,
Image recognition, Head, Semantics, Recognition and classification
BibRef
Fang, Z.Y.[Zhi-Yuan],
Wang, J.F.[Jian-Feng],
Hu, X.W.[Xiao-Wei],
Wang, L.J.[Li-Juan],
Yang, Y.Z.[Ye-Zhou],
Liu, Z.C.[Zi-Cheng],
Compressing Visual-linguistic Model via Knowledge Distillation,
ICCV21(1408-1418)
IEEE DOI
2203
Knowledge engineering, Visualization, Adaptation models, Detectors,
Mean square error methods, Transformers, Vision + language,
Vision applications and systems
BibRef
Yao, L.[Lewei],
Pi, R.J.[Ren-Jie],
Xu, H.[Hang],
Zhang, W.[Wei],
Li, Z.G.[Zhen-Guo],
Zhang, T.[Tong],
G-DetKD: Towards General Distillation Framework for Object Detectors
via Contrastive and Semantic-Guided Feature Imitation,
ICCV21(3571-3580)
IEEE DOI
2203
Semantics, Pipelines, Detectors, Object detection, Benchmark testing,
Feature extraction, Detection and localization in 2D and 3D,
BibRef
Chen, Y.X.[Yi-Xin],
Chen, P.G.[Peng-Guang],
Liu, S.[Shu],
Wang, L.[Liwei],
Jia, J.Y.[Jia-Ya],
Deep Structured Instance Graph for Distilling Object Detectors,
ICCV21(4339-4348)
IEEE DOI
2203
Codes, Image edge detection, Semantics, Detectors, Object detection,
Knowledge representation,
Detection and localization in 2D and 3D
BibRef
Zhu, Y.C.[Yi-Chen],
Wang, Y.[Yi],
Student Customized Knowledge Distillation:
Bridging the Gap Between Student and Teacher,
ICCV21(5037-5046)
IEEE DOI
2203
Knowledge engineering, Training, Visualization, Image segmentation,
Semantics, Object detection,
BibRef
Kim, Y.[Youmin],
Park, J.[Jinbae],
Jang, Y.[YounHo],
Ali, M.[Muhammad],
Oh, T.H.[Tae-Hyun],
Bae, S.H.[Sung-Ho],
Distilling Global and Local Logits with Densely Connected Relations,
ICCV21(6270-6280)
IEEE DOI
2203
Image segmentation, Image recognition, Computational modeling,
Semantics, Object detection, Task analysis,
BibRef
Kim, K.[Kyungyul],
Ji, B.[ByeongMoon],
Yoon, D.[Doyoung],
Hwang, S.[Sangheum],
Self-Knowledge Distillation with Progressive Refinement of Targets,
ICCV21(6547-6556)
IEEE DOI
2203
Training, Knowledge engineering, Adaptation models,
Supervised learning, Neural networks, Object detection,
Recognition and classification
BibRef
Son, W.[Wonchul],
Na, J.[Jaemin],
Choi, J.[Junyong],
Hwang, W.J.[Won-Jun],
Densely Guided Knowledge Distillation using Multiple Teacher
Assistants,
ICCV21(9375-9384)
IEEE DOI
2203
Knowledge engineering, Training, Deep learning, Transfer learning,
Neural networks, Stochastic processes,
Recognition and classification
BibRef
Tejankar, A.[Ajinkya],
Koohpayegani, S.A.[Soroush Abbasi],
Pillai, V.[Vipin],
Favaro, P.[Paolo],
Pirsiavash, H.[Hamed],
ISD: Self-Supervised Learning by Iterative Similarity Distillation,
ICCV21(9589-9598)
IEEE DOI
2203
Codes, Transfer learning, Iterative methods, Task analysis,
Standards, Representation learning, Transfer/Low-shot/Semi/Unsupervised Learning
BibRef
Zhou, S.[Sheng],
Wang, Y.C.[Yu-Cheng],
Chen, D.[Defang],
Chen, J.W.[Jia-Wei],
Wang, X.[Xin],
Wang, C.[Can],
Bu, J.J.[Jia-Jun],
Distilling Holistic Knowledge with Graph Neural Networks,
ICCV21(10367-10376)
IEEE DOI
2203
Knowledge engineering, Correlation, Codes, Knowledge based systems,
Benchmark testing, Feature extraction,
BibRef
Shang, Y.Z.[Yu-Zhang],
Duan, B.[Bin],
Zong, Z.L.[Zi-Liang],
Nie, L.Q.[Li-Qiang],
Yan, Y.[Yan],
Lipschitz Continuity Guided Knowledge Distillation,
ICCV21(10655-10664)
IEEE DOI
2203
Knowledge engineering, Training, Image segmentation, Codes,
NP-hard problem, Neural networks,
Transfer/Low-shot/Semi/Unsupervised Learning
BibRef
Li, Z.[Zheng],
Ye, J.W.[Jing-Wen],
Song, M.L.[Ming-Li],
Huang, Y.[Ying],
Pan, Z.G.[Zhi-Geng],
Online Knowledge Distillation for Efficient Pose Estimation,
ICCV21(11720-11730)
IEEE DOI
2203
Heating systems, Computational modeling, Pose estimation,
Benchmark testing, Complexity theory, Knowledge transfer,
Efficient training and inference methods
BibRef
Dai, R.[Rui],
Das, S.[Srijan],
Bremond, F.[François],
Learning an Augmented RGB Representation with Cross-Modal Knowledge
Distillation for Action Detection,
ICCV21(13033-13044)
IEEE DOI
2203
Training, Focusing, Streaming media, Real-time systems,
Task analysis, Action and behavior recognition,
Vision + other modalities
BibRef
Xiang, S.[Sitao],
Gu, Y.M.[Yu-Ming],
Xiang, P.[Pengda],
Chai, M.[Menglei],
Li, H.[Hao],
Zhao, Y.[Yajie],
He, M.M.[Ming-Ming],
DisUnknown: Distilling Unknown Factors for Disentanglement Learning,
ICCV21(14790-14799)
IEEE DOI
2203
Training, Scalability, Benchmark testing, Generators, Task analysis,
Image and video synthesis, Adversarial learning, Neural generative models
BibRef
Diomataris, M.[Markos],
Gkanatsios, N.[Nikolaos],
Pitsikalis, V.[Vassilis],
Maragos, P.[Petros],
Grounding Consistency: Distilling Spatial Common Sense for Precise
Visual Relationship Detection,
ICCV21(15891-15900)
IEEE DOI
2203
Measurement, Visualization, Grounding, Triples (Data structure),
Image edge detection, Predictive models,
Visual reasoning and logical representation
BibRef
Zi, B.[Bojia],
Zhao, S.H.[Shi-Hao],
Ma, X.[Xingjun],
Jiang, Y.G.[Yu-Gang],
Revisiting Adversarial Robustness Distillation:
Robust Soft Labels Make Student Better,
ICCV21(16423-16432)
IEEE DOI
2203
Training, Deep learning, Codes, Computational modeling,
Neural networks, Predictive models, Adversarial learning,
Recognition and classification
BibRef
Zheng, H.[Heliang],
Yang, H.[Huan],
Fu, J.L.[Jian-Long],
Zha, Z.J.[Zheng-Jun],
Luo, J.B.[Jie-Bo],
Learning Conditional Knowledge Distillation for Degraded-Reference
Image Quality Assessment,
ICCV21(10222-10231)
IEEE DOI
2203
Measurement, Image quality, Training, Knowledge engineering,
Computational modeling, Semantics, Image restoration,
Low-level and physics-based vision
BibRef
Liu, L.[Li],
Huang, Q.[Qingle],
Lin, S.[Sihao],
Xie, H.W.[Hong-Wei],
Wang, B.[Bing],
Chang, X.J.[Xiao-Jun],
Liang, X.D.[Xiao-Dan],
Exploring Inter-Channel Correlation for Diversity-preserved Knowledge
Distillation,
ICCV21(8251-8260)
IEEE DOI
2203
Knowledge engineering, Image segmentation, Correlation, Costs,
Semantics, Graphics processing units,
grouping and shape
BibRef
Wang, H.[Hong],
Deng, Y.F.[Yue-Fan],
Yoo, S.[Shinjae],
Ling, H.B.[Hai-Bin],
Lin, Y.W.[Yue-Wei],
AGKD-BML: Defense Against Adversarial Attack by Attention Guided
Knowledge Distillation and Bi-directional Metric Learning,
ICCV21(7638-7647)
IEEE DOI
2203
Training, Deep learning, Codes, Computational modeling,
Neural networks, Bidirectional control, Adversarial learning,
BibRef
Li, C.C.[Cheng-Cheng],
Wang, Z.[Zi],
Qi, H.R.[Hai-Rong],
Online Knowledge Distillation by Temporal-Spatial Boosting,
WACV22(3482-3491)
IEEE DOI
2202
Training, Knowledge engineering,
Benchmark testing, Boosting, Noise measurement,
Deep Learning -> Efficient Training and Inference Methods for Networks
BibRef
Zheng, Z.Z.[Zhen-Zhu],
Peng, X.[Xi],
Self-Guidance: Improve Deep Neural Network Generalization via
Knowledge Distillation,
WACV22(3451-3460)
IEEE DOI
2202
Training, Deep learning, Knowledge engineering, Measurement,
Visualization, Image recognition, Neural networks,
Learning and Optimization
BibRef
Zhang, H.[Heng],
Fromont, E.[Elisa],
Lefevre, S.[Sébastien],
Avignon, B.[Bruno],
Low-cost Multispectral Scene Analysis with Modality Distillation,
WACV22(3331-3340)
IEEE DOI
2202
Knowledge engineering, Image analysis, Image resolution, Semantics,
Neural networks, Thermal sensors, Predictive models,
Vision Systems and Applications
BibRef
Vo, D.M.[Duc Minh],
Sugimoto, A.[Akihiro],
Nakayama, H.[Hideki],
PPCD-GAN: Progressive Pruning and Class-Aware Distillation for
Large-Scale Conditional GANs Compression,
WACV22(1422-1430)
IEEE DOI
2202
Training, Image coding, Neural network compression,
Computer architecture, GANs
BibRef
Kobayashi, T.[Takumi],
Extractive Knowledge Distillation,
WACV22(1350-1359)
IEEE DOI
2202
Temperature distribution, Analytical models,
Annotations, Transfer learning, Feature extraction, Task analysis,
Deep Learning Object Detection/Recognition/Categorization
BibRef
Nguyen, C.H.[Chuong H.],
Nguyen, T.C.[Thuy C.],
Tang, T.N.[Tuan N.],
Phan, N.L.H.[Nam L. H.],
Improving Object Detection by Label Assignment Distillation,
WACV22(1322-1331)
IEEE DOI
2202
Training, Schedules, Costs, Force, Object detection, Detectors,
Switches, Object Detection/Recognition/Categorization
BibRef
Meng, Z.[Ze],
Yao, X.[Xin],
Sun, L.F.[Li-Feng],
Multi-Task Distillation:
Towards Mitigating the Negative Transfer in Multi-Task Learning,
ICIP21(389-393)
IEEE DOI
2201
Training, Degradation, Image processing, Optimization methods,
Benchmark testing, Turning, Multi-task Learning,
Multi-objective optimization
BibRef
Tang, Q.[Qiankun],
Xu, X.G.[Xiao-Gang],
Wang, J.[Jun],
Differentiable Dynamic Channel Association for Knowledge Distillation,
ICIP21(414-418)
IEEE DOI
2201
Image coding, Computational modeling, Network architecture,
Probabilistic logic, Computational efficiency, Task analysis,
weighted distillation
BibRef
Tran, V.[Vinh],
Wang, Y.[Yang],
Zhang, Z.[Zekun],
Hoai, M.[Minh],
Knowledge Distillation for Human Action Anticipation,
ICIP21(2518-2522)
IEEE DOI
2201
Training, Knowledge engineering, Image processing, Semantics,
Neural networks, Training data
BibRef
Tran, V.[Vinh],
Balasubramanian, N.[Niranjan],
Hoai, M.[Minh],
Progressive Knowledge Distillation for Early Action Recognition,
ICIP21(2583-2587)
IEEE DOI
2201
Knowledge engineering, Training, Recurrent neural networks,
Image recognition, Training data, Semisupervised learning
BibRef
Rotman, M.[Michael],
Wolf, L.B.[Lior B.],
Natural Statistics of Network Activations and Implications for
Knowledge Distillation,
ICIP21(399-403)
IEEE DOI
2201
Deep learning, Knowledge engineering, Image recognition,
Correlation, Semantics, Benchmark testing, Knowledge Distillation,
Image Statistics
BibRef
Banitalebi-Dehkordi, A.[Amin],
Knowledge Distillation for Low-Power Object Detection: A Simple
Technique and Its Extensions for Training Compact Models Using
Unlabeled Data,
LPCV21(769-778)
IEEE DOI
2112
Training, Adaptation models,
Computational modeling, Object detection, Computer architecture
BibRef
Zhu, J.[Jinguo],
Tang, S.X.[Shi-Xiang],
Chen, D.P.[Da-Peng],
Yu, S.J.[Shi-Jie],
Liu, Y.[Yakun],
Rong, M.Z.[Ming-Zhe],
Yang, A.[Aijun],
Wang, X.H.[Xiao-Hua],
Complementary Relation Contrastive Distillation,
CVPR21(9256-9265)
IEEE DOI
2111
Benchmark testing, Pattern recognition, Mutual information
BibRef
Jung, S.[Sangwon],
Lee, D.G.[Dong-Gyu],
Park, T.[Taeeon],
Moon, T.[Taesup],
Fair Feature Distillation for Visual Recognition,
CVPR21(12110-12119)
IEEE DOI
2111
Visualization, Systematics,
Computational modeling, Face recognition, Predictive models,
Prediction algorithms
BibRef
Ghosh, P.[Pallabi],
Saini, N.[Nirat],
Davis, L.S.[Larry S.],
Shrivastava, A.[Abhinav],
Learning Graphs for Knowledge Transfer with Limited Labels,
CVPR21(11146-11156)
IEEE DOI
2111
Training, Visualization, Convolution,
Semisupervised learning, Benchmark testing, Pattern recognition
BibRef
Chen, L.Q.[Li-Qun],
Wang, D.[Dong],
Gan, Z.[Zhe],
Liu, J.J.[Jing-Jing],
Henao, R.[Ricardo],
Carin, L.[Lawrence],
Wasserstein Contrastive Representation Distillation,
CVPR21(16291-16300)
IEEE DOI
2111
Knowledge engineering, Measurement,
Computational modeling, Collaborative work, Robustness, Pattern recognition
BibRef
Huang, Z.[Zhen],
Shen, X.[Xu],
Xing, J.[Jun],
Liu, T.L.[Tong-Liang],
Tian, X.M.[Xin-Mei],
Li, H.Q.[Hou-Qiang],
Deng, B.[Bing],
Huang, J.Q.[Jian-Qiang],
Hua, X.S.[Xian-Sheng],
Revisiting Knowledge Distillation:
An Inheritance and Exploration Framework,
CVPR21(3578-3587)
IEEE DOI
2111
Training, Learning systems, Knowledge engineering, Deep learning,
Neural networks, Reinforcement learning
BibRef
Chen, P.G.[Peng-Guang],
Liu, S.[Shu],
Zhao, H.S.[Heng-Shuang],
Jia, J.Y.[Jia-Ya],
Distilling Knowledge via Knowledge Review,
CVPR21(5006-5015)
IEEE DOI
2111
Knowledge engineering, Object detection, Pattern recognition, Task analysis
BibRef
Ji, M.[Mingi],
Shin, S.J.[Seung-Jae],
Hwang, S.H.[Seung-Hyun],
Park, G.[Gibeom],
Moon, I.C.[Il-Chul],
Refine Myself by Teaching Myself: Feature Refinement via
Self-Knowledge Distillation,
CVPR21(10659-10668)
IEEE DOI
2111
Knowledge engineering, Training, Codes, Semantics,
Neural networks, Object detection
BibRef
Salehi, M.[Mohammadreza],
Sadjadi, N.[Niousha],
Baselizadeh, S.[Soroosh],
Rohban, M.H.[Mohammad H.],
Rabiee, H.R.[Hamid R.],
Multiresolution Knowledge Distillation for Anomaly Detection,
CVPR21(14897-14907)
IEEE DOI
2111
Training, Location awareness, Knowledge engineering,
Image resolution, Pattern recognition, Task analysis
BibRef
Haselhoff, A.[Anselm],
Kronenberger, J.[Jan],
Küppers, F.[Fabian],
Schneider, J.[Jonas],
Towards Black-Box Explainability with Gaussian Discriminant Knowledge
Distillation,
SAIAD21(21-28)
IEEE DOI
2109
Visualization, Shape, Semantics, Training data, Object detection,
Predictive models, Linear programming
BibRef
Yang, L.[Lehan],
Xu, K.[Kele],
Cross Modality Knowledge Distillation for Multi-modal Aerial View
Object Classification,
NTIRE21(382-387)
IEEE DOI
2109
Training, Speckle, Feature extraction, Radar polarimetry,
Data models, Robustness, Pattern recognition
BibRef
Bhat, P.[Prashant],
Arani, E.[Elahe],
Zonooz, B.[Bahram],
Distill on the Go: Online knowledge distillation in self-supervised
learning,
LLID21(2672-2681)
IEEE DOI
2109
Annotations, Computer architecture,
Performance gain, Benchmark testing, Pattern recognition
BibRef
Okuno, T.[Tomoyuki],
Nakata, Y.[Yohei],
Ishii, Y.[Yasunori],
Tsukizawa, S.[Sotaro],
Lossless AI: Toward Guaranteeing Consistency between Inferences
Before and After Quantization via Knowledge Distillation,
MVA21(1-5)
DOI Link
2109
Training, Quality assurance, Quantization (signal),
Object detection, Network architecture, Real-time systems
BibRef
Nayak, G.K.[Gaurav Kumar],
Mopuri, K.R.[Konda Reddy],
Chakraborty, A.[Anirban],
Effectiveness of Arbitrary Transfer Sets for Data-free Knowledge
Distillation,
WACV21(1429-1437)
IEEE DOI
2106
Training, Visualization, Sensitivity, Computational modeling,
Semantics, Neural networks, Training data
BibRef
Lee, J.[Jongmin],
Jeong, Y.[Yoonwoo],
Kim, S.[Seungwook],
Min, J.[Juhong],
Cho, M.[Minsu],
Learning to Distill Convolutional Features into Compact Local
Descriptors,
WACV21(897-907)
IEEE DOI
2106
Location awareness, Visualization, Image matching, Semantics,
Benchmark testing, Feature extraction, Robustness
BibRef
Arani, E.[Elahe],
Sarfraz, F.[Fahad],
Zonooz, B.[Bahram],
Noise as a Resource for Learning in Knowledge Distillation,
WACV21(3128-3137)
IEEE DOI
2106
Training, Uncertainty, Neuroscience, Collaboration,
Collaborative work, Brain modeling, Probabilistic logic
BibRef
Chawla, A.[Akshay],
Yin, H.X.[Hong-Xu],
Molchanov, P.[Pavlo],
Alvarez, J.[Jose],
Data-free Knowledge Distillation for Object Detection,
WACV21(3288-3297)
IEEE DOI
2106
Knowledge engineering, Training, Image synthesis,
Neural networks, Object detection
BibRef
Kothandaraman, D.[Divya],
Nambiar, A.[Athira],
Mittal, A.[Anurag],
Domain Adaptive Knowledge Distillation for Driving Scene Semantic
Segmentation,
WACVW21(134-143) Autonomous Vehicle Vision
IEEE DOI
2105
Knowledge engineering, Adaptation models, Image segmentation,
Semantics, Memory management
BibRef
Kushawaha, R.K.[Ravi Kumar],
Kumar, S.[Saurabh],
Banerjee, B.[Biplab],
Velmurugan, R.[Rajbabu],
Distilling Spikes: Knowledge Distillation in Spiking Neural Networks,
ICPR21(4536-4543)
IEEE DOI
2105
Knowledge engineering, Training, Image coding,
Computational modeling, Artificial neural networks,
Hardware
BibRef
Sarfraz, F.[Fahad],
Arani, E.[Elahe],
Zonooz, B.[Bahram],
Knowledge Distillation Beyond Model Compression,
ICPR21(6136-6143)
IEEE DOI
2105
Training, Knowledge engineering, Neural networks,
Network architecture, Collaborative work, Robustness
BibRef
Ahmed, W.[Waqar],
Zunino, A.[Andrea],
Morerio, P.[Pietro],
Murino, V.[Vittorio],
Compact CNN Structure Learning by Knowledge Distillation,
ICPR21(6554-6561)
IEEE DOI
2105
Training, Learning systems, Knowledge engineering,
Network architecture, Predictive models
BibRef
Ma, J.X.[Jia-Xin],
Yonetani, R.[Ryo],
Iqbal, Z.[Zahid],
Adaptive Distillation for Decentralized Learning from Heterogeneous
Clients,
ICPR21(7486-7492)
IEEE DOI
2105
Learning systems, Adaptation models, Visualization,
Biomedical equipment, Medical services, Collaborative work, Data models
BibRef
Xu, Y.[Yi],
Pu, J.[Jian],
Zhao, H.[Hui],
Knowledge Distillation with a Precise Teacher and Prediction with
Abstention,
ICPR21(9000-9006)
IEEE DOI
2105
Knowledge engineering, Supervised learning, Benchmark testing,
Predictive models
BibRef
Tsunashima, H.[Hideki],
Kataoka, H.[Hirokatsu],
Yamato, J.J.[Jun-Ji],
Chen, Q.[Qiu],
Morishima, S.[Shigeo],
Adversarial Knowledge Distillation for a Compact Generator,
ICPR21(10636-10643)
IEEE DOI
2105
Training, Image resolution, MIMICs, Generators
BibRef
Zhang, Z.X.[Zhe-Xi],
Zhu, W.[Wei],
Yan, J.C.[Jun-Chi],
Gao, P.[Peng],
Xie, G.T.[Guo-Tong],
Automatic Student Network Search for Knowledge Distillation,
ICPR21(2446-2453)
IEEE DOI
2105
Knowledge engineering, Performance evaluation,
Computational modeling, Bit error rate, Neural networks,
Natural language processing
BibRef
Kim, J.H.[Jang-Ho],
Hyun, M.S.[Min-Sung],
Chung, I.[Inseop],
Kwak, N.[Nojun],
Feature Fusion for Online Mutual Knowledge Distillation,
ICPR21(4619-4625)
IEEE DOI
2105
Neural networks, Education, Performance gain, Pattern recognition
BibRef
Mitsuno, K.[Kakeru],
Nomura, Y.[Yuichiro],
Kurita, T.[Takio],
Channel Planting for Deep Neural Networks using Knowledge
Distillation,
ICPR21(7573-7579)
IEEE DOI
2105
Training, Knowledge engineering,
Heuristic algorithms, Neural networks, Computer architecture,
Network architecture
BibRef
Finogeev, E.,
Gorbatsevich, V.,
Moiseenko, A.,
Vizilter, Y.,
Vygolov, O.,
Knowledge Distillation Using GANs for Fast Object Detection,
ISPRS20(B2:583-588).
DOI Link
2012
BibRef
Sadhukhan, R.,
Saha, A.,
Mukhopadhyay, J.,
Patra, A.,
Knowledge Distillation Inspired Fine-Tuning of Tucker Decomposed CNNs
and Adversarial Robustness Analysis,
ICIP20(1876-1880)
IEEE DOI
2011
Robustness, Knowledge engineering, Convolution, Tensile stress,
Neural networks, Perturbation methods, Acceleration,
Adversarial Robustness
BibRef
Cui, W.,
Li, X.,
Huang, J.,
Wang, W.,
Wang, S.,
Chen, J.,
Substitute Model Generation for Black-Box Adversarial Attack Based on
Knowledge Distillation,
ICIP20(648-652)
IEEE DOI
2011
Perturbation methods, Task analysis, Training,
Computational modeling, Approximation algorithms,
black-box models
BibRef
Xu, K.R.[Kun-Ran],
Rui, L.[Lai],
Li, Y.S.[Yi-Shi],
Gu, L.[Lin],
Feature Normalized Knowledge Distillation for Image Classification,
ECCV20(XXV:664-680).
Springer DOI
2011
BibRef
Yang, Y.,
Qiu, J.,
Song, M.,
Tao, D.,
Wang, X.,
Distilling Knowledge From Graph Convolutional Networks,
CVPR20(7072-7081)
IEEE DOI
2008
Knowledge engineering, Task analysis,
Computational modeling, Computer science, Training, Neural networks
BibRef
Yun, J.S.[Ju-Seung],
Kim, B.[Byungjoo],
Kim, J.[Junmo],
Weight Decay Scheduling and Knowledge Distillation for Active Learning,
ECCV20(XXVI:431-447).
Springer DOI
2011
BibRef
Li, C.L.[Chang-Lin],
Tang, T.[Tao],
Wang, G.R.[Guang-Run],
Peng, J.F.[Jie-Feng],
Wang, B.[Bing],
Liang, X.D.[Xiao-Dan],
Chang, X.J.[Xiao-Jun],
BossNAS: Exploring Hybrid CNN-transformers with Block-wisely
Self-supervised Neural Architecture Search,
ICCV21(12261-12271)
IEEE DOI
2203
Training, Visualization, Correlation, Architecture,
Computational modeling, Sociology, Computer architecture,
Representation learning
BibRef
Li, C.L.[Chang-Lin],
Peng, J.F.[Jie-Feng],
Yuan, L.C.[Liu-Chun],
Wang, G.R.[Guang-Run],
Liang, X.D.[Xiao-Dan],
Lin, L.[Liang],
Chang, X.J.[Xiao-Jun],
Block-Wisely Supervised Neural Architecture Search With Knowledge
Distillation,
CVPR20(1986-1995)
IEEE DOI
2008
Computer architecture, Network architecture,
Knowledge engineering, Training, DNA, Convergence, Feature extraction
BibRef
Wei, L.H.[Long-Hui],
Xiao, A.[An],
Xie, L.X.[Ling-Xi],
Zhang, X.P.[Xiao-Peng],
Chen, X.[Xin],
Tian, Q.[Qi],
Circumventing Outliers of Autoaugment with Knowledge Distillation,
ECCV20(III:608-625).
Springer DOI
2012
BibRef
Walawalkar, D.[Devesh],
Shen, Z.Q.[Zhi-Qiang],
Savvides, M.[Marios],
Online Ensemble Model Compression Using Knowledge Distillation,
ECCV20(XIX:18-35).
Springer DOI
2011
BibRef
Xiang, L.Y.[Liu-Yu],
Ding, G.G.[Gui-Guang],
Han, J.G.[Jun-Gong],
Learning From Multiple Experts: Self-paced Knowledge Distillation for
Long-tailed Classification,
ECCV20(V:247-263).
Springer DOI
2011
BibRef
Zhou, B.[Brady],
Kalra, N.[Nimit],
Krähenbühl, P.[Philipp],
Domain Adaptation Through Task Distillation,
ECCV20(XXVI:664-680).
Springer DOI
2011
BibRef
Li, Z.[Zheng],
Huang, Y.[Ying],
Chen, D.F.[De-Fang],
Luo, T.[Tianren],
Cai, N.[Ning],
Pan, Z.G.[Zhi-Geng],
Online Knowledge Distillation via Multi-branch Diversity Enhancement,
ACCV20(IV:318-333).
Springer DOI
2103
BibRef
Ye, H.J.[Han-Jia],
Lu, S.[Su],
Zhan, D.C.[De-Chuan],
Distilling Cross-Task Knowledge via Relationship Matching,
CVPR20(12393-12402)
IEEE DOI
2008
Task analysis, Neural networks, Training, Knowledge engineering,
Predictive models, Stochastic processes, Temperature measurement
BibRef
Yao, A.B.[An-Bang],
Sun, D.W.[Da-Wei],
Knowledge Transfer via Dense Cross-layer Mutual-distillation,
ECCV20(XV:294-311).
Springer DOI
2011
BibRef
Yue, K.Y.[Kai-Yu],
Deng, J.F.[Jiang-Fan],
Zhou, F.[Feng],
Matching Guided Distillation,
ECCV20(XV:312-328).
Springer DOI
2011
BibRef
Zhang, Y.C.[You-Cai],
Lan, Z.H.[Zhong-Hao],
Dai, Y.C.[Yu-Chen],
Zeng, F.G.[Fan-Gao],
Bai, Y.[Yan],
Chang, J.[Jie],
Wei, Y.C.[Yi-Chen],
Prime-aware Adaptive Distillation,
ECCV20(XIX:658-674).
Springer DOI
2011
Student-Teacher learning.
BibRef
Xu, G.D.[Guo-Dong],
Liu, Z.W.[Zi-Wei],
Li, X.X.[Xiao-Xiao],
Loy, C.C.[Chen Change],
Knowledge Distillation Meets Self-Supervision,
ECCV20(IX:588-604).
Springer DOI
2011
Extracting the dark knowledge from a teacher network to guide
the learning of a student network, for transfer learning.
BibRef
Li, X.J.[Xiao-Jie],
Wu, J.L.[Jian-Long],
Fang, H.Y.[Hong-Yu],
Liao, Y.[Yue],
Wang, F.[Fei],
Qian, C.[Chen],
Local Correlation Consistency for Knowledge Distillation,
ECCV20(XII:18-33).
Springer DOI
2010
Knowledge extraction from the teacher network plays a critical role in
the knowledge distillation task to improve the performance of the
student network.
BibRef
Passalis, N.[Nikolaos],
Tzelepi, M.[Maria],
Tefas, A.[Anastasios],
Heterogeneous Knowledge Distillation Using Information Flow Modeling,
CVPR20(2336-2345)
IEEE DOI
2008
From a complex teacher to a smaller student.
Training, Neural networks, Knowledge engineering, Data models,
Convergence, Data mining, Transforms
BibRef
Chen, Z.L.[Zai-Liang],
Zheng, X.X.[Xian-Xian],
Shen, H.L.[Hai-Lan],
Zeng, Z.Y.[Zi-Yang],
Zhou, Y.K.[Yu-Kun],
Zhao, R.C.[Rong-Chang],
Improving Knowledge Distillation via Category Structure,
ECCV20(XXVIII:205-219).
Springer DOI
2011
Training the student to mimic the teacher alone does not capture the category structure.
BibRef
Wang, D.Y.[De-Yu],
Wen, D.[Dongchao],
Liu, J.J.[Jun-Jie],
Tao, W.[Wei],
Chen, T.W.[Tse-Wei],
Osa, K.[Kinya],
Kato, M.[Masami],
Fully Supervised and Guided Distillation for One-stage Detectors,
ACCV20(III:171-188).
Springer DOI
2103
BibRef
Itsumi, H.,
Beye, F.,
Shinohara, Y.,
Iwai, T.,
Training With Cache:
Specializing Object Detectors From Live Streams Without Overfitting,
ICIP20(1976-1980)
IEEE DOI
2011
Training, Data models, Solid modeling, Adaptation models,
Training data, Streaming media, Legged locomotion, Online training,
Knowledge distillation
BibRef
Liu, B.L.[Ben-Lin],
Rao, Y.M.[Yong-Ming],
Lu, J.W.[Ji-Wen],
Zhou, J.[Jie],
Hsieh, C.J.[Cho-Jui],
Metadistiller:
Network Self-boosting via Meta-learned Top-down Distillation,
ECCV20(XIV:694-709).
Springer DOI
2011
BibRef
Choi, Y.,
Choi, J.,
El-Khamy, M.,
Lee, J.,
Data-Free Network Quantization With Adversarial Knowledge
Distillation,
EDLCV20(3047-3057)
IEEE DOI
2008
Generators, Quantization (signal), Training,
Computational modeling, Data models, Machine learning, Data privacy
BibRef
de Vieilleville, F.,
Lagrange, A.,
Ruiloba, R.,
May, S.,
Towards Distillation of Deep Neural Networks for Satellite On-board
Image Segmentation,
ISPRS20(B2:1553-1559).
DOI Link
2012
BibRef
Wang, X.B.[Xiao-Bo],
Fu, T.Y.[Tian-Yu],
Liao, S.C.[Sheng-Cai],
Wang, S.[Shuo],
Lei, Z.[Zhen],
Mei, T.[Tao],
Exclusivity-Consistency Regularized Knowledge Distillation for Face
Recognition,
ECCV20(XXIV:325-342).
Springer DOI
2012
BibRef
Guan, Y.S.[Yu-Shuo],
Zhao, P.Y.[Peng-Yu],
Wang, B.X.[Bing-Xuan],
Zhang, Y.X.[Yuan-Xing],
Yao, C.[Cong],
Bian, K.G.[Kai-Gui],
Tang, J.[Jian],
Differentiable Feature Aggregation Search for Knowledge Distillation,
ECCV20(XVII:469-484).
Springer DOI
2011
BibRef
Gu, J.D.[Jin-Dong],
Wu, Z.L.[Zhi-Liang],
Tresp, V.[Volker],
Introspective Learning by Distilling Knowledge from Online
Self-explanation,
ACCV20(IV:36-52).
Springer DOI
2103
BibRef
Guo, Q.S.[Qiu-Shan],
Wang, X.J.[Xin-Jiang],
Wu, Y.C.[Yi-Chao],
Yu, Z.P.[Zhi-Peng],
Liang, D.[Ding],
Hu, X.L.[Xiao-Lin],
Luo, P.[Ping],
Online Knowledge Distillation via Collaborative Learning,
CVPR20(11017-11026)
IEEE DOI
2008
Knowledge engineering, Training, Collaborative work,
Perturbation methods, Collaboration, Neural networks, Logic gates
BibRef
Li, T.,
Li, J.,
Liu, Z.,
Zhang, C.,
Few Sample Knowledge Distillation for Efficient Network Compression,
CVPR20(14627-14635)
IEEE DOI
2008
Training, Tensile stress, Knowledge engineering, Convolution,
Neural networks, Computational modeling, Standards
BibRef
Wang, D.,
Li, Y.,
Wang, L.,
Gong, B.,
Neural Networks Are More Productive Teachers Than Human Raters:
Active Mixup for Data-Efficient Knowledge Distillation From a
Blackbox Model,
CVPR20(1495-1504)
IEEE DOI
2008
Neural networks, Computational modeling, Data models, Training,
Knowledge engineering, Visualization, Manifolds
BibRef
Farhadi, M.[Mohammad],
Yang, Y.Z.[Ye-Zhou],
TKD: Temporal Knowledge Distillation for Active Perception,
WACV20(942-951)
IEEE DOI
2006
Code, Object Detection.
WWW Link. Temporal knowledge distillation applied across multiple frames.
Adaptation models, Object detection, Visualization,
Computational modeling, Task analysis, Training, Feature extraction
BibRef
Seddik, M.E.A.,
Essafi, H.,
Benzine, A.,
Tamaazousti, M.,
Lightweight Neural Networks From PCA LDA Based Distilled Dense Neural
Networks,
ICIP20(3060-3064)
IEEE DOI
2011
Neural networks, Principal component analysis,
Computational modeling, Training, Machine learning,
Lightweight Networks
BibRef
Tung, F.[Fred],
Mori, G.[Greg],
Similarity-Preserving Knowledge Distillation,
ICCV19(1365-1374)
IEEE DOI
2004
learning (artificial intelligence), neural nets,
semantic networks, Task analysis
BibRef
Zhang, M.Y.[Man-Yuan],
Song, G.L.[Guang-Lu],
Zhou, H.[Hang],
Liu, Y.[Yu],
Discriminability Distillation in Group Representation Learning,
ECCV20(X:1-19).
Springer DOI
2011
BibRef
Jin, X.[Xiao],
Peng, B.Y.[Bao-Yun],
Wu, Y.C.[Yi-Chao],
Liu, Y.[Yu],
Liu, J.H.[Jia-Heng],
Liang, D.[Ding],
Yan, J.J.[Jun-Jie],
Hu, X.L.[Xiao-Lin],
Knowledge Distillation via Route Constrained Optimization,
ICCV19(1345-1354)
IEEE DOI
2004
face recognition, image classification,
learning (artificial intelligence), neural nets, optimisation,
Neural networks
BibRef
Mullapudi, R.T.,
Chen, S.,
Zhang, K.,
Ramanan, D.,
Fatahalian, K.,
Online Model Distillation for Efficient Video Inference,
ICCV19(3572-3581)
IEEE DOI
2004
convolutional neural nets, image segmentation,
inference mechanisms, learning (artificial intelligence),
Cameras
BibRef
Zhang, L.,
Song, J.,
Gao, A.,
Chen, J.,
Bao, C.,
Ma, K.,
Be Your Own Teacher: Improve the Performance of Convolutional Neural
Networks via Self Distillation,
ICCV19(3712-3721)
IEEE DOI
2004
convolutional neural nets, learning (artificial intelligence),
knowledge distillation, student neural networks,
Computational modeling
BibRef
Cho, J.H.,
Hariharan, B.,
On the Efficacy of Knowledge Distillation,
ICCV19(4793-4801)
IEEE DOI
2004
learning (artificial intelligence), neural nets, Probability distribution,
teacher architectures, knowledge distillation performance.
BibRef
Peng, B.,
Jin, X.,
Li, D.,
Zhou, S.,
Wu, Y.,
Liu, J.,
Zhang, Z.,
Liu, Y.,
Correlation Congruence for Knowledge Distillation,
ICCV19(5006-5015)
IEEE DOI
2004
correlation methods, face recognition, image classification,
learning (artificial intelligence), instance-level information,
Knowledge transfer
BibRef
Vongkulbhisal, J.[Jayakorn],
Vinayavekhin, P.[Phongtharin],
Visentini-Scarzanella, M.[Marco],
Unifying Heterogeneous Classifiers With Distillation,
CVPR19(3170-3179).
IEEE DOI
2002
BibRef
Yan, M.,
Zhao, M.,
Xu, Z.,
Zhang, Q.,
Wang, G.,
Su, Z.,
VarGFaceNet: An Efficient Variable Group Convolutional Neural Network
for Lightweight Face Recognition,
LFR19(2647-2654)
IEEE DOI
2004
Code, Face Recognition.
WWW Link. convolutional neural nets, face recognition,
learning (artificial intelligence), student model, teacher model,
knowledge distillation
BibRef
Yoshioka, K.,
Lee, E.,
Wong, S.,
Horowitz, M.,
Dataset Culling: Towards Efficient Training of Distillation-Based
Domain Specific Models,
ICIP19(3237-3241)
IEEE DOI
1910
Object Detection, Training Efficiency, Distillation,
Dataset Culling, Deep Learning
BibRef
Yang, C.L.[Cheng-Lin],
Xie, L.X.[Ling-Xi],
Su, C.[Chi],
Yuille, A.L.[Alan L.],
Snapshot Distillation: Teacher-Student Optimization in One Generation,
CVPR19(2854-2863).
IEEE DOI
2002
BibRef
Kundu, J.N.,
Lakkakula, N.,
Radhakrishnan, V.B.,
UM-Adapt: Unsupervised Multi-Task Adaptation Using Adversarial
Cross-Task Distillation,
ICCV19(1436-1445)
IEEE DOI
2004
generalisation (artificial intelligence), image classification,
object detection, unsupervised learning, task-transferability,
Adaptation models
BibRef
Park, W.[Wonpyo],
Kim, D.J.[Dong-Ju],
Lu, Y.[Yan],
Cho, M.[Minsu],
Relational Knowledge Distillation,
CVPR19(3962-3971).
IEEE DOI
2002
BibRef
Liu, Y.F.[Yu-Fan],
Cao, J.J.[Jia-Jiong],
Li, B.[Bing],
Yuan, C.F.[Chun-Feng],
Hu, W.M.[Wei-Ming],
Li, Y.X.[Yang-Xi],
Duan, Y.Q.[Yun-Qiang],
Knowledge Distillation via Instance Relationship Graph,
CVPR19(7089-7097).
IEEE DOI
2002
BibRef
Ahn, S.S.[Sung-Soo],
Hu, S.X.[Shell Xu],
Damianou, A.[Andreas],
Lawrence, N.D.[Neil D.],
Dai, Z.W.[Zhen-Wen],
Variational Information Distillation for Knowledge Transfer,
CVPR19(9155-9163).
IEEE DOI
2002
BibRef
Minami, S.[Soma],
Yamashita, T.[Takayoshi],
Fujiyoshi, H.[Hironobu],
Gradual Sampling Gate for Bidirectional Knowledge Distillation,
MVA19(1-6)
DOI Link
1911
Transfers knowledge from a large pre-trained network to a smaller one.
data compression, learning (artificial intelligence),
neural nets, gradual sampling gate,
Power markets
BibRef
Chen, W.C.[Wei-Chun],
Chang, C.C.[Chia-Che],
Lee, C.R.[Che-Rung],
Knowledge Distillation with Feature Maps for Image Classification,
ACCV18(III:200-215).
Springer DOI
1906
BibRef
Hou, S.H.[Sai-Hui],
Pan, X.Y.[Xin-Yu],
Loy, C.C.[Chen Change],
Wang, Z.L.[Zi-Lei],
Lin, D.H.[Da-Hua],
Lifelong Learning via Progressive Distillation and Retrospection,
ECCV18(III: 452-467).
Springer DOI
1810
BibRef
Pintea, S.L.[Silvia L.],
Liu, Y.[Yue],
van Gemert, J.C.[Jan C.],
Recurrent Knowledge Distillation,
ICIP18(3393-3397)
IEEE DOI
1809
A small network learns from a larger network.
Computational modeling, Memory management, Training, Color,
Convolution, Road transportation, Knowledge distillation,
recurrent layers
BibRef
Lee, S.H.[Seung Hyun],
Kim, D.H.[Dae Ha],
Song, B.C.[Byung Cheol],
Self-supervised Knowledge Distillation Using Singular Value
Decomposition,
ECCV18(VI: 339-354).
Springer DOI
1810
BibRef
Yim, J.,
Joo, D.,
Bae, J.,
Kim, J.,
A Gift from Knowledge Distillation: Fast Optimization, Network
Minimization and Transfer Learning,
CVPR17(7130-7138)
IEEE DOI
1711
Feature extraction, Knowledge engineering,
Knowledge transfer, Optimization, Training
BibRef
Gupta, S.[Saurabh],
Hoffman, J.[Judy],
Malik, J.[Jitendra],
Cross Modal Distillation for Supervision Transfer,
CVPR16(2827-2836)
IEEE DOI
1612
BibRef
Chapter on Matching and Recognition Using Volumes, High Level Vision Techniques, Invariants continues in
Explainable Artificial Intelligence.