Yu, R.N.[Ruo-Nan],
Liu, S.H.[Song-Hua],
Wang, X.C.[Xin-Chao],
Dataset Distillation: A Comprehensive Review,
PAMI(46), No. 1, January 2024, pp. 150-170.
IEEE DOI
2312
Dataset condensation. Reduce to what matters.
BibRef
Lei, S.[Shiye],
Tao, D.C.[Da-Cheng],
A Comprehensive Survey of Dataset Distillation,
PAMI(46), No. 1, January 2024, pp. 17-32.
IEEE DOI
2312
BibRef
van Noord, N.[Nanne],
Prototype-based Dataset Comparison,
ICCV23(1944-1954)
IEEE DOI Code:
WWW Link.
2401
BibRef
Sajedi, A.[Ahmad],
Khaki, S.[Samir],
Amjadian, E.[Ehsan],
Liu, L.Z.[Lucy Z.],
Lawryshyn, Y.A.[Yuri A.],
Plataniotis, K.N.[Konstantinos N.],
DataDAM: Efficient Dataset Distillation with Attention Matching,
ICCV23(17051-17061)
IEEE DOI
2401
BibRef
Zhou, D.[Daquan],
Wang, K.[Kai],
Gu, J.Y.[Jian-Yang],
Peng, X.Y.[Xiang-Yu],
Lian, D.Z.[Dong-Ze],
Zhang, Y.F.[Yi-Fan],
You, Y.[Yang],
Feng, J.S.[Jia-Shi],
Dataset Quantization,
ICCV23(17159-17170)
IEEE DOI
2401
BibRef
Liu, Y.Q.[Yan-Qing],
Gu, J.Y.[Jian-Yang],
Wang, K.[Kai],
Zhu, Z.[Zheng],
Jiang, W.[Wei],
You, Y.[Yang],
DREAM: Efficient Dataset Distillation by Representative Matching,
ICCV23(17268-17278)
IEEE DOI
2401
BibRef
Liu, S.[Songhua],
Wang, X.C.[Xin-Chao],
Few-Shot Dataset Distillation via Translative Pre-Training,
ICCV23(18608-18618)
IEEE DOI
2401
BibRef
Mazumder, A.[Alokendu],
Baruah, T.[Tirthajit],
Singh, A.K.[Akash Kumar],
Murthy, P.K.[Pagadala Krishna],
Pattanaik, V.[Vishwajeet],
Rathore, P.[Punit],
DeepVAT: A Self-Supervised Technique for Cluster Assessment in Image
Datasets,
VIPriors23(187-195)
IEEE DOI
2401
BibRef
Zhang, L.[Lei],
Zhang, J.[Jie],
Lei, B.[Bowen],
Mukherjee, S.[Subhabrata],
Pan, X.[Xiang],
Zhao, B.[Bo],
Ding, C.[Caiwen],
Li, Y.[Yao],
Xu, D.[Dongkuan],
Accelerating Dataset Distillation via Model Augmentation,
CVPR23(11950-11959)
IEEE DOI
2309
smaller but efficient synthetic training datasets from large ones
BibRef
Cazenavette, G.[George],
Wang, T.Z.[Tong-Zhou],
Torralba, A.[Antonio],
Efros, A.A.[Alexei A.],
Zhu, J.Y.[Jun-Yan],
Generalizing Dataset Distillation via Deep Generative Prior,
CVPR23(3739-3748)
IEEE DOI
2309
BibRef
Wang, Z.J.[Zi-Jia],
Yang, W.B.[Wen-Bin],
Liu, Z.S.[Zhi-Song],
Chen, Q.[Qiang],
Ni, J.C.[Jia-Cheng],
Jia, Z.[Zhen],
Gift from Nature:
Potential Energy Minimization for Explainable Dataset Distillation,
MLCSA22(240-255).
Springer DOI
2307
BibRef
Cazenavette, G.[George],
Wang, T.Z.[Tong-Zhou],
Torralba, A.[Antonio],
Efros, A.A.[Alexei A.],
Zhu, J.Y.[Jun-Yan],
Dataset Distillation by Matching Training Trajectories,
CVPR22(10708-10717)
IEEE DOI
2210
BibRef
Earlier:
VDU22(4749-4758)
IEEE DOI
2210
Training, Visualization, Trajectory, Task analysis,
Unsupervised learning, Pattern matching,
Self- semi- meta- unsupervised learning
BibRef
Chapter on Matching and Recognition Using Volumes, High Level Vision Techniques, Invariants continues in
Fine Tuning, Fine-Tuning, Pre-Training, Zero-Shot, One-Shot .