Research Repository

Towards Faster Training Algorithms Exploiting Bandit Sampling From Convex to Strongly Convex Conditions

Zhou, Yangfan; Huang, Kaizhu; Cheng, Cheng; Wang, Xuguang; Hussain, Amir; Liu, Xin


Abstract

Training for deep learning and pattern recognition typically relies on convex and strongly convex optimization algorithms such as AdaBelief and SAdam. These algorithms must process many “uninformative” samples that could safely be ignored, incurring extra computation. To address this open problem, we design a bandit sampling method that makes these algorithms focus on “informative” samples during training. Our contribution is twofold: first, we propose a convex optimization algorithm with bandit sampling, termed AdaBeliefBS, and prove that it converges faster than its original version; second, we prove that bandit sampling also works well for strongly convex algorithms, and propose a generalized SAdam, called SAdamBS, that converges faster than SAdam. Finally, we conduct a series of experiments on various benchmark datasets to verify the fast convergence of the proposed algorithms.
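The abstract does not spell out the sampler's update rule, so as a loose illustration only, here is a generic EXP3-style bandit sampling sketch in Python. The function name, the multiplicative update, and the reward definition are all assumptions for illustration, not the paper's AdaBeliefBS/SAdamBS algorithms:

```python
import numpy as np

def bandit_sample_step(weights, per_sample_loss, batch_size, eta=0.1, rng=None):
    """One EXP3-style bandit step (illustrative, not the paper's method):
    draw a minibatch with probability proportional to per-sample weights,
    then multiplicatively boost the weights of the drawn samples in
    proportion to how 'informative' (high-loss) they were."""
    rng = rng if rng is not None else np.random.default_rng(0)
    probs = weights / weights.sum()
    idx = rng.choice(len(weights), size=batch_size, replace=False, p=probs)
    # Importance-weighted reward: dividing by the sampling probability keeps
    # the weight update unbiased with respect to the full dataset.
    rewards = per_sample_loss(idx) / (probs[idx] * len(weights))
    new_weights = weights.copy()
    new_weights[idx] *= np.exp(eta * rewards)
    return idx, new_weights
```

In a training loop, `per_sample_loss(idx)` would evaluate the current model on the selected samples; over time the growing weights steer the sampler toward high-loss ("informative") samples, which is the behavior the abstract attributes to bandit sampling.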

Journal Article Type Article
Acceptance Date Apr 23, 2022
Online Publication Date May 23, 2022
Publication Date 2023-04
Deposit Date Jul 8, 2022
Publicly Available Date Jul 8, 2022
Journal IEEE Transactions on Emerging Topics in Computational Intelligence
Print ISSN 2471-285X
Publisher Institute of Electrical and Electronics Engineers
Peer Reviewed Peer Reviewed
Volume 7
Issue 2
Pages 565-577
DOI https://doi.org/10.1109/tetci.2022.3171797
Keywords Bandit sampling, convex optimization, image processing, training algorithm
Public URL http://researchrepository.napier.ac.uk/Output/2885397

Files

Towards Faster Training Algorithms Exploiting Bandit Sampling From Convex to Strongly Convex Conditions (accepted version) (5.2 Mb)
PDF
