Improved Adam: Incorporating the Unified Conformable Fractional Derivative for Fractional-Order Momentum
Abstract
Deep neural networks (DNNs) are closely tied to the training algorithm, and the performance of the training process relies heavily on the choice of optimizer. A neural network optimizer is an algorithm used in deep learning (DL) to adjust the parameters of a network so as to minimize the loss function. Currently, Adam is one of the most popular optimizers owing to its stability and efficiency. In recent years, there has been growing interest in fractional-order momentum, which offers greater flexibility than integer-order momentum and has shown promising performance in updating deep network parameters. One emerging tool in fractional-order calculus is the Unified Conformable Fractional Derivative (UCFD), which has attracted extensive research attention. Motivated by this, the paper introduces an enhanced Adam optimizer that incorporates unified conformable fractional-order momentum, referred to as UCAdam. The method is trained and compared with the integer-order Adam on popular models and datasets for image classification tasks. The experiments indicate that the proposed optimizer converges effectively and outperforms Adam.
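The abstract does not state the UCAdam update rule, so the following is only a minimal NumPy sketch of how a conformable-derivative-style factor could enter Adam's momentum, using the standard conformable identity T_α f(t) = t^(1−α) f′(t). The function name `ucadam_step`, the placement of the fractional scaling, and the default α are illustrative assumptions, not the scheme proposed in the paper.

```python
# Illustrative sketch only: an Adam-style step whose gradient is scaled by a
# conformable-derivative factor |theta|^(1 - alpha). This is NOT the UCAdam
# rule from the paper; the scaling choice and names are hypothetical.
import numpy as np

def ucadam_step(theta, grad, m, v, t, lr=1e-3, alpha=0.9,
                beta1=0.9, beta2=0.999, eps=1e-8):
    """One illustrative update; alpha = 1 recovers standard Adam."""
    # Conformable-style scaling of the gradient: |theta|^(1 - alpha) * grad.
    frac_grad = np.abs(theta) ** (1.0 - alpha) * grad
    # Exponential moving averages of the (scaled) gradient and its square.
    m = beta1 * m + (1 - beta1) * frac_grad
    v = beta2 * v + (1 - beta2) * frac_grad ** 2
    # Bias correction, as in standard Adam.
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Parameter update.
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Tiny usage example: minimize f(theta) = sum(theta**2).
theta = np.array([1.5, -2.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 2001):
    grad = 2.0 * theta          # gradient of the quadratic loss
    theta, m, v = ucadam_step(theta, grad, m, v, t)
print(theta)                    # approaches the minimizer [0, 0]
```

Setting α = 1 makes the scaling factor equal to 1 elementwise, so the sketch reduces to the ordinary integer-order Adam baseline against which the paper compares.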
Article Details
This work is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License.