Theoretical Framework for Tempered Fractional Gradient Descent: Application to Breast Cancer Classification
arXiv.org Artificial Intelligence
This paper introduces Tempered Fractional Gradient Descent (TFGD), a novel optimization framework that combines fractional calculus with exponential tempering to enhance gradient-based learning. Traditional gradient descent methods often suffer from oscillatory updates and slow convergence in high-dimensional, noisy landscapes. TFGD addresses these limitations by incorporating a tempered memory mechanism, where historical gradients are weighted by fractional binomial coefficients $|w_j| = \left|\binom{\alpha}{j}\right|$ and exponentially decayed via a tempering parameter $\lambda$. Theoretical analysis establishes TFGD's convergence guarantees: in convex settings, it achieves an $\mathcal{O}(1/K)$ rate with alignment coefficient $d_{\alpha,\lambda} = (1 - e^{-\lambda})^{-\alpha}$, while stochastic variants attain $\mathcal{O}(1/k^\alpha)$ error decay. The algorithm maintains $\mathcal{O}(n)$ time complexity equivalent to SGD, with memory overhead scaling as $\mathcal{O}(d/\lambda)$ for parameter dimension $d$. Empirical validation on the Breast Cancer Wisconsin dataset demonstrates TFGD's superiority, achieving 98.25% test accuracy (vs. 92.11% for SGD) and $2\times$ faster convergence. The tempered memory mechanism proves particularly effective in medical classification tasks, where feature correlations benefit from stable gradient averaging. These results position TFGD as a robust alternative to conventional optimizers in both theoretical and applied machine learning.
Apr-29-2025
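The update rule sketched below illustrates the tempered memory mechanism the abstract describes: each historical gradient is weighted by a fractional binomial coefficient $\binom{\alpha}{j}$ damped by $e^{-\lambda j}$. This is a minimal illustration inferred from the abstract alone, not the paper's implementation; the hyperparameter values, the truncated memory window, and the exact way the weighted gradients are combined into a descent direction are all assumptions.

```python
import numpy as np

def tempered_weights(alpha, lam, memory):
    """Tempered fractional weights w_j = |binom(alpha, j)| * exp(-lam * j).

    Fractional binomial coefficients are built with the recurrence
    binom(alpha, j) = binom(alpha, j - 1) * (alpha - j + 1) / j.
    """
    c = np.empty(memory)
    c[0] = 1.0
    for j in range(1, memory):
        c[j] = c[j - 1] * (alpha - j + 1) / j
    return np.abs(c) * np.exp(-lam * np.arange(memory))

class TFGD:
    """Sketch of a tempered fractional gradient descent step.

    The descent direction is a tempered-fractional weighted sum of recent
    gradients over a truncated memory window (assumed; the paper may use
    a different normalization or window-length rule).
    """

    def __init__(self, lr=0.1, alpha=0.6, lam=1.0, memory=20):
        self.lr = lr
        self.w = tempered_weights(alpha, lam, memory)
        self.history = []  # most recent gradient first

    def step(self, params, grad):
        self.history.insert(0, grad)
        del self.history[len(self.w):]  # keep only the memory window
        direction = sum(w * g for w, g in zip(self.w, self.history))
        return params - self.lr * direction

# Toy usage: minimize f(x) = 0.5 * ||x||^2, whose gradient is x itself.
x = np.array([3.0, -2.0])
opt = TFGD()
for _ in range(100):
    x = opt.step(x, x)
```

Because the weights decay both fractionally and exponentially, the $j=0$ term dominates and the iteration behaves like SGD with a mildly smoothed gradient, which is consistent with the $\mathcal{O}(n)$ per-step cost and $\mathcal{O}(d/\lambda)$ memory overhead quoted above.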