The work titled "Nonnegative tensor completion: step-sizes for an accelerated variation of the stochastic gradient descent" by Athanasios Liavas, Ioannis-Marios Papagiannakos, and Christos Kolomvakis is licensed under a Creative Commons Attribution 4.0 International license.
Bibliographic Citation
A. P. Liavas, I. M. Papagiannakos and C. Kolomvakis, "Nonnegative tensor completion: step-sizes for an accelerated variation of the stochastic gradient descent," in Proceedings of the 30th European Signal Processing Conference (EUSIPCO 2022), Belgrade, Serbia, 2022, pp. 1976-1980, doi: 10.23919/EUSIPCO55093.2022.9909736.
https://doi.org/10.23919/EUSIPCO55093.2022.9909736
We consider the problem of nonnegative tensor completion. We adopt the alternating optimization framework and solve each nonnegative matrix least-squares subproblem via an accelerated variation of the stochastic gradient descent. The step-sizes used by the algorithm largely determine its behavior. We propose two new strategies for computing the step-sizes and experimentally test their effectiveness on both synthetic and real-world data.
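To make the setting concrete, the sketch below shows what an accelerated (Nesterov-style) stochastic gradient step for one nonnegative matrix least-squares subproblem, min over X >= 0 of ||A X - B||_F^2, could look like. This is only an illustrative sketch under assumed defaults, not the authors' algorithm: the fixed step_size and momentum parameters are placeholders standing in for the step-size strategies proposed in the paper, and the function name nesterov_sgd_nnls is hypothetical.

```python
import numpy as np

def nesterov_sgd_nnls(A, B, X0, batch_size=64, step_size=1e-3,
                      momentum=0.9, iters=200, seed=0):
    """Illustrative sketch (not the paper's method): accelerated SGD with
    a nonnegativity projection for min_{X >= 0} ||A X - B||_F^2.
    step_size and momentum are assumed constants; the paper studies
    how to choose the step-sizes."""
    rng = np.random.default_rng(seed)
    X = X0.copy()
    Y = X0.copy()                     # extrapolated (momentum) point
    m = A.shape[0]
    for _ in range(iters):
        idx = rng.choice(m, size=min(batch_size, m), replace=False)
        A_b, B_b = A[idx], B[idx]
        # stochastic gradient of (1/|batch|) ||A_b Y - B_b||_F^2 at the extrapolated point
        grad = (2.0 / len(idx)) * A_b.T @ (A_b @ Y - B_b)
        X_new = np.maximum(Y - step_size * grad, 0.0)   # gradient step + projection onto X >= 0
        Y = X_new + momentum * (X_new - X)              # Nesterov-style extrapolation
        X = X_new
    return X

# toy usage on synthetic nonnegative data
A = np.abs(np.random.randn(500, 20))
X_true = np.abs(np.random.randn(20, 10))
B = A @ X_true
X_hat = nesterov_sgd_nnls(A, B, X0=np.zeros((20, 10)))
```

In an alternating-optimization scheme for tensor completion, a routine of this kind would be called in turn for each factor matrix, with the sampled entries of the tensor defining the (stochastic) least-squares terms.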