This presentation, aimed at a broad spectrum of researchers, highlights the latest advances in the acceleration of first-order optimization algorithms, a field that currently attracts the attention of many research teams worldwide. First-order methods, such as gradient descent and stochastic gradient descent, have gained significant popularity. The seminal development in this area is due to Yurii Nesterov, who in 1983 proposed a class of accelerated gradient methods that achieve the O(1/k²) global convergence rate on smooth convex problems, improving on the O(1/k) rate of plain gradient descent.
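
As a rough illustration of this speed-up (a minimal sketch of my own, not taken from the presentation; the quadratic objective, the step size 1/L, and the iteration count are all illustrative assumptions), the following Python snippet compares plain gradient descent with the standard momentum form of Nesterov's accelerated scheme on a strongly convex quadratic:

```python
import numpy as np

# Assumed test problem: f(x) = 0.5 x^T A x - b^T x with A symmetric positive definite,
# so f is smooth and strongly convex and the minimizer solves A x = b.
rng = np.random.default_rng(0)
n = 50
M = rng.standard_normal((n, n))
A = M.T @ M + np.eye(n)
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)      # exact minimizer, used only to measure error

def grad(x):
    return A @ x - b                # gradient of f

L = np.linalg.eigvalsh(A).max()     # Lipschitz constant of the gradient
step = 1.0 / L
iters = 500

# Plain gradient descent: x_{k+1} = x_k - (1/L) grad(x_k)
x = np.zeros(n)
for k in range(iters):
    x = x - step * grad(x)

# Nesterov's accelerated gradient (momentum form of the 1983 scheme):
#   y_k     = x_k + (k/(k+3)) (x_k - x_{k-1})
#   x_{k+1} = y_k - (1/L) grad(y_k)
xa, xa_prev = np.zeros(n), np.zeros(n)
for k in range(iters):
    y = xa + (k / (k + 3)) * (xa - xa_prev)
    xa, xa_prev = y - step * grad(y), xa

print("gradient descent error:", np.linalg.norm(x - x_star))
print("Nesterov error:        ", np.linalg.norm(xa - x_star))
```

On this example the accelerated iterates reach a markedly smaller error than gradient descent after the same number of gradient evaluations, mirroring the theoretical rate improvement.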