Anton Rodomanov
Verified email at uclouvain.be
Title · Cited by · Year
Putting MRFs on a tensor train
A Novikov, A Rodomanov, A Osokin, D Vetrov
International Conference on Machine Learning, 811-819, 2014
Cited by 50 · 2014
Greedy quasi-Newton methods with explicit superlinear convergence
A Rodomanov, Y Nesterov
SIAM Journal on Optimization 31 (1), 785-811, 2021
Cited by 43 · 2021
Rates of superlinear convergence for classical quasi-Newton methods
A Rodomanov, Y Nesterov
Mathematical Programming, 1-32, 2021
Cited by 42 · 2021
A superlinearly-convergent proximal Newton-type method for the optimization of finite sums
A Rodomanov, D Kropotov
International Conference on Machine Learning, 2597-2605, 2016
Cited by 42 · 2016
New Results on Superlinear Convergence of Classical Quasi-Newton Methods
A Rodomanov, Y Nesterov
Journal of Optimization Theory and Applications 188, 744-769, 2021
Cited by 41 · 2021
Primal-dual method for searching equilibrium in hierarchical congestion population games
P Dvurechensky, A Gasnikov, E Gasnikova, S Matsievsky, A Rodomanov, ...
arXiv preprint arXiv:1606.08988, 2016
Cited by 37 · 2016
A randomized coordinate descent method with volume sampling
A Rodomanov, D Kropotov
SIAM Journal on Optimization 30 (3), 1878-1904, 2020
Cited by 9 · 2020
Smoothness parameter of power of Euclidean norm
A Rodomanov, Y Nesterov
Journal of Optimization Theory and Applications 185, 303-326, 2020
Cited by 7 · 2020
Subgradient ellipsoid method for nonsmooth convex problems
A Rodomanov, Y Nesterov
Mathematical Programming 199 (1-2), 305-341, 2023
Cited by 2 · 2023
Quasi-Newton Methods with Provable Efficiency Guarantees
A Rodomanov
PhD thesis, UCL-Université Catholique de Louvain, 2022
Cited by 1 · 2022
Polynomial Preconditioning for Gradient Methods
N Doikov, A Rodomanov
arXiv preprint arXiv:2301.13194, 2023
2023
Randomized Minimization of Eigenvalue Functions
Y Nesterov, A Rodomanov
arXiv preprint arXiv:2301.08352, 2023
2023
Linear Coupling of Gradient and Mirror Descent: Version for Composite Functions with Adaptive Estimation of the Lipschitz Constant
A Rodomanov
2016