Arto Maranjyan


I’m a PhD student at KAUST, advised by Prof. Peter Richtárik. My research focuses on optimization for machine learning (ML) and federated learning (FL), where I develop distributed and randomized optimization algorithms. I am currently working on system heterogeneity in distributed ML and FL, with an emphasis on asynchronous methods. My recent work includes:

  • Ringmaster ASGD: The first Asynchronous SGD method achieving optimal time complexity under heterogeneous and dynamic worker computation times, meeting theoretical lower bounds.

  • ATA: An Adaptive Task Allocation method that efficiently manages resources in distributed ML, adapting to heterogeneous worker speeds and performing optimally without prior knowledge of those speeds.

  • MindFlayer: An efficient parallel SGD framework designed to handle heterogeneous and random worker computation times with heavy-tailed distributions.
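
The setting these methods address can be illustrated with a toy, event-driven simulation. This is only a sketch of vanilla Asynchronous SGD on a one-dimensional quadratic, not Ringmaster ASGD, ATA, or MindFlayer; the function `async_sgd` and all of its parameters (worker mean compute times, learning rate, update count) are hypothetical choices for illustration.

```python
import heapq
import random

def grad(x):
    # Gradient of the toy objective f(x) = 0.5 * x^2.
    return x

def async_sgd(num_workers=4, num_updates=200, lr=0.1, seed=0):
    """Event-driven simulation of vanilla Asynchronous SGD.

    Each worker grabs the current model, computes a gradient (taking a
    random, worker-dependent amount of time), and sends it to the server,
    which applies it immediately. By then the model may have moved on,
    so the applied gradient can be stale.
    """
    rng = random.Random(seed)
    # Heterogeneous workers: worker w needs mean_times[w] time per gradient.
    mean_times = [0.5 * (w + 1) for w in range(num_workers)]
    x = 10.0
    # Min-heap of (finish_time, worker_id, gradient_at_dispatch).
    events = []
    for w in range(num_workers):
        t = rng.expovariate(1.0 / mean_times[w])
        heapq.heappush(events, (t, w, grad(x)))
    for _ in range(num_updates):
        t, w, g = heapq.heappop(events)
        x -= lr * g  # server applies the (possibly stale) gradient
        # Worker w immediately starts computing at the fresh model.
        heapq.heappush(events, (t + rng.expovariate(1.0 / mean_times[w]), w, grad(x)))
    return x

print(f"final x: {async_sgd():.6f}")
```

Despite the staleness caused by slower workers, the iterate contracts toward the minimizer at 0 for a small enough step size; the work above studies when and how such asynchronous updates can be made provably time-optimal under heterogeneous compute times.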

Before starting my PhD, I earned my MSc and BSc from Yerevan State University. During my bachelor’s studies, I co-authored several papers in harmonic analysis under the guidance of Prof. Martin Grigoryan.

Outside of academics, I enjoy dancing bachata, playing board games, ultimate frisbee, and foosball.

Recent News

Feb 10, 2025 I’ll be giving a talk at the AMCS/STAT graduate seminar at KAUST on February 27, presenting our paper, Ringmaster ASGD: The First Asynchronous SGD with Optimal Time Complexity.
Feb 04, 2025 New paper out: ATA: Adaptive Task Allocation for Efficient Resource Management in Distributed Machine Learning. Co-authored with El Mehdi Saad, Peter Richtárik, and Francesco Orabona. [LinkedIn post]
Jan 28, 2025 New paper out: Ringmaster ASGD: The First Asynchronous SGD with Optimal Time Complexity. Co-authored with Alexander Tyurin and Peter Richtárik. [LinkedIn post]
Jan 23, 2025 Our paper LoCoDL: Communication-Efficient Distributed Learning with Local Training and Compression, by Laurent Condat, Peter Richtárik, and me, has been accepted to ICLR 2025 as a Spotlight! [LinkedIn post]
Nov 23, 2024 I had the pleasure of giving a talk at the Apple MLR seminar, thanks to an invitation from Samy Bengio. It was an amazing opportunity to share our work on MindFlayer. Feel free to check out the talk slides here.
Oct 17, 2024 I will be giving a talk at the International Conference on Algebra, Logic, and their Applications on October 18, presenting our paper MindFlayer: Efficient Asynchronous Parallel SGD in the Presence of Heterogeneous and Random Worker Compute Times.
Oct 17, 2024 I am reviewing for SIAM Journal on Mathematics of Data Science (SIMODS).

Selected Publications

  1. ATA: Adaptive Task Allocation for Efficient Resource Management in Distributed Machine Learning
    Artavazd Maranjyan, El Mehdi Saad, Peter Richtárik, and Francesco Orabona
    arXiv:2502.00775, 2025
  2. Ringmaster ASGD: The First Asynchronous SGD with Optimal Time Complexity
    Artavazd Maranjyan, Alexander Tyurin, and Peter Richtárik
    arXiv:2501.16168, 2025
  3. MindFlayer: Efficient Asynchronous Parallel SGD in the Presence of Heterogeneous and Random Worker Compute Times
    Artavazd Maranjyan, Omar Shaikh Omar, and Peter Richtárik
    OPT 2024: Optimization for Machine Learning (NeurIPS workshop), 2024
  4. LoCoDL: Communication-Efficient Distributed Learning with Local Training and Compression
    Laurent Condat, Artavazd Maranjyan, and Peter Richtárik
    In ICLR 2025: The Thirteenth International Conference on Learning Representations, 2025
  5. Differentially Private Random Block Coordinate Descent
    Artavazd Maranjyan, Abdurakhmon Sadiev, and Peter Richtárik
    OPT 2024: Optimization for Machine Learning (NeurIPS workshop), 2024
  6. GradSkip: Communication-Accelerated Local Gradient Methods with Better Computational Complexity
    Artavazd Maranjyan, Mher Safaryan, and Peter Richtárik
    arXiv:2210.16402, 2022