Shaoduo Gan
Communication Compression for Decentralized Training
H Tang, S Gan, C Zhang, T Zhang, J Liu
NeurIPS 2018, 2018
Towards Demystifying Serverless Machine Learning Training
J Jiang*, S Gan*, Y Liu, F Wang, G Alonso, A Klimovic, A Singla, W Wu, ...
SIGMOD 2021, 2021
1-bit Adam: Communication Efficient Large-Scale Training with Adam's Convergence Speed
H Tang, S Gan, AA Awan, S Rajbhandari, C Li, X Lian, J Liu, C Zhang, ...
ICML 2021, 2021
Ease.ML: A Lifecycle Management System for Machine Learning
L Aguilar Melgar, D Dao, S Gan, NM Gürel, N Hollenstein, J Jiang, ...
CIDR 2021, 2021
Bagua: Scaling up Distributed Learning with System Relaxations
S Gan, X Lian, R Wang, J Chang, C Liu, H Shi, S Zhang, X Li, T Sun, ...
VLDB 2022, 2022
FRuDA: Framework for Distributed Adversarial Domain Adaptation
S Gan, A Mathur, A Isopoussu, F Kawsar, N Berthouze, ND Lane
IEEE Transactions on Parallel and Distributed Systems 33 (11), 3153-3164, 2021
In-Database Machine Learning with CorgiPile: Stochastic Gradient Descent without Full Data Shuffle
L Xu, S Qiu, B Yuan, J Jiang, C Renggli, S Gan, K Kara, G Li, J Liu, W Wu, ...
SIGMOD 2022, 2022
Distributed Asynchronous Domain Adaptation: Towards Making Domain Adaptation More Practical in Real-World Systems
S Gan, A Mathur, A Isopoussu, N Berthouze, ND Lane, F Kawsar
Workshop on Systems for ML at NeurIPS 2019, 2019
A Systematic Evaluation of Machine Learning on Serverless Infrastructure
J Jiang, S Gan, B Du, G Alonso, A Klimovic, A Singla, W Wu, S Wang, ...
The VLDB Journal, 1-25, 2023