Chen-Yu Ho
ByteDance Inc.
Verified email - Homepage
Cited by
Scaling Distributed Machine Learning with In-Network Aggregation
A Sapio, M Canini, CY Ho, J Nelson, P Kalnis, C Kim, A Krishnamurthy, ...
Proceedings of the 18th USENIX Symposium on Networked Systems Design and …, 2021
Natural compression for distributed deep learning
S Horváth, CY Ho, L Horváth, AN Sahu, M Canini, P Richtárik
Mathematical and Scientific Machine Learning, 129-141, 2022
GRACE: A compressed communication framework for distributed machine learning
H Xu, CY Ho, AM Abdelmoniem, A Dutta, EH Bergou, K Karatsenidis, ...
2021 IEEE 41st International Conference on Distributed Computing Systems …, 2021
On the discrepancy between the theoretical analysis and practical implementations of compressed communication for distributed deep learning
A Dutta, EH Bergou, AM Abdelmoniem, CY Ho, AN Sahu, M Canini, ...
Proceedings of the AAAI Conference on Artificial Intelligence 34 (04), 3817-3824, 2020
Efficient sparse collective communication and its application to accelerate distributed deep learning
J Fei, CY Ho, AN Sahu, M Canini, A Sapio
Proceedings of the ACM SIGCOMM 2021 Conference, 676-691, 2021
A Comprehensive Empirical Study of Heterogeneity in Federated Learning
AM Abdelmoniem, CY Ho, P Papageorgiou, M Canini
IEEE Internet of Things Journal, 2023
Tackling the Communication Bottlenecks of Distributed Deep Learning Training Workloads
Articles 1–7