Workshops
Workshop on Network Optimization for Large AI Models
Workshop on Network Optimization for Large AI Models — Session 1
Network Bursts and Bottlenecks: Challenges of Distributed DNN Training Traffic
Speaker: Jörg Liebeherr (University of Toronto)
Communication-Efficient Online Distributed Optimization for Federated Learning
Speaker: Ben Liang (University of Toronto)
Optimizing Network Communications of Large-Scale AI Workloads Using Datacenter Multicast
Speaker: Mohamed Hefeeda (Simon Fraser University)
Workshop on Network Optimization for Large AI Models — Session 2
Computers to Data Centers: Data and Computation Placement for Learning-Centric Applications
Speaker: Jianping Pan (University of Victoria)
Panel: Network Optimization for Large AI Models
Moderator: Baochun Li (University of Toronto)
2. How has the rapid growth in AI model sizes impacted the design of large-scale distributed clusters? What specific challenges arise when scaling up clusters?
3. Can you share insights on the interplay between network optimization and other optimization techniques such as model parallelism, data parallelism, and pipelining in distributed AI training?
4. Looking ahead, what are the most pressing research directions for innovation in network optimization for large-scale AI clusters?