Abstract:
The proliferation of massive datasets has led to significant interest in distributed algorithms for solving large-scale machine learning problems. However, communication overhead is a major bottleneck that hampers the scalability of distributed machine learning systems. In this paper, we design two communication-efficient algorithms for distributed learning tasks. The first, named EF-SIGNGD, uses the 1-bit (sign-based) gradient quantization method to save communication bits. Moreover, the error feedback technique, i.e., incorporating the error made by the compression operator into the next step, is employed for the convergence guarantee. The second algorithm, called LE-SIGNGD, introduces a well-designed lazy gradient aggregation rule into EF-SIGNGD that can detect gradients with small changes and reuse the outdated information. LE-SIGNGD saves communication costs in both transmitted bits and communication rounds. Furthermore, we show that LE-SIGNGD is convergent under some mild assumptions. The effectiveness of the two proposed algorithms is demonstrated through experiments on both real and synthetic data.
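The abstract's first ingredient, sign quantization with error feedback, can be illustrated with a minimal worker-side sketch. This is an assumption-laden illustration of the general technique described in the abstract, not the authors' exact EF-SIGNGD algorithm: the function name `ef_sign_step` and the mean-magnitude scaling are our choices for the example.

```python
import numpy as np

def ef_sign_step(grad, error, lr):
    """One worker-side step of sign compression with error feedback.

    The worker folds the residual left over from the previous
    compression into the current gradient, transmits only the sign
    of each coordinate (1 bit, scaled by the mean magnitude so the
    update keeps a sensible scale), and stores the new residual so
    it is incorporated into the next step.
    """
    corrected = grad + error                 # fold in past compression error
    scale = np.mean(np.abs(corrected))       # one common 1-bit scaling choice
    compressed = scale * np.sign(corrected)  # what is actually transmitted
    new_error = corrected - compressed       # residual fed back next step
    update = -lr * compressed
    return update, new_error

# Toy usage: compress a small gradient vector with no prior error.
g = np.array([0.5, -0.2, 0.05])
upd, err = ef_sign_step(g, np.zeros(3), lr=0.1)
```

The error term guarantees that information discarded by the sign operator is not lost but merely delayed, which is what makes the convergence analysis go through.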
Document Information
Title: SIGNGD with Error Feedback Meets Lazily Aggregated Technique: Communication-Efficient Algorithms for Distributed Learning
Journal: Tsinghua Science and Technology (清华大学学报自然科学版(英文版))
Year, Volume (Issue): 2022, (1)
Section: REGULAR ARTICLES
Pages: 174-185 (12 pages)
Language: English
DOI: 10.26599/TST.2021.9010045
Journal Information
Tsinghua Science and Technology (清华大学学报自然科学版(英文版))
Frequency: bimonthly
ISSN: 1007-0214
CN: 11-3745/N
Format: 16开
Address: Room 908, Block B, Xueyan Building, Shuangqing Road, Haidian District, Beijing
Founded: 1996
Language: English