
Online confidence interval estimation for federated heterogeneous optimization

    Abstract: From a statistical viewpoint, it is essential to perform statistical inference in federated learning in order to understand the underlying data distribution. Due to heterogeneity in the number of local iterations and in the local datasets, traditional statistical inference methods are not directly applicable to federated learning. This paper studies how to construct confidence intervals for federated heterogeneous optimization problems. We introduce the rescaled federated averaging estimate and prove its consistency. For confidence interval estimation, we establish the asymptotic normality of the parameter estimate produced by our algorithm and show that the asymptotic covariance is inversely proportional to the client participation rate. We then propose an online confidence interval estimation method, separated plug-in via rescaled federated averaging, which can construct valid confidence intervals online even when the number of local iterations differs across clients. Because clients and local datasets inevitably differ, such heterogeneity in the number of local iterations is ubiquitous, which makes confidence interval estimation for federated heterogeneous optimization problems all the more important.
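To make the pipeline sketched in the abstract concrete, the toy example below simulates a federated linear-regression problem in which each client runs a different number of local SGD steps, client updates are rescaled before averaging, and a plug-in sandwich-type estimator yields per-coordinate confidence intervals. This is only an illustrative sketch under simple assumptions: the specific rescaling (dividing each client's displacement by its local step count), the plug-in construction, and names such as `local_sgd` are hypothetical and are not taken from the paper's algorithm.

```python
# Illustrative sketch only (not the paper's algorithm): a toy federated
# linear-regression simulation with heterogeneous local iteration counts,
# a rescaled averaging step, and a plug-in sandwich-type confidence interval.
import numpy as np

rng = np.random.default_rng(0)
d, n_clients, n_rounds = 5, 20, 500
theta_true = rng.normal(size=d)

# Heterogeneous local datasets and local iteration counts (assumed setup).
clients = []
for _ in range(n_clients):
    n_i = int(rng.integers(200, 500))
    X = rng.normal(size=(n_i, d))
    y = X @ theta_true + rng.normal(size=n_i)
    clients.append((X, y))
local_steps = rng.integers(1, 10, size=n_clients)  # K_i differs across clients


def local_sgd(theta, X, y, steps, lr):
    """Run `steps` single-sample SGD steps on the local squared loss."""
    for _ in range(steps):
        j = rng.integers(len(y))
        grad = (X[j] @ theta - y[j]) * X[j]
        theta = theta - lr * grad
    return theta


theta = np.zeros(d)
H_hat = np.zeros((d, d))  # running Hessian (curvature) estimate
S_hat = np.zeros((d, d))  # running gradient-covariance estimate
n_acc = 0                 # number of accumulated gradient samples

for t in range(1, n_rounds + 1):
    lr = 0.5 / t ** 0.6                                    # decaying step size
    active = np.flatnonzero(rng.random(n_clients) < 0.5)   # partial participation
    if active.size == 0:
        continue
    updates = []
    for i in active:
        X, y = clients[i]
        theta_i = local_sgd(theta, X, y, int(local_steps[i]), lr)
        # Rescale each client's displacement by its local step count so that
        # clients doing more local work do not dominate the average
        # (one plausible reading of "rescaled" averaging; an assumption here).
        updates.append((theta_i - theta) / local_steps[i])
        # Accumulate plug-in quantities from one fresh local sample.
        j = rng.integers(len(y))
        g = (X[j] @ theta - y[j]) * X[j]
        H_hat += np.outer(X[j], X[j])
        S_hat += np.outer(g, g)
        n_acc += 1
    theta = theta + np.mean(updates, axis=0)

# Plug-in sandwich covariance; fewer participating clients mean fewer accumulated
# samples, so the interval widens as the participation rate drops, mirroring the
# inverse-proportionality claim in the abstract.
H_inv = np.linalg.inv(H_hat / n_acc)
Sigma = H_inv @ (S_hat / n_acc) @ H_inv
half_width = 1.96 * np.sqrt(np.diag(Sigma) / n_acc)
for k in range(d):
    print(f"theta[{k}]: {theta[k]:+.3f}  95% CI ({theta[k] - half_width[k]:+.3f}, "
          f"{theta[k] + half_width[k]:+.3f})  true {theta_true[k]:+.3f}")
```

In this sketch the intervals can be updated after every communication round without storing raw data, which is the sense in which the construction is "online"; the exact rescaling and variance estimators used by the paper may differ.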

     

