VGU RESEARCH REPOSITORY
Please use this identifier to cite or link to this item:
https://epub.vgu.edu.vn/handle/dlibvgu/1858

Title: Stochastic updating in federated learning
Authors: Tran Quang Minh
Keywords: Federated learning; Non-IID data; FL framework
Issue Date: 2024
Abstract: Federated learning (FL) offers a promising approach to training machine learning models collaboratively on decentralized data while preserving privacy [1]. However, the decentralized nature of data in FL presents challenges for traditional optimization algorithms, owing to non-IID data distributions and communication costs. This thesis investigates stochastic updating techniques for improving the efficiency of FL, focusing in particular on mitigating the impact of non-IID data and reducing communication overhead. We leverage NVFlare [2], a popular open-source FL framework, to implement and evaluate our proposed methods. The thesis first reviews existing research on stochastic gradient descent (SGD) variants for FL, techniques for handling non-IID data, and communication-efficient methods. We then introduce our proposed approach for stochastic updating in FL, which incorporates a novel mechanism combining dropout and weight averaging to achieve efficient optimization. The proposed method applies dropout during training to introduce randomness into the model updates, mitigating the impact of non-IID data. Additionally, it employs weight averaging to ensure convergence towards a global optimum.
URI: https://epub.vgu.edu.vn/handle/dlibvgu/1858
Rights: Attribution-NonCommercial 4.0 International
Appears in Collections: Computer Science (CS)
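The abstract describes two mechanisms: dropout applied during local training to randomize each client's update, and server-side weight averaging to drive convergence toward a global optimum. The sketch below is a minimal, assumption-laden illustration of that combination in plain PyTorch, not the thesis's NVFlare implementation; `ToyNet`, `local_update`, `average_weights`, and the synthetic two-client round are hypothetical stand-ins for the real model, data partition, and federated workflow.

```python
import copy
import torch
import torch.nn as nn

class ToyNet(nn.Module):
    """Small hypothetical model; nn.Dropout injects randomness into each local update."""
    def __init__(self, p: float = 0.5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(10, 32), nn.ReLU(), nn.Dropout(p), nn.Linear(32, 2)
        )

    def forward(self, x):
        return self.net(x)

def local_update(global_model, data, targets, lr=0.01, steps=5):
    """One client's stochastic local update, starting from the global weights."""
    model = copy.deepcopy(global_model)
    model.train()  # keeps dropout active, so each client's update is randomized
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(model(data), targets)
        loss.backward()
        opt.step()
    return model.state_dict()

def average_weights(client_states):
    """Server-side weight averaging (FedAvg-style, equal client weights)."""
    with torch.no_grad():
        return {
            key: torch.stack([state[key] for state in client_states]).mean(dim=0)
            for key in client_states[0]
        }

# One simulated round with two clients holding different (non-IID-like) data.
torch.manual_seed(0)
global_model = ToyNet()
clients = [(torch.randn(16, 10), torch.randint(0, 2, (16,))) for _ in range(2)]
states = [local_update(global_model, x, y) for x, y in clients]
global_model.load_state_dict(average_weights(states))
```

With dropout left active during local training, two clients starting from identical global weights produce decorrelated updates even on similar data; this is the randomization effect the abstract attributes to the method, and the averaging step then pools those updates into the new global model.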
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| Stochastic updating in federated learning.pdf | | 7.25 MB | Adobe PDF |
Page view(s): 74 (checked on Oct 16, 2025)
Download(s): 42 (checked on Oct 16, 2025)
This item is licensed under a Creative Commons License