Federated Learning is a machine learning approach where multiple devices or servers collaboratively train a shared model without exchanging their local data. Instead, each participant processes data locally and only shares model updates, such as gradients or parameters, with a central server. This method enhances data privacy and security, reduces data transfer needs, and allows organizations to leverage distributed data sources for improved model performance while maintaining user confidentiality.
What is Federated Learning?
A machine learning approach where multiple devices or servers train a shared model locally and only send model updates to a central server, not the raw data.
Why is Federated Learning considered privacy-preserving?
Data stays on devices; only processed updates (gradients or parameters) are shared, reducing exposure of sensitive information.
How is the global model updated in Federated Learning?
Each participant trains on local data and sends updates to a central server, which aggregates them (typically by weighted averaging, as in the FedAvg algorithm) to produce the global model for the next round.
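The train-locally-then-aggregate loop can be sketched in a few lines. This is a minimal illustration, not a production implementation: the model is a single-parameter linear regressor, and the names (`local_update`, `fedavg_round`) are made up for this example. It shows weighted averaging of client weights, as in FedAvg, where each client's contribution is weighted by its dataset size.

```python
# Illustrative sketch of Federated Averaging (FedAvg); names are hypothetical.

def local_update(weights, data, lr=0.1, epochs=5):
    """Train a 1-parameter linear model y = w*x on local data via gradient descent."""
    w = weights
    for _ in range(epochs):
        # Gradient of mean squared error over this client's local data only.
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def fedavg_round(global_w, client_datasets):
    """Server side: collect each client's trained weights (never its data)
    and average them, weighted by local dataset size."""
    updates = [(local_update(global_w, d), len(d)) for d in client_datasets]
    total = sum(n for _, n in updates)
    return sum(w * n for w, n in updates) / total

# Two clients whose local data both follow y = 3x; raw points stay local.
clients = [[(1.0, 3.0), (2.0, 6.0)], [(3.0, 9.0)]]
w = 0.0
for _ in range(20):
    w = fedavg_round(w, clients)
print(round(w, 2))  # converges toward 3.0
```

In a real deployment the "weights" are full parameter tensors and communication happens over the network, but the round structure (broadcast global model, train locally, aggregate) is the same.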
Where is Federated Learning commonly used and what are its challenges?
Used in mobile apps, edge computing, and organizations with data silos. Challenges include communication costs, device heterogeneity, and potential privacy leakage from updates.