Federated Learning (FL) is an algorithmic approach that enables the training of ML models without moving large amounts of data to a central server. It works by sending copies of the model to the locations where the data resides and performing learning at the edge. The updated model weights are then sent back to the central server, which aggregates them to improve the global model. FL was first applied to improve the word recommendation feature of Google's Android keyboard without uploading user data to the cloud. OpenMined uses the PySyft framework and the PyGrid peer-to-peer platform to enable FL on devices with limited computing power.
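To make the broadcast-train-aggregate cycle concrete, here is a minimal sketch of a few FedAvg-style rounds in plain NumPy. Everything in it is a hypothetical stand-in for illustration: the linear model, the synthetic client data, and the helper names local_train and fedavg are assumptions, not the API of PySyft or any other framework.

# Minimal federated-averaging sketch: raw data never leaves the clients;
# only model weights travel between the clients and the server.
import numpy as np

def local_train(weights, X, y, lr=0.1, epochs=5):
    # Client-side step: train a local copy of the global model on local data.
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # MSE gradient for a linear model
        w -= lr * grad
    return w

def fedavg(client_weights, client_sizes):
    # Server-side step: average client updates, weighted by local dataset size.
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Hypothetical setup: three clients with private datasets of unequal size.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for n in (40, 60, 100):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    clients.append((X, y))

global_w = np.zeros(2)  # initial global model held by the server
for _ in range(10):     # each round: broadcast, local training, aggregation
    updates = [local_train(global_w, X, y) for X, y in clients]
    global_w = fedavg(updates, [len(y) for _, y in clients])

print(global_w)  # approaches true_w without pooling any raw data centrally

A production system would add pieces this sketch omits, such as client sampling, secure aggregation, and update compression, but the round structure is the same.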
Top Highlights
Initially proposed in 2015, federated learning is an algorithmic solution that trains ML models by sending copies of a model to the places where data resides and performing training at the edge, eliminating the need to move large amounts of data to a central server.
The data remains on its source devices, known as the clients, each of which receives a copy of the global model from the central server. Each device trains this copy locally on its own data; local training updates the model weights, and the updated copy is then sent back to the central server.
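The server's aggregation step is typically a sample-size-weighted average of the returned client weights (the FedAvg rule of McMahan et al.); in LaTeX notation:

    w_{t+1} = \sum_{k=1}^{K} \frac{n_k}{n} \, w_{t+1}^{k}, \qquad n = \sum_{k=1}^{K} n_k

where w_{t+1}^{k} is the locally trained weight vector from client k and n_k is the number of training samples held by that client, so clients with more data contribute proportionally more to the new global model.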
One of the first applications of FL was to improve word recommendation in Google's Android keyboard without uploading the data, i.e. a user’s text, to the cloud.
The cost of implementing federated learning is higher than that of collecting the data and processing it centrally, especially during the early phases of R&D, when the training method and process are still being iterated on.
Federated learning requires data owners to perform computations on the device that holds the data; for devices with limited computing capacity, this may not be possible or economical.