Like the previous post about personalized models, another NIPS workshop paper discussed a related topic, but from a different perspective:
- Federated Optimization: Distributed Optimization Beyond the Datacenter by Jakub Konečný, Brendan McMahan, and Daniel Ramage
The authors introduce a new setting for the learning problem in which data is distributed across a very large number of devices, each having access to only a few data points. This is primarily motivated by the scenario where users keep their data on their own devices, but the goal is still to train a high-quality global model.
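To make the setting concrete, here is a minimal simulation sketch. It is not the algorithm proposed in the paper, only an illustration of the constraint that each client holds just a few points and only model updates (never raw data) reach the server. The least-squares model, the client counts, the learning rate, and the simple server-side averaging of client gradients are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative scale (assumed): many clients, each holding only a few points.
n_clients, points_per_client, dim = 100, 5, 10
true_w = rng.normal(size=dim)

# Each client's raw data stays on the "device" in this simulation.
clients = []
for _ in range(n_clients):
    X = rng.normal(size=(points_per_client, dim))
    y = X @ true_w + 0.1 * rng.normal(size=points_per_client)
    clients.append((X, y))

def local_gradient(w, X, y):
    """Least-squares gradient computed only from one client's local data."""
    return 2.0 * X.T @ (X @ w - y) / len(y)

w = np.zeros(dim)   # global model held by the server
lr = 0.1
for _ in range(50): # communication rounds
    # Clients compute updates locally; the server only sees the updates.
    grads = [local_gradient(w, X, y) for X, y in clients]
    w -= lr * np.mean(grads, axis=0)

print("distance to true weights:", np.linalg.norm(w - true_w))
```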
Although it looks very different from the paper described in the previous post, the two pieces can be linked on two points:
- It is important to train a high-quality local or personalized model by utilizing the global model, or vice versa.
- It is also very important to understand the interplay between the global model and the local model (a minimal sketch of one such interplay follows this list).
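One hypothetical way this interplay could look, reusing `local_gradient`, `w`, and `clients` from the sketch above: each device starts from the global weights and fine-tunes on its own few data points. Neither paper prescribes this exact scheme; the `personalize` helper, learning rate, and step count are assumptions for illustration.

```python
def personalize(global_w, X, y, lr=0.05, steps=20):
    """Fine-tune a copy of the global model on one device's own data."""
    w_local = global_w.copy()
    for _ in range(steps):
        w_local -= lr * local_gradient(w_local, X, y)
    return w_local

# Example: build a personalized model for the first simulated client,
# starting from the globally trained weights w from the sketch above.
X0, y0 = clients[0]
w_personal = personalize(w, X0, y0)
```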
These works raise interesting new directions, such as how to serve and update fully personalized models on mobile devices.