Context: Federated Learning (FL) has emerged as a promising, massively distributed way to train a joint deep model across numerous edge devices, preserving user data privacy by keeping the data on the device. In FL, hyperparameters (HPs) significantly affect the training overhead in terms of computation and transmission time, computation and transmission load, as well as model accuracy. This paper presents a novel approach in which Hyperparameter Optimization (HPO) is used to optimize the performance of the FL model for a Speech Emotion Recognition (SER) application. To solve this problem, both Single-Objective Optimization (SOO) and Multi-Objective Optimization (MOO) models are developed and evaluated. The optimization model includes two objectives: accuracy and total execution time. Numerical results show that optimal HP settings improve both the accuracy of the model and its computation time. The proposed method assists FL system designers in finding an optimal parameter setup, allowing them to carry out model design and development efficiently according to their goals.
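To make the two-objective setup concrete, the sketch below shows one common way MOO results over HP configurations can be compared: keeping the Pareto-optimal trade-offs between accuracy (maximized) and total execution time (minimized). This is an illustrative assumption about the selection step, not the paper's actual optimization algorithm; the HP names (`lr`, `local_epochs`) and the trial values are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Trial:
    """One evaluated HP configuration with its two objective values."""
    params: dict
    accuracy: float   # objective 1: maximize
    exec_time: float  # objective 2 (total execution time): minimize

def dominates(a: Trial, b: Trial) -> bool:
    # a dominates b if it is no worse on both objectives
    # and strictly better on at least one.
    return (a.accuracy >= b.accuracy and a.exec_time <= b.exec_time
            and (a.accuracy > b.accuracy or a.exec_time < b.exec_time))

def pareto_front(trials: list[Trial]) -> list[Trial]:
    # Keep only trials not dominated by any other trial.
    return [t for t in trials
            if not any(dominates(o, t) for o in trials if o is not t)]

# Hypothetical trials: fast-but-less-accurate, slow-but-accurate, and a
# configuration dominated by the second one.
trials = [
    Trial({"lr": 0.10, "local_epochs": 1}, accuracy=0.72, exec_time=120.0),
    Trial({"lr": 0.01, "local_epochs": 5}, accuracy=0.85, exec_time=300.0),
    Trial({"lr": 0.05, "local_epochs": 3}, accuracy=0.80, exec_time=350.0),
]
front = pareto_front(trials)  # the third trial is dominated and dropped
```

A designer favoring low training overhead would pick from the fast end of the front, while one favoring accuracy would pick from the other end, which is the kind of goal-dependent choice the proposed method is meant to support.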
This work was supported by EU ECSEL project DAIS, which has received funding from the ECSEL Joint Undertaking (JU) under grant agreement No. 101007273.