EncCluster: Scalable functional encryption in federated learning through weight clustering and probabilistic filters
Eindhoven University of Technology, Netherlands.
RISE Research Institutes of Sweden, Digital Systems, Industrial Systems; Mälardalen University, Sweden. ORCID iD: 0000-0003-4725-0595
Mälardalen University, Sweden. ORCID iD: 0000-0002-4473-7763
Eindhoven University of Technology, Netherlands.
2025 (English). In: Pervasive and Mobile Computing, ISSN 1574-1192, E-ISSN 1873-1589, Vol. 108, article id 102021. Article in journal (Refereed). Published.
Abstract [en]

Federated Learning (FL) enables model training across decentralized devices by communicating only local model updates to an aggregation server. Although such limited data sharing makes FL more secure than centralized approaches, FL remains vulnerable to inference attacks during model update transmissions. Existing secure aggregation approaches rely on differential privacy or cryptographic schemes such as Functional Encryption (FE) to safeguard individual client data. However, such strategies can reduce performance or impose unacceptable computational and communication overheads on clients running on resource-limited edge devices. In this work, we present EncCluster, a novel method that integrates model compression through weight clustering with recent decentralized FE and privacy-enhancing data encoding using probabilistic filters, delivering strong privacy guarantees in FL without degrading model performance or unduly burdening clients. We performed a comprehensive evaluation, spanning various datasets and architectures, to demonstrate EncCluster's scalability across encryption levels. Our findings reveal that EncCluster significantly reduces communication costs (below even conventional FedAvg) and accelerates encryption more than fourfold over all baselines, while maintaining high model accuracy and enhanced privacy assurances.
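For readers unfamiliar with the two building blocks the abstract names, a minimal sketch may help. The snippet below is an illustrative assumption, not the authors' implementation: it quantizes a weight array with 1-D k-means (weight clustering, so only a small centroid table plus per-weight cluster indices need to be sent) and records the resulting (position, cluster) pairs in a tiny Bloom filter, the classic example of a probabilistic filter. All names and parameters (`cluster_weights`, `BloomFilter`, `size`, `n_hashes`) are hypothetical.

```python
import hashlib
import numpy as np


def cluster_weights(weights, n_clusters=4, n_iters=20, seed=0):
    """Quantize weights to n_clusters shared values via 1-D k-means.

    Returns (assignments, centroids): each weight is replaced by the
    index of its nearest centroid, a much more compact representation
    than the full-precision weights themselves.
    """
    rng = np.random.default_rng(seed)
    flat = weights.ravel()
    centroids = rng.choice(flat, size=n_clusters, replace=False)
    for _ in range(n_iters):
        # Assign every weight to its nearest centroid.
        assign = np.abs(flat[:, None] - centroids[None, :]).argmin(axis=1)
        # Move each centroid to the mean of its assigned weights.
        for k in range(n_clusters):
            members = flat[assign == k]
            if members.size:
                centroids[k] = members.mean()
    return assign.reshape(weights.shape), centroids


class BloomFilter:
    """Tiny Bloom filter: set membership with possible false positives,
    but never false negatives."""

    def __init__(self, size=512, n_hashes=3):
        self.size, self.n_hashes, self.bits = size, n_hashes, 0

    def _positions(self, item):
        # Derive n_hashes bit positions from salted SHA-256 digests.
        for i in range(self.n_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.size

    def add(self, item):
        for p in self._positions(item):
            self.bits |= 1 << p

    def __contains__(self, item):
        return all(self.bits >> p & 1 for p in self._positions(item))


# Cluster a toy weight matrix, then encode the (flat position, cluster id)
# mapping into the filter instead of transmitting it in the clear.
w = np.array([[0.11, 0.09], [0.52, 0.48]])
assign, centroids = cluster_weights(w, n_clusters=2)
bf = BloomFilter()
for pos, cid in enumerate(assign.ravel()):
    bf.add((pos, int(cid)))

w_hat = centroids[assign]  # lossy reconstruction from the compact form
```

With two clusters the toy matrix collapses to the centroids near 0.10 and 0.50, so the reconstruction error stays small while the payload shrinks to a 2-entry table, 2-bit indices, and a fixed-size bit array.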

Place, publisher, year, edition, pages
Elsevier B.V., 2025. Vol. 108, article id 102021
Keywords [en]
Clusterings; Data Sharing; Decentralised; Functional encryptions; Limited data; Local model; Model training; Model updates; Probabilistic filters; Weight clustering; Federated learning
National Category
Computer and Information Sciences
Identifiers
URN: urn:nbn:se:ri:diva-78392
DOI: 10.1016/j.pmcj.2025.102021
Scopus ID: 2-s2.0-85218642441
OAI: oai:DiVA.org:ri-78392
DiVA, id: diva2:1999224
Note

This work has been supported by the H2020 ECSEL EU project Distributed Artificial Intelligent System (DAIS), which received funding from the ECSEL JU under grant agreement No. 101007273, and by the Knowledge Foundation within the framework of INDTECH (Grant No. 20200132) and the INDTECH+ Research School project (Grant No. 20220132).

Available from: 2025-09-19. Created: 2025-09-19. Last updated: 2025-09-23. Bibliographically approved.

Open Access in DiVA

fulltext (1588 kB), 43 downloads
File information
File name: FULLTEXT01.pdf
File size: 1588 kB
Checksum (SHA-512): 054a36151933006f29deea79c67248201caca6ef1f0f23655f46c88b70b8de4a973f8d607e5b9a1cb0a27c66b121e3510119489b160e97f460cb304a1bac569e
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text
Scopus

Authority records

Mohammadi, Samaneh; Balador, Ali

Search in DiVA

By author/editor
Mohammadi, Samaneh; Balador, Ali
By organisation
Industrial Systems
In the same journal
Pervasive and Mobile Computing
Computer and Information Sciences

Search outside of DiVA

Google
Google Scholar
Total: 43 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.
