Practical defences against model inversion attacks for split neural networks

Authors
Titcombe, Tom; Hall, Adam James; Papadopoulos, Pavlos; Romanini, Daniele

Abstract
We describe a threat model under which a split network-based federated learning system is susceptible to a model inversion attack by a malicious computational server. We demonstrate that the attack can be successfully performed with limited knowledge of the data distribution by the attacker. We propose a simple additive noise method to defend against model inversion, finding that the method can significantly reduce attack efficacy at an acceptable accuracy trade-off on MNIST. Furthermore, we show that NoPeekNN, an existing defensive method, protects different information from exposure, suggesting that a combined defence is necessary to fully protect private user data.
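The additive noise defence described above can be illustrated with a minimal sketch. This is not the authors' implementation: the choice of a Laplacian noise distribution and the scale value are illustrative assumptions. The idea is that the data owner perturbs the intermediate activations at the split before they leave the device for the computational server.

```python
import numpy as np

def add_noise(activations, scale=0.1, rng=None):
    """Add zero-mean Laplacian noise to intermediate activations
    at the split, before they are sent to the computational server.
    `scale` controls the trade-off between inversion-attack
    protection and model accuracy."""
    rng = rng if rng is not None else np.random.default_rng()
    noise = rng.laplace(loc=0.0, scale=scale, size=activations.shape)
    return activations + noise

# The data owner runs its model segment locally; only the noisy
# cut-layer output crosses the trust boundary:
activations = np.random.default_rng(0).normal(size=(4, 8))
noisy = add_noise(activations, scale=0.5)
```

A larger `scale` perturbs the activations more strongly, which (per the abstract) reduces attack efficacy at the cost of some downstream accuracy; tuning it is the practical knob of this defence.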
| Presentation Conference Type | Conference Paper (unpublished) |
|---|---|
| Conference Name | ICLR 2021 Workshop on Distributed and Private Machine Learning (DPML 2021) |
| Start Date | May 7, 2021 |
| Publication Date | Apr 21, 2021 |
| Deposit Date | Oct 31, 2022 |
| Publicly Available Date | Nov 1, 2022 |
| Public URL | http://researchrepository.napier.ac.uk/Output/2946016 |
| Publisher URL | https://dp-ml.github.io/2021-workshop-ICLR/ |