Software as a Medical Device: Regulatory and Market Access Implications
into clinical use through one or more copies. Each copy has a model identical to that of the original device. The manufacturer or health institution may continue to train the global model. Through periodic or real-time updates, the manufacturer or health institution synchronizes the local and global models. Updates can comprise changes to the settings or the design of the device. The local models are identical to a version of the global model.
The synchronization can occur in real time or with a delay, depending on the need for validation and conformity assessment. Conversely, during local change, the local models learn and change independently of the global model (see Figure 11-2).
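To make the distinction concrete, the following minimal Python sketch contrasts a global change, where each local copy synchronizes to a version of the server model, with a local change, where a copy trains on its own data and diverges. All class and method names here are illustrative, not taken from any standard or library.

```python
class ModelServer:
    """Holds the global model: a version label plus its parameters."""
    def __init__(self, version, params):
        self.version = version
        self.params = dict(params)

class LocalDevice:
    """A deployed copy of the device, holding a local model."""
    def __init__(self, server):
        self.sync(server)  # deployment: local copy identical to the global model

    def sync(self, server):
        # Global change: the local model becomes identical to a
        # version of the global model (real-time or delayed update).
        self.version = server.version
        self.params = dict(server.params)

    def train_locally(self, gradient, lr=0.1):
        # Local change: the local model learns from its own data,
        # independently of the global model.
        for name, g in gradient.items():
            self.params[name] -= lr * g

server = ModelServer("v1.0", {"w": 0.5, "b": 0.0})
device_a = LocalDevice(server)
device_b = LocalDevice(server)

device_a.train_locally({"w": 0.2, "b": -0.1})     # device a diverges locally
assert device_a.params != device_b.params

server.version, server.params["w"] = "v1.1", 0.6  # global model retrained
device_a.sync(server)                             # synchronization
assert device_a.params == server.params
```

Whether `sync` may run in real time or must wait depends, as noted above, on the validation and conformity assessment the update requires.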
Global change occurs irrespective of whether the global model learns from local data or from data collected separately, e.g., through data repositories (e.g., registries or decentralized personal data stores) or data generated through clinical investigations and postmarket clinical follow-up (PMCF).
Federated Learning
Federated learning (see Figure 11-3) is a machine learning technique that trains an algorithm across multiple decentralized edge devices or servers holding local data samples, e.g., in health institutions, without exchanging those samples with the developer. This approach contrasts with traditional centralized machine learning, where all the local datasets are uploaded to one server, and with more classical decentralized methods, which often assume that local data samples are identically distributed. The main advantage of federated approaches is that they help ensure data privacy or data secrecy: no local data is uploaded externally, concatenated, or exchanged. Because the overall dataset remains segmented into local parts, it is also harder to compromise as a whole. With federated learning, only machine learning parameters are exchanged, and those parameters can be encrypted before sharing between learning rounds to strengthen privacy. Homomorphic encryption schemes can even be used to perform computations directly on the encrypted data without decrypting it beforehand.
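As an illustrative sketch of the core idea, the plain-Python example below implements federated averaging with a toy one-parameter linear model. The function names and the three-hospital setup are this example's own assumptions, not an API of any federated learning framework: each site trains on data that never leaves it, and only the learned parameter is sent back and averaged.

```python
def local_train(w, data, lr=0.01, epochs=5):
    # Toy one-parameter model y = w * x, trained by gradient descent
    # on data that stays at the local site.
    for _ in range(epochs):
        for x, y in data:
            w -= lr * 2 * (w * x - y) * x
    return w

def federated_average(weights, sizes):
    # Server-side aggregation: weighted mean of the locally trained
    # parameters; only parameters, never data, reach the server.
    total = sum(sizes)
    return sum(w * n for w, n in zip(weights, sizes)) / total

# Three hospitals whose local data all follow y = 2x; the raw samples
# are never pooled or exchanged.
sites = [
    [(x, 2.0 * x) for x in (1.0, 2.0, 3.0)],
    [(x, 2.0 * x) for x in (0.5, 1.5)],
    [(x, 2.0 * x) for x in (2.0, 4.0)],
]

global_w = 0.0
for _ in range(10):  # learning rounds: broadcast, train locally, aggregate
    local_ws = [local_train(global_w, d) for d in sites]
    global_w = federated_average(local_ws, [len(d) for d in sites])

# global_w now approximates the true coefficient 2.0
```

In a privacy-hardened deployment, the exchanged parameters in `local_ws` would additionally be encrypted or masked before aggregation, as described above.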
[Figure 11-2. Global and Local Model Change. Global change: the local models (a, b, c) change in sync with the model on the server. Local change: the server model is deployed locally, then the local models are trained with their own data. © Koen Cobbaert 2021]