eration Platform (AIMDICP), a subdivision of CMDE, actively encourages the creation of evaluation databases and test platforms, starting with highly prevalent diseases such as lung cancer and diabetic retinopathy. As the world is a long way from having databases covering all 55,000 diseases and conditions listed in the International Classification of Diseases, 11th Revision (ICD-11), published by the World Health Organization (WHO), this Chinese initiative is a welcome start.
US AI Legislation
In the US, FDA published a discussion paper that focuses on machine learning devices that continue to learn and change after deployment, citing the Precertification (Pre-Cert) Program63 as a possible regulatory pathway for AI. The Pre-Cert Program is intended to be a regulatory model that is more streamlined and efficient, getting products to market and to patients faster than the existing 510(k), De Novo, and premarket approval pathways.
The Pre-Cert Program focuses on the product developer rather than primarily on the product itself. If the developer can demonstrate a culture of quality, excellence, and responsiveness, FDA believes that a streamlined approval process could be allowed. The shift from a pure product focus to a combined product and process viewpoint is a new pathway for FDA and a step towards convergence with the quality management system approach used within the EU. At the time of writing, FDA is piloting the Pre-Cert Program as a regulatory sandbox.
The Role of AI Standards
Generally, legislation provides high-level requirements with which a product must comply, while a standard describes how a product can comply. Consequently, standards are more prescriptive than legislation.
Because standards are generally voluntary, they are an ideal vehicle for experimenting with regulatory sandboxes. As countries develop AI legislation, they generally try to avoid being too prescriptive so as not to stifle innovation. On the other hand, they do not want to set the bar too low, which could lead to unsafe uses of AI and to inadequate user trust, resulting in lower uptake of AI products and harm to a country's competitive position. Finding the sweet spot between not being too prescriptive
Figure 11-11. Overview of NMPA Regulation and Standardization Applicable to AI-Enabled Medical Devices
[Figure: NMPA regulatory guidelines (medical software review, medical device cybersecurity, mobile medical devices, AI medical device review, key review points on deep learning clinical decision support and CT pneumonia) alongside supporting standards (AI medical device quality requirements and evaluation for terminology, data sets, and data annotation; performance evaluation methods for CT lung nodule, CT lung image CAD, and fundus image analysis software; deep learning algorithm development quality management), each marked as in effect, draft, or proposal.]