Design Controls in Medical Device Development: What They Are & How to Run Them


Design controls in medical device development are the planned and documented practices that manage design from requirements through to verification/validation and controlled changes. They create objective evidence, often organised in a Design History File (DHF), showing that the device was designed to meet user needs, intended use, and safety requirements.

Quick Summary

If you’re building a medical device, design controls are how you prove—step by step—that your requirements, risks, design decisions, and test evidence all line up. They’re not “extra paperwork”; they’re the structure that helps teams catch gaps early, avoid expensive rework, and walk into audits with confidence.

In practical terms, design controls define:

  • what you’re building and why (design inputs)
  • what you built (design outputs)
  • how you checked it (verification)
  • how you proved it’s fit for intended use (validation)
  • how you controlled changes over time (design change control)

Below is a clear, implementable overview of design controls in medical device development, the evidence you typically need (DHF/traceability), and how digital document control can make compliance easier to run day-to-day.

What are design controls in medical device design?

Design controls are the planned and documented practices used to manage medical device design from concept through transfer and controlled changes. 

While exact expectations vary by jurisdiction and device type, the common thread is objective evidence that you:

  • planned the work
  • captured requirements (including safety/risk considerations)
  • reviewed progress at appropriate points
  • verified and validated the design
  • controlled design changes

That evidence trail is what turns “we did good engineering” into “we can demonstrate it.”

Ready to take control? Download our guide to digital document control for medical device developers

Why design controls matter (beyond “because the regulator says so”)

Design controls protect patients and users by making safety and intended use explicit and testable. They also protect your project.

Without design controls, teams tend to:

  • build quickly toward a prototype, only to discover requirements gaps late
  • lose the rationale behind key decisions (like “why did we choose this sensor/material?”)
  • struggle to explain how risk controls were implemented and tested
  • fail to maintain traceability when requirements or suppliers change

With design controls, you get:

  • earlier detection of gaps (before expensive rework)
  • a consistent “definition of done” for each phase gate
  • faster onboarding (new engineers can follow the DHF/traceability)
  • audit readiness as a byproduct of doing the work, rather than a scramble later

This matters to quality leaders trying to avoid nonconformities and wasted audit cycles, and to engineering leaders who need speed without chaos (a common trigger for adopting better document control and governance).

What the regulations require

1. FDA: 21 CFR 820.30 (Design Controls)

21 CFR 820.30 outlines the major elements you’re expected to control and document, including planning, inputs, outputs, review, verification, validation, transfer, and changes. It also clarifies applicability: design controls apply to all Class II and Class III devices, but only to a short list of Class I devices (such as those automated with computer software); most Class I devices are exempt.

It is also important to note that the FDA has issued a final rule amending Part 820 (now the Quality Management System Regulation, or QMSR) to align it more closely with ISO 13485:2016, with an effective date of February 2, 2026.

Under the FDA’s QMSR, design and development requirements are implemented via ISO 13485:2016 clause 7.3 (Design and development) incorporated by reference, so you still need the same objective evidence (plans, reviews, V&V, changes), but it’s mapped to the ISO 13485 structure alongside FDA-specific provisions.

2. ISO 13485:2016 (design & development)

ISO 13485:2016 includes design and development requirements (commonly referenced as clause 7.3), and is clear about what is expected from a design process:

“At suitable stages, systematic reviews of design and development shall be performed in accordance with planned and documented arrangements to: 

  1. evaluate the ability of the results of design and development to meet requirements;
  2. identify and propose necessary actions.

Participants in such reviews shall include representatives of functions concerned with the design and development stage being reviewed, as well as other specialist personnel. Records of the results of the reviews and any necessary actions shall be maintained and include the identification of the design under review, the participants involved and the date of the review.”

Even if you’re “FDA-first,” aligning your design control procedures to ISO 13485 makes multi-market scaling easier, especially if you’re aiming for MDSAP or EU market access.

3. EU MDR / IVDR (technical documentation linkage)

In the EU, your technical documentation must be clear, organised, readily searchable, and include elements described in MDR Annexes II and III. For In Vitro Diagnostics, Regulation (EU) 2017/746 (IVDR) similarly requires technical documentation aligned to its annex structure.

In practice, your design control outputs (requirements, risk files, V&V evidence, change history) should be easy to reference when compiling (and maintaining) the Technical File.

The design control process: A step-by-step guide (with deliverables)

Step 1: Design and development planning

The first step is to define how you’ll run the project and what “done” means at each stage.

Typical outputs include:

  • Design and development plan (phases, responsibilities, deliverables, review cadence)
  • Definition of design inputs and how they’re approved/changed
  • Tooling plan (where records live; how approvals/e-signatures work; retention)

Good practices to keep in mind at this early stage are to use phase gates that align with real risk and complexity (not bureaucracy for its own sake), and to define the evidence required to pass each gate (e.g., updated traceability, risk review, verification plan).
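One way to keep gate evidence explicit is to express the plan itself as structured data from which a review checklist can be generated. Here is a minimal sketch in Python; the gate names and evidence items are hypothetical examples, not a prescribed set:

```python
# Minimal sketch: a design & development plan expressed as data, so each gate
# has an explicit evidence checklist. Gate names and items are illustrative.
from dataclasses import dataclass

@dataclass
class PhaseGate:
    name: str
    required_evidence: list

design_plan = [
    PhaseGate("Inputs approved", [
        "User needs / URS approved",
        "Design inputs baselined",
        "Risk management plan approved",
    ]),
    PhaseGate("Verification complete", [
        "Verification reports approved and traced to inputs",
        "Open nonconformances dispositioned",
        "Risk file reviewed and updated",
    ]),
]

# Render a simple checklist for each design review agenda.
for gate in design_plan:
    print(f"Gate: {gate.name}")
    for item in gate.required_evidence:
        print(f"  [ ] {item}")
```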

Step 2: Design inputs (including usability and risk)

Here, you capture what the device must do, including safety and regulatory constraints, expressed as requirements you can objectively test.

Input sources might include:

  • user needs, intended use, use environment
  • regulatory requirements and applicable standards
  • risk management outputs (hazards, hazardous situations, risk controls)
  • usability/human factors needs (use-related hazards)
  • software requirements constraints (if applicable)

Deliverables at this stage include:

  • user needs/User Requirement Specifications (or equivalent)
  • system/product requirements (often referred to as “design inputs”)
  • a risk management plan and an early hazard analysis (even if it evolves)
  • a usability engineering plan (where applicable)
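To make “objectively testable” concrete, here is a minimal sketch of what a single design input record might hold, written as a Python data class. The field names and IDs (DI-042, UN-007, RC-013) are hypothetical, not a required format:

```python
# Hypothetical design input record: a testable statement plus its links to
# user needs, risk controls and planned verification. Values are invented.
from dataclasses import dataclass

@dataclass
class DesignInput:
    req_id: str               # unique requirement identifier
    statement: str            # testable "shall" statement with quantified limits
    source: str               # user need, standard or risk output it derives from
    risk_controls: tuple      # linked risk control IDs, if any
    verification_method: str  # how it will be verified
    acceptance_criteria: str  # what a pass looks like

di_042 = DesignInput(
    req_id="DI-042",
    statement="The device shall deliver 0.5 mL +/- 0.05 mL per actuation "
              "across the labelled operating temperature range.",
    source="UN-007 (accurate dosing)",
    risk_controls=("RC-013",),
    verification_method="Bench dose-accuracy test per protocol VER-DOSE-01",
    acceptance_criteria="All 30 samples within +/- 0.05 mL; zero failures",
)
print(di_042.req_id, "->", di_042.verification_method)
```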

Step 3: Design outputs

The goal at this stage is to produce specifications and designs that implement the inputs defined in Step 2. It’s essentially putting down on paper what you’re building.

Examples of outputs might look like:

  • engineering drawings, CAD, BOM, material specs
  • architecture/design specs
  • software requirements specs and design (for SaMD/SiMD)
  • labelling and IFU drafts (as appropriate)
  • test methods and acceptance criteria

A good rule of thumb here: if you can’t verify it, it’s not specific enough. If a requirement doesn’t tell you what test you’d run and what a pass or fail looks like, the output is either (a) too vague, or (b) still a user need that must be translated into design inputs.
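A quick before/after illustration of that rule of thumb, with invented wording, limits, and test IDs:

```python
# Hypothetical before/after for the "if you can't verify it, it's not specific
# enough" rule. Requirement wording, limits and test IDs are invented.

# Too vague: no test you could run, no pass/fail threshold.
too_vague = "The housing shall be durable."

# Verifiable: names conditions, a quantified limit and a test method.
verifiable = (
    "The housing shall show no cracks or loss of function after a 1 m drop "
    "onto concrete on each of 6 faces (test method VER-DROP-02, n=5 units)."
)

print(too_vague)
print(verifiable)
```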

Step 4: Design review discussions (phase gates)

Design reviews are about systematically evaluating whether the design can meet requirements and agreeing on actions.

Start with a meeting agenda and attendee list, and ensure cross-functional representation where necessary. Reviews should capture design decisions, action items, owners, due dates, and any updates to risk and traceability.

To make reviews useful, tie each one to the traceability matrix and ask (see the sketch after this list):

  • What changed?
  • What risks moved?
  • What tests must be updated?
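If the matrix is maintained as data rather than as a static spreadsheet, those three questions can be answered mechanically at each review. A minimal, hypothetical sketch (the IDs and links are invented for illustration):

```python
# Hypothetical review prep: list the risks and tests linked to anything that
# changed since the last review. IDs and links are illustrative only.
trace = {
    # requirement ID: linked risk controls and verification tests
    "DI-042": {"risks": ["RC-013"], "tests": ["VER-DOSE-01"]},
    "DI-051": {"risks": [],         "tests": ["VER-DROP-02"]},
}

changed_since_last_review = ["DI-042"]  # pulled from your change log in practice

for req in changed_since_last_review:
    links = trace.get(req, {"risks": [], "tests": []})
    print(f"{req} changed:")
    print(f"  risks to re-evaluate : {links['risks'] or 'none linked'}")
    print(f"  tests to update/rerun: {links['tests'] or 'none linked'}")
```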

Step 5: Design verification (outputs meet inputs)

Design verification confirms that design outputs satisfy design inputs.

Deliverables at this phase include:

  • verification plan/protocols
  • test reports and objective evidence
  • nonconformances/deviations and resolutions (with impact assessment)
  • updated traceability (input → verification evidence)
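That last item is straightforward to check automatically if traceability is kept as data. Below is a minimal, hypothetical coverage check (the IDs are invented); the same pattern applies in Step 6 for user needs → validation evidence:

```python
# Hypothetical coverage check: every design input should map to at least one
# approved verification report. In practice this data would come from your
# requirements tool or traceability matrix.
design_inputs = ["DI-042", "DI-051", "DI-060"]
verification_evidence = {
    "DI-042": ["VER-DOSE-01-R1"],
    "DI-051": ["VER-DROP-02-R1"],
    # DI-060 has no approved report yet
}

uncovered = [di for di in design_inputs if not verification_evidence.get(di)]
print("Inputs without verification evidence:", uncovered or "none")
```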

Step 6: Design validation

The design validation step answers the question, “Does the finished device meet user needs and intended use in expected conditions?”

This is confirmed through:

  • validation plan/protocols (including usability validation where required)
  • clinical/performance evaluation evidence as appropriate to your device and claims
  • validation reports and conclusions
  • updated traceability (user needs → validation evidence)

Step 7: Design transfer (Device Master Record readiness)

Design transfer moves the device from “designed” to “buildable”. It isn’t just a document handover; it’s proving the design is manufacturable and repeatable. This involves releasing production specs, qualifying critical suppliers, setting incoming inspection criteria, and defining manufacturing and inspection processes.

Key design transfer deliverables include:

  • manufacturing specs and acceptance criteria
  • process instructions and inspection/test methods
  • released labelling and packaging specs
  • Device Master Record (DMR) readiness (with jurisdiction-dependent terminology)

Where processes can’t be fully verified by later inspection (often called “special processes”), you’ll need process validation and documented acceptance criteria.

Step 8: Design changes (change control)

Design changes must be controlled so that you don’t invalidate prior evidence. Effective change control captures the change request, rationale, and impacted documents/components, along with an impact assessment covering requirements, risks, suppliers, and other relevant details. Re-verification and re-validation decisions and results must be tracked too, together with an updated DHF and traceability.

Regulatory impact matters here. For each change, assess whether it affects intended use, performance/safety claims, risk controls, or clinical/usability evidence and whether it triggers a new submission/notification, in addition to re-verification/re-validation.
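As an illustration of what a structured change record and impact assessment can look like when captured as data, here is a hypothetical sketch; the change ID, categories and values are invented, not a prescribed format:

```python
# Hypothetical design change record with a structured impact assessment.
# Field names, categories and values are illustrative only.
change_request = {
    "id": "DCR-0098",
    "description": "Replace pressure sensor X with sensor Y (obsolescence)",
    "impact_assessment": {
        "requirements_affected": ["DI-042"],
        "risk_file_update_needed": True,
        "suppliers_affected": ["Sensor Y supplier"],
        "reverification_required": ["VER-DOSE-01"],
        "revalidation_required": [],
        "regulatory_assessment": (
            "No change to intended use or performance claims; "
            "rationale documented, no new submission expected"
        ),
    },
    "approvals": [],  # routed via document control / e-signature workflow
}

print(change_request["id"], "re-verify:",
      change_request["impact_assessment"]["reverification_required"])
```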

The design controls documentation set (DHF, DMR, traceability matrix)

DHF (Design History File)

Your DHF is the “story of evidence” showing the design was developed according to your procedures. The FDA’s 21 CFR 820.30 explicitly links design controls to DHF expectations.

Here’s an example of a practical DHF structure:

  • Plans: design plan, risk plan, usability plan, software plan (if applicable)
  • Inputs: user needs, design inputs/requirements, standards/regulatory requirements mapping
  • Outputs: specs, drawings, architecture, software requirements/design
  • Reviews: minutes and action-closure evidence
  • Verification & Validation (V&V): plans, protocols, reports
  • Transfer: manufacturing readiness evidence
  • Changes: controlled change history and impact assessments
  • Traceability: a living matrix tying it all together

DMR (Device Master Record)

This is your “manufacturing recipe”, which includes the specs and procedures needed to build and inspect the device consistently. Design transfer serves as the bridge between DHF outputs and DMR content.

Traceability matrix (bidirectional, living)

Traceability is where many teams fall short, not because they don’t “have a matrix,” but because it stops being maintained once tests start.

Minimum viable traceability links could look like:

  • User need → Design input → Design output → Verification test
  • Risk control → Requirement/Output → Verification evidence
  • Use-related hazard → Usability requirement → Validation evidence
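If the matrix is kept as structured data rather than a one-off spreadsheet, gap checks in both directions become easy to automate. A minimal, hypothetical Python sketch (all IDs, and the idea of a separate test register, are invented for illustration):

```python
# Hypothetical "minimum viable" traceability matrix with a gap check in each
# direction. All IDs are illustrative.
matrix = [
    # user need -> design input -> design output -> verification test
    {"need": "UN-007", "input": "DI-042", "output": "DO-110", "test": "VER-DOSE-01"},
    {"need": "UN-008", "input": "DI-051", "output": "DO-118", "test": None},
]

# Forward check: every chain should end in verification evidence.
for row in matrix:
    if not row["test"]:
        print(f"Gap: {row['input']} ({row['need']}) has no linked verification test")

# Backward check: every test in the register should trace back to an input.
test_register = {"VER-DOSE-01", "VER-LEGACY-99"}
linked_tests = {row["test"] for row in matrix if row["test"]}
for orphan in sorted(test_register - linked_tests):
    print(f"Gap: {orphan} does not trace back to any design input")
```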

How an eQMS/document control system supports design controls

Design controls succeed or fail on one thing: whether evidence is created and stays connected as the project evolves. The good news is that there are many solutions available on the market designed to help you administer, track and log the kind of iterative design process outlined above. 

Digital tools can help you gather and group documents around a particular design phase, then seek feedback, changes, and approval from specified stakeholders. They support a digital document control process, ensuring that iterations of design documents aren’t released, or further work started, until every team has agreed on their content and direction.

What quality and engineering leaders typically need from an eQMS

Quality and engineering teams generally need an eQMS that makes the “right way” the easiest way. That starts with controlled templates and configured workflows, ensuring that design control records are consistently created and routed to the right approvers without manual chasing.

Just as important is secure approval handling, often including electronic signatures and audit trails, so teams can demonstrate who approved what, when, and under which version, especially where electronic records/signatures compliance applies. 

They also require robust traceability that remains intact when documents are revised: links between requirements, risks, design outputs, and test evidence should remain consistent across versions. 

Finally, strong change control is critical. Changes should trigger structured impact assessments across requirements, risk files, verification/validation, and suppliers, and the system should make it easy to find the current approved version of any document quickly. 

Conclusion

For medical device developers with a strong vision for a product and a passion for their solution, there is a real temptation to ‘get stuck in’ and start building straight away. However, the regulatory environment demands a more measured approach: establishing a systematic design process first mitigates the risk of failure and of dangerous errors later.

When this approach is followed, the right elements are put in place, and the right digital tools are implemented, “inspection readiness” becomes a natural byproduct of day-to-day work rather than a last-minute documentation scramble.

FAQs

1. What’s the difference between design verification and design validation?

Verification asks: “Did we build it right?”; i.e., do the design outputs meet design inputs (specs, requirements). Validation asks: “Did we build the right thing?”; i.e., does the finished device meet user needs and intended use under expected conditions. A common mistake is treating validation like just another bench test; it often includes usability and real-world use scenarios.

2. What goes in a Design History File (DHF)?

A DHF is the organised record of how the design was developed, outlining the plan, inputs/requirements, outputs/specifications, review minutes and actions, verification and validation evidence, and records of design changes. Under FDA design controls, the DHF is where an auditor looks to confirm you followed your procedures and can trace decisions to evidence.

3. DHF vs DMR: how are they different?

Think “history” vs “recipe.” DHF explains how the design evolved and was validated (through reviews, V&V, and changes). The Device Master Record (DMR) is what manufacturing needs to consistently build the device: drawings, specs, procedures, acceptance criteria, labelling, etc. Design transfer is where DHF outputs become DMR-ready content.

4. What are common design control audit findings?

Typical findings include: 

  • unclear or incomplete design inputs
  • missing or ineffective design reviews (no actions/closure)
  • weak traceability (can’t link requirements to tests)
  • verification that doesn’t map to inputs
  • validation that doesn’t reflect intended use/users
  • uncontrolled design changes (updates without impact assessment or re-test)

These are typically process issues exacerbated by fragmented documentation.

Last updated on 3rd February 2026

Tags: Medical Device Development


Written by Joe Byrne

Joe Byrne is the CEO of Cognidox. With a career spanning medical device start-ups and fortune 500 companies, Joe has over 25 years of experience in the medical device and high-tech product development industries. With extensive experience in scaling businesses, process improvement, quality, medical devices and product development, Joe is a regular contributor to the Cognidox DMS Insights blog where he shares expertise on scaling and streamlining the entire product development cycle, empowering enterprises to achieve governance, compliance, and rigour.
