
Computer System Validation: A Closer Look at 21 C.F.R. §820.70(i) and FDA Warning Letters

SPK and Associates assists many of our clients with Computer System Validation services. FDA investigators often cite issues with respect to “intended use” and other aspects of 21 C.F.R. §820.70(i). In this article we take a closer look at the specifics of this rule and ways to help your company avoid 483 observations and warning letters.

Title 21 of the Code of Federal Regulations, Part 820 (Quality System Regulation), section 820.70(i) states the following:

“When computers or automated data processing systems are used as part of production or the quality system, the manufacturer shall validate computer software for its intended use according to an established protocol. All software changes shall be validated before approval and issuance. These validation activities and results shall be documented.”

FDA provides additional supporting information in Section 6, “Validation of Automated Process Equipment and Quality System Software,” of General Principles of Software Validation; Final Guidance for Industry and FDA Staff, January 11, 2002.

We begin unraveling 21 C.F.R. §820.70(i) by first figuring out “what needs to be validated.” This means creating a company-wide inventory of all your software tools and assessing whether each performs quality-critical or business-critical functions. The FDA recommends using a risk-based assessment process to determine whether validation is necessary, considering each system’s impact on product quality, safety, and records integrity.
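The regulation does not prescribe how to record this assessment, but a structured inventory makes the determination auditable. Below is a minimal illustrative sketch in Python; the system names, impact scale, and threshold are hypothetical assumptions, not FDA requirements.

```python
from dataclasses import dataclass

@dataclass
class SystemRecord:
    """One entry in the company-wide software/tool inventory (illustrative)."""
    name: str
    used_in_production_or_quality: bool
    product_quality_impact: int    # 1 (low) .. 3 (high), hypothetical scale
    safety_impact: int             # 1 (low) .. 3 (high)
    records_integrity_impact: int  # 1 (low) .. 3 (high)

    def requires_validation(self) -> bool:
        """Risk-based determination: validate if the system is part of
        production or the quality system and any impact is non-trivial."""
        if not self.used_in_production_or_quality:
            return False
        return max(self.product_quality_impact,
                   self.safety_impact,
                   self.records_integrity_impact) >= 2

# Hypothetical inventory entries for illustration only
inventory = [
    SystemRecord("Label printing software", True, 3, 2, 2),
    SystemRecord("Calibration management tool", True, 2, 1, 3),
    SystemRecord("Cafeteria menu app", False, 1, 1, 1),
]

for system in inventory:
    print(f"{system.name}: validate = {system.requires_validation()}")
```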

Once we have our list, we must spend the time to develop meaningful user requirements. Section 6.2 of the guidance spells out concisely what we need to capture:

  • the “intended use” of the software or automated equipment; and
  • the extent to which the device manufacturer is dependent upon that software or equipment for production of a quality medical device.

The device manufacturer (user) needs to define the expected operating environment including any required hardware and software configurations, software versions, utilities, etc. The user also needs to:

  • document requirements for system performance, quality, error handling, startup, shutdown, security, etc.;
  • identify any safety related functions or features, such as sensors, alarms, interlocks, logical processing steps, or command sequences; and
  • define objective criteria for determining acceptable performance.

These user requirements are then typically decomposed into functional/design specifications.
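To make this concrete, here is a hypothetical sketch of a single user requirement captured as structured data and decomposed into functional specifications. The field names and content are illustrative assumptions only, not a prescribed format.

```python
# Hypothetical user requirement for a label-printing system,
# decomposed into functional specifications (illustrative only).
user_requirement = {
    "id": "UR-001",
    "intended_use": "Generate and print product labels for finished devices",
    "operating_environment": {
        "os": "Windows 10",                      # assumed platform
        "software_version": "LabelTool 4.2",     # hypothetical product/version
        "required_utilities": ["PDF viewer", "network printer driver"],
    },
    "performance": "Label renders and prints within 5 seconds",
    "error_handling": "Reject print job and alert operator if template is missing",
    "security": "Only users in the 'Labeling' role may edit templates",
    "safety_features": ["Alarm on barcode verification failure"],
    "acceptance_criteria": "Printed label matches approved template, 10/10 samples",
}

# Decomposition into functional/design specifications that trace back
# to the user requirement via the 'traces_to' field.
functional_specs = [
    {"id": "FS-001", "traces_to": "UR-001",
     "text": "The system shall render labels from approved templates only."},
    {"id": "FS-002", "traces_to": "UR-001",
     "text": "The system shall log every print job with user, date, and template ID."},
]
```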

Validating these requirements and specifications becomes the core of an overall validation strategy. The validation strategy also defines the roles of the people participating in the validation, the scope of the validation, and the methodology, and it breaks down into distinct protocols: IQ (Installation Qualification), OQ (Operational Qualification), and PQ (Performance Qualification).
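As a rough illustration, the three qualification protocols might be organized as below; the specific checks are hypothetical examples, not a complete protocol.

```python
# Illustrative grouping of checks by qualification protocol (hypothetical content).
validation_protocols = {
    "IQ": [  # Installation Qualification: is the system installed correctly?
        "Verify installed software version matches the specification",
        "Verify required hardware and OS configuration are present",
    ],
    "OQ": [  # Operational Qualification: does it operate as specified?
        "Verify error handling when a required input is missing",
        "Verify security controls restrict template editing to authorized roles",
    ],
    "PQ": [  # Performance Qualification: does it perform under real-world conditions?
        "Run a representative production batch and verify output meets acceptance criteria",
    ],
}

for protocol, checks in validation_protocols.items():
    print(protocol, "-", len(checks), "checks")
```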

Each protocol contains a mapping of requirements to test cases, and it is important that every requirement traces to a test case. An auditor may deem a system “has not been validated” if a requirement is discovered without a test case. You should double-check via a traceability matrix, which links all of your requirements to all of your planned (and performed) testing. The traceability matrix is also a useful project management tool for quickly determining the completeness of a validation and assessing coverage all the way from requirements through testing.
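A traceability matrix can be as simple as a table or spreadsheet. The sketch below, with hypothetical requirement and test case IDs, shows the essential idea: link every requirement to at least one test case and flag any gaps.

```python
# Hypothetical traceability data: each test case declares which requirement it covers.
requirements = ["UR-001", "UR-002", "UR-003"]
test_cases = [
    {"id": "TC-01", "covers": "UR-001", "protocol": "OQ", "result": "Pass"},
    {"id": "TC-02", "covers": "UR-001", "protocol": "PQ", "result": "Pass"},
    {"id": "TC-03", "covers": "UR-002", "protocol": "OQ", "result": "Pass"},
]

# Build the matrix: requirement -> list of test cases that trace to it.
matrix = {req: [tc["id"] for tc in test_cases if tc["covers"] == req]
          for req in requirements}

# Any requirement with no test case is exactly the gap an auditor would flag.
uncovered = [req for req, tcs in matrix.items() if not tcs]
print("Traceability matrix:", matrix)
print("Requirements without test coverage:", uncovered)  # ['UR-003'] in this example
```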

The components of each test case are equally important to get right for audit purposes. A test case should contain the elements below (a minimal record sketch follows the list):

  • Test description/goal
  • Test steps for execution
  • Expected results
  • Actual results (verbiage and screenshots provide good objective evidence)
  • Pass/fail determination
  • Tester signature and date
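For illustration, a single executed test case record might be captured as follows. The field names and values are hypothetical, and screenshots would be attached as separate objective evidence.

```python
# Hypothetical executed test case record containing the elements listed above.
test_case = {
    "id": "TC-01",
    "description": "Verify label prints from the approved template",  # goal
    "steps": [
        "Log in as a user in the 'Labeling' role",
        "Select approved template LBL-123 and print one label",
    ],
    "expected_result": "Label prints and matches approved template LBL-123",
    "actual_result": "Label printed; content matched template (screenshot attached)",
    "pass_fail": "Pass",
    "tester": "J. Smith",
    "date": "2011-03-01",
}
```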

21 C.F.R. §820.70(i) concludes by indicating that the artifacts of the validation must be documented and that a configuration-management plan should be in place for all software.

Now let’s take a look at the warning letters from the last year that call out 820.70(i) non-compliance. Five companies have been cited:

1) E. A. Beck & Co., 12/14/2010
“Failure to validate, for its intended use, computers or automated data processing systems used as part of production or the quality system, as required by 21 CFR 820.70(i).”

The example used: “firm has not validated the software used for generating product labels.”

2) Advanced Surgical Design & Manufacture, Ltd., 12/1/2010
“Failure to validate computer software for its intended use according to an established protocol (when computers or automated processing systems are used as part of production or the quality system), as required by 21 CFR 820.70(i).”

The example used: “the program used to control (b)(4) milling machines during production of Meniscal Inserts has not been validated for its intended use according to an established protocol.”

3) Perma Pure LLC, 9/21/2010
“The (b)(4) Calibration Management software has not been validated as required by 21 CFR 820.70(i). This software is used to maintain equipment calibration records and calibration procedures. This same observation was made during the previous inspection of July 2006.”

4) 3CPM, Inc., 3/25/2010
“Failure to validate computer software for its intended use according to an established protocol when computers or automated data processing systems are used as part of production or the quality system, as required by 21 CFR 820.70(i).”

The example used: “when requested no validation documentation to support the commercial off-the-shelf program (b)(4) used to capture complaints, returned merchandise and service requests was provided.”

5) Olympus Terumo Biomaterials Corporation – Mishima Factory, 2/25/2010
“Failure to validate computer software for its intended use according to an established protocol when computers or automated data processing systems are used as part of production or the quality system, as required by 21 C.F.R. §820.70(i) (Production and Process Controls – Automated Processes).”

The example used: “the CAPA analysis of nonconformances, which is used at management meetings, is inadequate in that the report is computer-generated on a non-validated software system.”

We hope this information helps you avoid 21 C.F.R. §820.70(i)-based 483 observations and warning letters. We invite you to share your thoughts and experiences in this area.

Subscribe to our blog to learn more about Computer System Validation and other SPK and Associates IT services.
