Medical Device Programming System Validation

The service engineering department of a leading medical device manufacturer developed a fully custom medical device programming system for distribution to its global service centers.  This new programming system was a down-scaled version of a larger and more complex automated system that the client used in production for the testing and programming of electronically driven motorized surgical devices.

The core of the new system was a software program designed to read data stored on the device’s onboard programmable electronic control component, acquire real-time performance parameter values from the device, diagnose the state of “fitness” of the device by comparing the acquired data against a set of configuration specifications, and adjust the device software for optimum performance. The programming software was designed to interface, via network connection, with the devices’ software installation files and configuration specification files stored in controlled network directories. At the end of the testing/programming sequence, a printable device history report was generated that included all recordable service actions and the device status at the time of service.

The new system also included a custom peripheral hardware fixture designed to mechanically interface with the devices at the battery/programming port, power the device, and provide logical connection to the onboard programmable electronic control component of the device.  The interface fixture was also equipped with a connection port for a multimeter to allow the service technician to gather electrical current readings from the device during testing.

Due to the universal design of the battery/programming port and the onboard programmable electronic control component, the programming system could accommodate the servicing of a wide range of related device models, each with its own unique combination of device software application and configuration specifications.

Performance Validation professionals were called upon to complete the validation of this system while operating within the client’s new electronic validation system.

The PV Advantage

Performance Validation provided a dedicated team of Validation Specialists experienced in managing Computer System Validation projects. Modeled after ISPE’s GAMP 5, a risk-based approach was executed to maximize quality and efficiency while minimizing cost.

Performance Validation professionals worked closely with the client’s engineering team to ensure that all risks to the medical device programming system’s performance across the range of affected device products were taken into consideration and mitigated. This approach was then duly documented in the qualification.

The Solution

The project for the device programming system included:

Initial Assessment: an initial assessment was documented to establish the system’s GMP impact and applicability to 21 CFR Part 11 regulatory requirements.

User Requirements Specifications document (URS):  Given the URS document from the related production version of the device programming system (previously validated by the client) and engineering development reports for the new programming system, the PV CSV Validation Specialist was able to extract and derive the set of user requirements needed to complete the URS.

Functional Design Specification document (FDS):  Given the FDS document from the production version of the device programming system and engineering development reports for the new programming system, the PV Computer System Validation Specialist extracted, derived, and modified the applicable subset of functional specifications needed to complete the FDS.

Measurement System Analysis (MSA):  MSA was performed and documented to qualify the use of a specified make/model of multimeter used in conjunction with the interface fixture, ensuring the accuracy and reliability of the current readings.  MSA was also performed on the use of a specified make/model of photo tachometer to be used by field technicians to measure the speed of rotation/oscillation of the device for manual entry into the diagnostic data set.  MSA testing and documentation was developed and executed by a qualified PV Validation Engineer.

Computer System Validation Qualification (CSVQ):  

The CSVQ testing documentation was developed and executed by the PV Computer System Validation Specialist.  The client’s service engineering subject matter experts were consulted so that their knowledge and experience in testing and programming the devices were considered, providing reasonable mitigation of any known risks of failure in the testing and programming process.

The CSVQ included Installation Qualification (IQ) that verified:

  • the controlled state of system related installation instructions and service manual documents
  • the calibration of test instrumentation (multi-meter, tachometer)
  • the test installation of the interface fixture hardware
  • the test installation of the software system
  • the controlled storage of the software backup files

The CSVQ included Operational Qualification (OQ) that qualified:

  • the configuration of the testing source data files to be logically interfaced by the system software during runtime (configuration values compared against approved product engineering design documentation)
  • the functionality of all graphical user interface screens and components
  • the fully automated software sequencing for testing and programming, with separate test cases designed to address specific device types under best case conditions
  • the pass/fail and remediation logic associated with diagnostic, programming, calibration, and optimization
  • the reliability of the software system to perform consistently over multiple test instances for multiple device models
  • the application of the software system functionality to all device models for which the system was intended
  • the specified design of the report and the accuracy of the data represented within the report

Traceability of the CSVQ testing to its related User Requirements was established within the client’s new electronic validation system.

Performance Validation provided the necessary services and solutions to complete the programming system validation project, while remaining flexible and responsive to the customer’s schedule and budgetary constraints.  The validation project was completed successfully, and the new medical device programming system was placed back into production in a timely manner to the customer’s satisfaction.

The Benefits

The advantages of tailoring each validation effort based on system risks and complexity were realized.  Through complete and quality-driven validation planning, testing was minimized and remained focused on the system’s intended use and all critical quality attributes.  This ensured that timelines were met, while assuring the client that their programming system could be distributed to their service centers with full confidence.

For more information please contact:

Kevin Marcial,
CSV Services Manager
Performance Validation, LLC.
5168 Sprinkle Road
Portage, MI 49002
(269) 267-4020 Mobile

Data Integrity – Audit Trail Review

Data integrity demands a great amount of attention in the life science industry. This is now truer than ever with increased focus from the FDA, the EU, and industry standards on data integrity issues and best practices. Audit trails are required for all FDA/EU regulated computerized systems. They must capture the creation, modification, and deletion of regulated electronic records: who created the record and when, as well as who modified or deleted it, when, and why – as relevant. When a system is validated, the audit trail should be verified to ensure accuracy and that it meets all applicable regulatory and organizational requirements. Once the system is validated and in production, the audit trail should not be forgotten. A formal process to examine the audit trail to ensure data integrity is needed in the regulated environment. Let us consider audit trail review – how to approach it and a few items of interest.
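As an illustrative sketch (the field names and rules are our own assumptions, not taken from any regulation), the who/when/why capture requirement described above can be modeled as a minimal immutable audit trail entry:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class AuditEntry:
    """One immutable audit trail entry for a regulated electronic record."""
    record_id: str
    action: str      # "create", "modify", or "delete"
    user: str        # who performed the action
    timestamp: str   # when, as UTC ISO-8601
    reason: str = "" # why (required for modify/delete)

def log_action(trail: list, record_id: str, action: str,
               user: str, reason: str = "") -> AuditEntry:
    """Append an entry; modifications and deletions must carry a reason."""
    if action in ("modify", "delete") and not reason:
        raise ValueError(f"'{action}' requires a documented reason")
    entry = AuditEntry(record_id, action, user,
                       datetime.now(timezone.utc).isoformat(), reason)
    trail.append(entry)
    return entry
```

A real system would also protect the trail against tampering (append-only storage, secured timestamps); this sketch only shows the minimum who/when/why data capture.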

Audit trail review refers to the process of periodically examining an audit trail based on a variety of factors. It is valuable to define audit trail review based on system risk, as recommended by ISPE in its Records and Data Integrity Guide. Place a risk level on a system just as one would in any other computerized system risk assessment, using criteria such as impact to patient safety, drug/product efficacy, the quality system, business risk, and complexity/criticality. How often and to what degree the audit trail review occurs can then be assigned. Do include all system stakeholders in the criteria and assessment process, including IT, QA, and business process owners.
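The idea of assigning a review frequency from a risk score could be sketched as follows. The inputs, score cutoffs, and cadences here are invented placeholders, not values prescribed by ISPE or any regulator; each organization must define its own criteria with its stakeholders:

```python
def review_cadence(patient_safety: int, product_quality: int,
                   complexity: int) -> str:
    """Map simple 1-3 impact scores to an audit trail review frequency.

    All scores and cutoffs are illustrative placeholders only.
    """
    score = patient_safety + product_quality + complexity  # ranges 3..9
    if score >= 8:
        return "review with each batch/study record"
    if score >= 5:
        return "periodic review (e.g. quarterly)"
    return "spot check during periodic system review"
```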

It is important to develop procedures and processes for audit trail review or incorporate them into a Validation Master Plan and/or Quality Management System. The review itself might only be a spot check for a very low risk system, or it could be a comprehensive analysis and tracing of data and metadata. Metadata is one aspect that should not be overlooked: the audit trail review cannot be adequate (in most cases) if the information that makes the data meaningful (metadata) is not available. This is a time when putting on an investigator or QA “hat” is imperative. Audit trail review should (again, based on risk level) look with scrutiny at reruns and failures of data capture and modification. Procedurally and scientifically, it may be acceptable to rerun or fail instrument runs, for example. However, does the audit trail capture these events? If so, how, and is the capture complete? Again, risk is key, but these questions and answers are important. This is also an opportune time to review training records, access controls, and general system security – as applicable.

Audit trail review is an essential component of data integrity for any computerized system. There are guidelines and industry best practices out now which are very helpful in developing a process to manage the reviews. Yet it is important to understand the system’s risk and criticality so as to approach the assessment process efficiently. Use the audit trail review to put the pieces of data capture, modification, and deletion together – using metadata to give scale and meaning to the data and information. An audit trail review may be easy to overlook or curtail, but its contribution to overall data integrity, and thus patient safety, is very significant.

Software as a Medical Device (SaMD): Clinical Evaluation and Validation

So you have a Fitbit or activity tracker? How is it going – is it helping motivate you to move more, perhaps monitoring your heart rate? That is great – but is your doctor using the data to make decisions about your health? Probably not. However, this is the kind of question we begin to ask when considering Software as a Medical Device (SaMD). By definition, these are applications (software) intended for medical use without the use of hardware (International Medical Device Regulators Forum (IMDRF) guidance). They are not software fixed in a medical device. “Medical use” is defined as providing “diagnosis, prevention, monitoring, treatment, or alleviation of disease/injury, supporting or sustaining life” (IMDRF guidance). SaMD can include software that interfaces with hardware and/or medical devices, as well as mobile apps that meet the definition. This definition, while fairly specific, can leave room for interpretation. Clinical Evaluation and categorization of the SaMD can facilitate a risk-based approach to validation.

The various guidance documents produced by the IMDRF are not intended to supersede the pre- and post-market regulatory requirements set forth for any medical device. SaMD is a medical device. Yet the process of Clinical Evaluation may help drive validation requirements and a risk-based approach. SaMD Clinical Evaluation is the process to assess “the analytical validity (the SaMD’s output is accurate for a given input), and where appropriate, the scientific validity (the SaMD’s output is associated to the intended clinical condition/physiological state), and clinical performance (the SaMD’s output yields a clinically meaningful association to the target use of the SaMD) of the SaMD” (IMDRF guidance). Each validity check has a specific aim and outcome. Validation is part of the process of confirming analytical validity. In practice, the already required process of computer system validation can feed into the Clinical Evaluation. IMDRF guidance even states: “analytical validity evidence of a SaMD is generated during the verification and validation activities in a manufacturer’s quality management system process and is always expected for a SaMD.” The IMDRF offers a risk categorization framework document defining four categories. These are based on the “state of health care situation or condition (critical, serious, non-serious)” and the “significance of the information provided by SaMD to the healthcare decision (treat or diagnose, drive clinical management, and inform clinical management)” (IMDRF guidance). It is advisable to use this categorization to drive the risk-based process for verification and validation, particularly for the rigor of evidence gathering. The GAMP 5 approach to risk-based computer system validation is likely to be of use here.
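The four-category framework lends itself to a simple lookup table. The matrix below reflects our reading of the IMDRF categorization (categories I through IV, with IV the highest impact); verify the mapping against the guidance document itself before relying on it:

```python
# Rows: state of healthcare situation; columns: significance of the
# information the SaMD provides to the healthcare decision.
# Mapping is our reading of the IMDRF framework - verify before use.
SAMD_CATEGORY = {
    ("critical",    "treat/diagnose"):             "IV",
    ("critical",    "drive clinical management"):  "III",
    ("critical",    "inform clinical management"): "II",
    ("serious",     "treat/diagnose"):             "III",
    ("serious",     "drive clinical management"):  "II",
    ("serious",     "inform clinical management"): "I",
    ("non-serious", "treat/diagnose"):             "II",
    ("non-serious", "drive clinical management"):  "I",
    ("non-serious", "inform clinical management"): "I",
}

def samd_category(situation: str, significance: str) -> str:
    """Return the SaMD risk category for a situation/significance pair."""
    return SAMD_CATEGORY[(situation.lower(), significance.lower())]
```

A higher category would then call for more rigorous evidence gathering in the verification and validation effort.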

Validation of Software as a Medical Device is essentially no different than the validation of any computerized system. It is best to think of it in this way. However, one cannot simply put up “blinders” and ignore the singularity of a SaMD when assessing risk and right-sizing one’s validation effort.

2017 Society of Quality Assurance Annual Meeting

Are you planning to attend the 2017 Society of Quality Assurance Annual Meeting March 26-31, in National Harbor, Maryland?

If YES, please plan to stop by Booth 315 to meet Kevin Marcial of the Performance Validation team.

Performance Validation is a Value Added Reseller of Adaptive GRC, a cost-effective approach to Governance, Risk, and Compliance.  Adaptive GRC offers a flexible, FDA-compliant, cloud-based software suite to manage audit, risk, compliance, and quality activities. The solution can be implemented enterprise-wide out of the box or configured for your specific requirements.

Adaptive GRC Key Capabilities:

Vendor Risk Management, IT/Information Security & Risk Compliance Oversight, Quality (CAPA) & Deviation Management, Enterprise Risk Management, Document Management, and Audit Management.

AdaptiveGRC was originally built from experience in the Life Sciences sector. It has full Part 11 audit trail and electronic signature capabilities. It also has a baseline set of IT controls to allow more rapid use and deployment. Using AdaptiveGRC can help identify and analyze gaps with less effort. No local installation is required (it operates in a standard web browser). You can get access to a full eGRC system for a much lower cost than was previously possible.

Adaptive GRC Demo Video

Governance, Risk, and Compliance (GRC) Basic Concepts

Governance, risk, and compliance, or GRC, is a term one in the pharma or biotech world might not hear all that often. It is a concept most often employed in financial, legal, and information technology divisions. “Governance” refers to the processes/procedures/activities used to manage the organization – such as management processes. This includes the GRC process itself. “Risk” refers to the assessment and mitigation (or management) of risks to the organization, whether from a business and/or a compliance perspective, for example. Lastly, “compliance” applies to how the organization achieves adherence to internal requirements (SOPs) and external requirements (regulatory bodies and authorities). GRC is comparable to the Quality Management System (QMS) concept found in the pharma and medical device industries. The strength of the process comes not only from assessing/identifying/mitigating/controlling GRC elements but from understanding how each relates to the others.

A quality GRC process is well integrated into the business processes. Data collected from the various arms of GRC must yield information that can reveal trends and concerns, allowing mitigations and preventative actions to be timely and effective. This means that data collection must be accurate and timely, and a software tool is useful for this. There is great value in forecasting risks based on compliance or governance activities. An interconnected GRC solution allows for visualizing data to understand, for example, how two seemingly disconnected activities impact each other.

GRC Software as a Solution

A software solution can certainly aid in managing GRC activities, but GRC isn’t as simple as buying software. In fact, it is important to define an organization’s GRC needs outside of any consideration of software. Too often, software is thrown at a problem as a solution when the real root cause lies in the business processes, such as the GRC process itself. Implementing software won’t fix a bad process (not likely, at least). To create a well-oiled process, start with mapping the business needs. One can use a tool like Six Sigma and/or a kaizen exercise to ascertain core activities and look for inefficiencies or faults. A mind-mapping tool, like XMind, can be useful to aid in the process. Once the process has been well designed and achieves the necessary compliance and business objectives, software can be a nice tool to automate it. A GRC software suite can automate audit and risk assessment processes, for example. True value is realized, though, when analytics and dashboarding are utilized for business intelligence. Understanding how aspects of GRC relate to and impact each other, as mentioned above, is fundamental to obtaining meaning from the tools (such as software) in place.

Project Summary – SCADA Validation for Medical Device Assembly

A leading medical device manufacturer sought to validate a new SCADA (Supervisory Control and Data Acquisition) system to be used in support of the assembly processes for both an upgraded product line and a new product line. This project launched the client’s initiative to leverage electronic process control technologies to improve efficiency and quality control of their manual assembly operations. It also represented their first step toward establishing electronic process records to be included in a long-term data storage solution for device history records.

The system leveraged SCADA technology for:

  • security, user permissions, and electronic signatures
  • acquisition of work order and serial number data via ERP interface
  • printout of barcode and information labels for sub-assemblies and finished product
  • establishment of parent/child relationships for sub-assembly and final assembly serial numbers
  • management of sub-assembly inventories
  • tracking of sub-assembly and main assembly serial numbers throughout the process
  • enforcement and real-time monitoring of build order sequencing
  • display of operator instructions and error messaging
  • determination of calibration status for inspection instrumentation and expiration status for sub-assemblies
  • control of activation status and operating parameters for interfacing equipment assets (i.e. hand tools, fixtures, automated presses, laser markers, conveyances, etc.)
  • acquisition of specified device history data generated in execution of the build order
  • supervisory functions for handling nonconformities and intervention
  • generation of audit trail records for changes to build order configuration
  • guidance and recording of operator pick parts selection

The applied SCADA system features included:

  • GE Proficy HMI and iFix SCADA software applications
  • Microsoft SQL Server
  • Citrix Server/Receiver
  • Multiple workbench dedicated PC clients and peripheral installations
  • Integrated PLC, IO, and Ethernet communication with process hardware systems

Performance Validation professionals were called upon to validate this system to ensure quality and compliance with the Quality System Regulations and the client’s procedures and practices.

The PV Advantage

Performance Validation provided a dedicated Validation Specialist experienced in the delivery and execution of validation documentation for SCADA projects.  Working closely with the client’s engineering validation team and quality assurance personnel, the PV Validation Specialist followed their established validation plan and user requirements to deliver the necessary system documentation and testing.

The Solution

The Validation Strategy included:

  • A risk-based approach was taken by classifying the application as commercial-off-the-shelf (COTS) software.  A failure mode and effects analysis was completed to assess the level of testing required to ensure that any identified risk related to configuration and custom-coded functions was mitigated.
  • Testing requirements for the SCADA system were developed based on the findings and recommendations of the System Risk Assessment document and input from the client’s subject matter experts.
  • Due to the high level of integration, qualification/validation testing was completed in the production environment and incorporated the functionality of interfacing production hardware and software assets.
  • Installation Qualification (IQ) verified the installation of the:
    • Database Server, Application Server, and Terminal Server platforms and software installations
    • GE iFix application server installation
    • SQL Server database installations and configurations
    • network hardware installations (PLCs, I/O modules, switches)
    • hardware configuration and software installations for 20+ workbench dedicated thin-client workstations
    • logical continuity (assets<>IO/PLC<>database<>HMI)
    • software storage and version control
  • Operational Qualifications (OQ) verified the functionality of the:
    • logical security and electronic signatures
    • permission-based HMI screen navigation
    • fully integrated end-to-end build order executions with process enforcements
    • negative testing and error handling
    • supervisor intervention functions and related accountability
    • system audit trails and process data logging for electronic device history records
  • Validation deliverables included:
    • Validation Plan (revision)
    • System Risk Assessment (integrated test planning)
    • User Requirements Specifications (revision) / Requirements Trace Matrix
    • Functional Design Specifications
    • Code Review Reports
    • development and execution of Test Scripts for: Installation Qualification for core SCADA components, Installation Qualifications specific to each of the two production lines, and Operational Qualifications specific to each of the two production lines
    • Production Support Plan
    • Final Validation Reports

The Results

Performance Validation delivered System Risk Assessment and Functional Design Specifications documents that facilitated a clear approach to testing and traceability for the path forward into test script design.

The client was provided with a thorough high-quality qualification testing package that ensured ready traceability to the approved requirements and specifications.

Throughout the validation, Performance Validation positioned themselves to be flexible and responsive to allow the client to optimize their project schedules for successful timing toward meeting pre-production deadlines.

Final Validation Reports were delivered on time and the SCADA system applications were released for production as scheduled.

The Benefits

Performance Validation built upon knowledge and experience gained in serving SCADA system validation projects in the pharmaceutical industry. The application of SCADA technology for medical device assembly differed somewhat from that of pharmaceutical formulation, filling, and packaging, but Performance Validation studied the details of the manufacturing process and was able to design a validation package that satisfied the client’s compliance needs.

Performance Validation endeavored not only to meet a minimum set of compliance requirements, but to provide the client a high level of assurance that all aspects of their new SCADA system would function as intended. Careful attention was given to ensuring the resulting life-cycle documentation would be readily maintainable for the life of the system.  Test documentation was designed with a format and content that, with minimal modification, can be repeated to maintain the validated state of the system under future change control. This medical device manufacturer now has a validated SCADA system which meets their high standards of quality and compliance.


Ultrasound Validation Project Summary

The Challenge:

A contract research organization (CRO) contracted Performance Validation (PV) to create a Computer System Validation (CSV) package for an ultrasound instrument.  The system is used during surgery to analyze flow, stress, and wall thickness, and to detect blockages.  The instrument and its data acquisition/analysis software were purchased to replace an aging ultrasound system that was no longer supported.  The previous system also saved data to an old tape recording system whose media would deteriorate and were becoming difficult to replace.  The lab conducts Good Laboratory Practice (GLP) compliant studies for its clients, and to bring the system into GLP compliance, the system required validation. The high demand for surgical studies, paired with a limited workforce, led them to contract PV to validate the system in a timely manner and avoid downtime.

The following computer system validation project considerations were identified for the client and their commercial-off-the-shelf (COTS) instrument/software:

  • Bringing the system into GLP compliance on an expedited timeline without significant in-house validation procedures or plans implemented
  • Evaluation of 21 CFR Part 11 compliance of the software or the laboratory procedures had not occurred
  • Qualification of the system’s configuration settings to ensure consistent and reproducible results
  • Ensuring the data was appropriately stored to a validated cloud storage system for use in study reports
  • Minimizing impact on the stakeholder’s production responsibilities

The PV Advantage:

The PV team on this project had years of experience in GLP laboratory compliance, including work on a comparable ultrasound instrument, which contributed to an expedient and comprehensive validation. PV began work prior to arriving on site to expedite the project launch.  This was followed by onsite support and collaboration with laboratory stakeholders to complete the computer system validation.  The following components detail the PV solution:

  • A PV team member was placed on an expedited training regimen to become familiar with the system and the validation master plan in place at the facility.
  • Assistance reading/reviewing the SOP in place for the ultrasound was provided, which helped determine the functionality being used by the client. This collaboration with the laboratory stakeholders helped keep the timeline on track while satisfying the applicable regulations.
  • A User Requirements Specification was created by pulling information from the user manual for the software and hardware page by page. The requirements were then further developed and refined through partnering with the laboratory project stakeholders.  The team focused on the intended use and GLP applicability of the system.  This proved valuable as test script authoring commenced: stakeholders could either assist in writing tests or remove unneeded requirements.
  • Creation of a 21 CFR Part 11 compliance assessment.  PV understood the regulations, the FDA Guidance, and the preamble requirements.  This assessment outlined areas of the software that needed to be addressed with a procedure (i.e. gaps) and was also used to create user requirements. A summary of how the gaps were addressed was also generated to include with the validation package.
  • A Validation Plan was created to outline the planned validation deliverables and the strategy for testing the system. It leveraged the favorable vendor audit and the low risk score from an ISPE GAMP 5 risk and complexity evaluation, allowing the team to use a risk-based approach to maximize efficiency and ensure compliance.
  • PV sent a team member to the clients’ site to work on documents and create test scripts. These test scripts were generated as early as possible to give the tester time to practice execution and allow for quality reviews prior to official execution.
  • The PV team member executed some testing and provided guidance and review for the remaining user acceptance scripts to ensure expedient execution.
  • A Requirements Traceability Matrix was created to document where each user requirement was satisfied with testing or vendor documentation.
  • Documentation was created using the clients’ approved forms and comparable formatting preference where applicable.
  • Issues of concern and related documentation were brought to the Quality Assurance department to ensure they were appropriately addressed as early as possible and avoid further remediation.
  • PV provided regular project status updates which provided details on the project deliverables, project budget, and risks to the project and schedule.
  • A Validation Summary Report was generated explaining all the validation deliverables and how the clients’ documentation satisfied regulatory requirements.
  • PV provided the client with a “Lessons Learned” document to help with future work there and provide insight the client specifically requested after completion of the validation.
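A Requirements Traceability Matrix like the one described above can be sketched in a few lines of Python. The data shapes, column names, and gap flag are illustrative assumptions, not the client’s actual format:

```python
import csv
import io

def build_rtm(requirements: dict, coverage: dict) -> str:
    """Render a simple Requirements Traceability Matrix as CSV text.

    requirements maps a requirement ID to its description; coverage maps
    a requirement ID to its verification evidence (a test script step or
    a vendor document reference). Requirements without evidence are
    flagged so coverage gaps are visible before test execution.
    """
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["Requirement", "Description", "Verification Evidence"])
    for req_id, desc in sorted(requirements.items()):
        writer.writerow([req_id, desc, coverage.get(req_id, "GAP - NOT COVERED")])
    return out.getvalue()
```

Running this over the URS and the test script index produces a matrix that documents where each requirement is satisfied, with uncovered requirements called out explicitly.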

The Results:

The lab instrument validation was completed ahead of schedule, meeting the customer’s overall system implementation deadlines.  A list of observations was issued, and PV returned to the facility to address them and expand testing, ensuring all parties involved were satisfied. The validation effort brought the system into compliance with all applicable regulations through excellent documentation standards, testing processes, and a focus on customer needs.  Business process and regulatory requirements were equally emphasized and constantly considered.

The Benefits:

Performance Validation provided the client with computer system validation project execution and consultation through understanding the project, the stakeholders and their needs, and the best-fitting validation strategy for a laboratory instrument.  Due to the time constraint, the validation team focused on efficiency and worked with the Information Technology, Quality Assurance, Subject Matter Expert, and Application Owner groups to ensure responsibilities for each document were clearly outlined at every step of the project.  Performance Validation will continue to support the client beyond the system go-live in the event of audit questions or change controls.


CFR – Code of Federal Regulations Title 21

Guidance for Industry: Part 11, Electronic Records; Electronic Signatures – Scope and Application


Testing and Risk-Based Computer System Validation

Performance Validation recently featured an introductory post on risk-based computer system validation. It is an approach by which one can focus the validation effort on critical business and regulatory requirements and reduce the need for excessive testing and redundancy. A fundamental aspect of this approach is to leverage software vendor functional testing. This permits the validation effort to forgo most functional (OQ) testing and home in on the user acceptance and/or PQ testing. We mentioned that, as a reference, one may look at ISPE’s GAMP 5: A Risk-Based Approach to Compliant GxP Computerized Systems for more information. Yet most guidance is not so specific that it instructs how to scale the validation for a risk-based approach. There are a couple of practical considerations and ways to do this.

We stated that the approach to a well-executed computer system validation project starts with planning. A formal planning or strategy deliverable, such as a Validation Plan, should foster the process of determining what is and is not in scope of the validation project. Also of importance, one must provide justification for the scope. A validation plan can say “we are not testing X,” but clearly it is important to document why a function is not tested. In most cases, the reason for not testing will fall into one of three categories: the software functionality has been tested by the software vendor, it is not regulatory critical, and/or it is not business critical.

For example, consider an eQMS (electronic quality management system). An organization has conducted an audit of the software vendor and determined that the vendor has a robust QMS and SDLC (software development lifecycle), procedures that it actually follows, and documented evidence of functional testing. According to industry standards (such as GAMP 5), that organization can forgo functional testing and leverage the vendor’s testing. Depending on organizational requirements, you may still want to document the functional requirements and trace them to the vendor’s testing. Nonetheless, leveraging vendor testing is a valuable tool in executing a risk-based approach.

Using the eQMS example, imagine a function that ships off the shelf but that an organization does not use. In general, that function does not need to be validated. One may still want to include it in the requirements if, for example, it may be used in the future. Either way, acknowledging its presence and marking it out of scope is advisable so that it does not look like an oversight to an auditor. This scenario can be applied to business and regulatory requirements: if a function exists, but no regulatory or business requirements surround it, then one may be able to mark it as out of scope (no testing required). Testing efforts can also be scaled based on criticality and regulatory applicability. The decision process on how and what to test needs to be a collaboration between quality assurance, the business users, and any Information Technology stakeholders / subject matter experts. For example, testing a function to its full potential may require multiple scenarios, but the function may be of low business and regulatory criticality. As such, the project team may decide to test only one scenario, perhaps the most likely (PQ).
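The scoping discipline described above can be sketched in code. This is a hypothetical illustration, not part of GAMP 5 or any tool: the requirement IDs, field names, and justification categories are invented, with the three justifications mirroring those named earlier in this post.

```python
# Hypothetical sketch: tracking scope decisions so every untested function
# carries a documented justification, as the paragraph above recommends.

VALID_REASONS = {
    "vendor_tested",            # covered by documented vendor testing
    "not_regulatory_critical",  # no regulatory requirement surrounds it
    "not_business_critical",    # no business requirement surrounds it
    "function_not_used",        # present off the shelf but not used
}

def check_scope(requirements: list) -> list:
    """Return IDs of requirements marked out of scope without a valid reason."""
    problems = []
    for req in requirements:
        if not req["in_scope"] and req.get("justification") not in VALID_REASONS:
            problems.append(req["id"])
    return problems

reqs = [
    {"id": "REQ-001", "in_scope": True},
    {"id": "REQ-002", "in_scope": False, "justification": "vendor_tested"},
    {"id": "REQ-003", "in_scope": False},  # looks like an oversight to an auditor
]
print(check_scope(reqs))  # ['REQ-003']
```

A check like this makes the “why” auditable: anything excluded from testing without a recorded reason is flagged before the plan is approved.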

A risk-based approach to computer system validation is a great way to streamline a validation effort while maintaining quality. It is essential that the validation plan clearly documents what “is” and “is not” subject to the risk-based portion of the approach. As always, the “why” is most important. Understanding and being able to defend why something is not being tested (or scaled) gives credibility to the approach.

Risk Based Computer System Validation – A Primer

Risk-based computer system validation is a term widely used in our industry now, but understanding and implementing it can be challenging. Often, organizations want “cheaper, faster, better,” but when the details of a risk-based computer system validation (CSV) plan are defined, they may find that their expectations have not been met. There are natural concerns with risk-based CSV, such as loss of project quality and data integrity. Yet there are straightforward ways to approach a validation plan to ensure that patient safety and product quality are protected in a risk-based computer system validation project.

First, there are a couple of excellent resources to aid in the process of a risk-based computer system validation. The first is ISPE’s GAMP 5: A Risk-Based Approach to Compliant GxP Computerized Systems. This is the seminal guideline on how to execute risk-based CSV. It is also one of the premier industry standards on CSV in general. One can follow the strategy and deliverable guidance outlined in the document to develop and execute a compliant and efficient project. For general CSV and some risk-based guidance, I also recommend the Drug Information Association (DIA) Computerized Data Systems in Nonclinical Safety Assessment: Current Concepts and Quality Assurance. This document complements GAMP 5 in many ways and offers another viewpoint on computer system compliance and quality. Of note – see our recent blog post on the FDA-proposed update to the GLPs for nonclinical studies if you work in that industry and are considering computer system compliance.

The approach to a well-executed computer system validation project, as in any project, starts in planning. A formal planning deliverable, such as the Validation Plan, Validation Strategy, or Testing Strategy, will drive the project’s scope, strategy, and documentation activities. These plans can include risk assessments that will refine the focus, approach, and criteria for success of the validation project. One must do more than claim a risk-based approach is being used. There should be a documented assessment to demonstrate due diligence when leveraging risk in a project.

There are many ways to develop a system risk assessment for the purpose of a risk-based validation. In general, one examines the business and regulatory risks involved with the system and identifies any areas that are more and less critical. For example, at the system level an application that collects data for study or production work is considered both business critical and subject to regulatory requirements. A web application that serves as a dashboard to the data collection system, but does not allow for data manipulation / transformation, might be looked at as less critical. The categories and criteria for assessment should be as objective as possible. Some suggested elements include:

  • GAMP category (off the shelf, configured, customized – see GAMP 5 for more information on categorization)
  • Vendor status (driven by a vendor audit)
  • System functionality and complexity (what does it do? This is an important driver)
  • Regulatory applicability and criticality (another major aspect of the assessment)
  • The novelty of the system to the organization (such as in-house experience and adherence to infrastructure requirements / SOPs)
  • The system’s use in the given industry (e.g. is it widely used in the CGMP world?)
  • Business criticality (e.g. an organization may deem a system critical even though it is not subject to GxPs).
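The assessment elements above can be made objective by rating each one on a simple scale and mapping the total to a validation rigor level. The sketch below is a hypothetical illustration only: the 1–3 scale, the weights, the criterion keys, and the score thresholds are all invented assumptions, not drawn from GAMP 5 or any published method.

```python
# Hypothetical sketch: scoring a system against the assessment elements listed
# above. Criterion names, the 1-3 scale, and the thresholds are illustrative.

CRITERIA = [
    "gamp_category",        # off the shelf (1) .. fully customized (3)
    "vendor_status",        # robust audited vendor (1) .. unknown vendor (3)
    "functionality",        # simple/low complexity (1) .. complex data handling (3)
    "regulatory",           # non-GxP (1) .. directly GxP-critical (3)
    "novelty",              # well known in-house (1) .. brand new (3)
    "industry_use",         # widely used in CGMP world (1) .. niche (3)
    "business_criticality", # low (1) .. high (3)
]

def risk_score(ratings: dict) -> tuple:
    """Sum the 1-3 ratings and bucket them into a validation rigor level."""
    missing = [c for c in CRITERIA if c not in ratings]
    if missing:
        raise ValueError(f"unrated criteria: {missing}")
    total = sum(ratings[c] for c in CRITERIA)
    if total <= 10:
        level = "low - leverage vendor testing, minimal PQ"
    elif total <= 16:
        level = "medium - targeted OQ plus PQ"
    else:
        level = "high - full functional and acceptance testing"
    return total, level

# Example: a configured, regulated eQMS from a well-audited vendor.
eqms = {"gamp_category": 2, "vendor_status": 1, "functionality": 2,
        "regulatory": 3, "novelty": 2, "industry_use": 1,
        "business_criticality": 3}
print(risk_score(eqms))  # (14, 'medium - targeted OQ plus PQ')
```

The real value is not the arithmetic but forcing the stakeholders to rate each criterion explicitly and record the rationale, which is exactly the documented due diligence the planning deliverable should capture.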

I recommend looking at RAMP (Risk Assessment and Management Process): An Approach to Risk-Based Computer System Validation and Part 11 Compliance (by Richard M. Siconolfi and Suzanne Bishop, 2007). Much of what I have laid out here is recommended in this approach. It is important to identify the stakeholders that should be involved in the assessment and to ensure they are included in the decision-making process.

Risk-based computer system validation can be done in a clean, clear-cut way that documents the rationale for the project’s approach to validation and defines its objectives. The significant motivator behind a risk-based approach is the documented justification for focusing testing efforts on the most business- and regulatory-critical aspects of the system. I will address how to actually do this in the next blog post.

Part 11 Compliance Considerations for SCADA Systems

Supervisory Control and Data Acquisition (SCADA) systems are tremendous assets in the pharmaceutical, medical device, and other FDA regulated manufacturing industries, providing cost efficiencies and improving the consistency of product quality. They have also enabled the industries to move from hardcopy to electronic production records. Systems used for manufacturing pharmaceuticals, medical devices, and other regulated health care products must comply with current Good Manufacturing Practices (cGMP). Production related electronic records are subject to compliance with the same predicate rules (e.g. 21 CFR Part 211 and 21 CFR Part 820) that would apply under paper-based quality systems. In addition, the systems that produce and retain these records are required to comply with 21 CFR Part 11 rules for electronic records and signatures.

SCADA system records and functions that are subject to 21 CFR Part 11:

• User access security restrictions
• Electronic signatures
• Graphical user interface (GUI) displays, operator entries, and controls
• Recipe creation, editing, and version control
• Recipe sequence enforcement
• Electronic logging of recipe procedures executed by the system, with time-stamped audit trails
• Data collection, storage, protection, audit trails, and retrieval
• Production data historian with audit trails and reports
• SOPs for life-cycle management
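The time-stamped audit trail item above is central to Part 11. The sketch below is a minimal, hypothetical illustration of an append-only audit trail entry; the field names, action strings, and recipe identifiers are invented, and a real SCADA historian would persist these records in protected storage rather than in memory.

```python
# Hypothetical sketch of Part 11-style time-stamped audit trail entries.
# Field names and example identifiers are invented for illustration.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)  # frozen: an entry cannot be altered after creation
class AuditEntry:
    user_id: str    # authenticated user who performed the action
    action: str     # e.g. "recipe.edit", "signature.apply"
    record_id: str  # the electronic record affected
    old_value: str
    new_value: str
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

class AuditTrail:
    """Append-only log: entries can be added and read, never edited or removed."""
    def __init__(self):
        self._entries = []

    def log(self, entry: AuditEntry) -> None:
        self._entries.append(entry)

    def for_record(self, record_id: str) -> list:
        """All entries for one record, in the order they occurred."""
        return [e for e in self._entries if e.record_id == record_id]

trail = AuditTrail()
trail.log(AuditEntry("jdoe", "recipe.edit", "RCP-001", "60 C", "65 C"))
print(len(trail.for_record("RCP-001")))  # 1
```

The two properties the sketch emphasizes, immutable entries and append-only access, are what qualification testing of the audit trail function must challenge and document.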

Functional Specifications for new systems must include 21 CFR Part 11-related requirements, and qualification testing must clearly challenge and document them.

Production data is acquired primarily from manufacturing equipment instrumentation via programmable logic controllers (PLCs), as well as operator workstations and interfaced peripherals such as barcode readers. Production data is also generated in the form of batch identification and target parameters from the recipe and batch databases. Interfacing manufacturing execution systems (MES) and enterprise resource planning systems (ERP) may also provide data for the batch record. Each data parameter value within the electronic batch record must be traceable back to its specific source, and it must be linked to its production batch identifier. All critical data collection functions must be qualified to ensure the integrity of the data acquired by the SCADA software system and stored in the resulting electronic batch records.
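The traceability requirement above can be pictured as each collected value carrying its source and batch link with it. This is a hypothetical sketch: the field names, tag identifiers like "PLC-3/TT-101", and the batch number are invented examples, not references to any real system.

```python
# Hypothetical sketch: binding each acquired parameter value to its data
# source and production batch identifier, per the traceability requirement.
from dataclasses import dataclass

@dataclass(frozen=True)
class BatchValue:
    batch_id: str   # production batch identifier the value belongs to
    source: str     # e.g. a PLC instrument tag or operator workstation
    parameter: str  # named process parameter
    value: float
    timestamp: str  # ISO 8601, from a synchronized system clock

def untraceable(values: list) -> list:
    """Flag any value missing its batch link or its data source."""
    return [v for v in values if not (v.batch_id and v.source)]

v = BatchValue("LOT-2023-117", "PLC-3/TT-101", "sterilization_temp_c",
               121.3, "2023-05-04T10:15:00Z")
print(untraceable([v]))  # []
```

Qualification of the data collection functions then amounts to challenging that every acquisition path populates these links correctly, including the MES/ERP-sourced values.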

Loss of critical data associated with a lot or batch can result in the loss of product. Data security and automated periodic or real-time backups of production data should be implemented to prevent data loss. Historical production records must remain accessible and readable throughout the lifecycle of the system or the record retention period required by regulation, whichever is longer. Finally, all data records must be comprehensive, complete, bound to their associated records, and easily retrievable for audit by the FDA.
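One common way to protect an archived record against undetected corruption or tampering is to store a checksum alongside it and verify the checksum on every retrieval. The sketch below is a simplified, hypothetical illustration of that idea; the record fields are invented, and a real system would also sign, replicate, and access-control the archive.

```python
# Hypothetical sketch: protecting an archived electronic record with a
# SHA-256 checksum so retrieval can detect corruption or tampering.
import hashlib
import json

def archive(record: dict) -> dict:
    """Serialize the record and store it with its checksum."""
    payload = json.dumps(record, sort_keys=True)
    return {"payload": payload,
            "sha256": hashlib.sha256(payload.encode()).hexdigest()}

def retrieve(archived: dict) -> dict:
    """Re-verify the checksum before returning the record."""
    digest = hashlib.sha256(archived["payload"].encode()).hexdigest()
    if digest != archived["sha256"]:
        raise ValueError("record integrity check failed")
    return json.loads(archived["payload"])

batch = {"batch_id": "LOT-2023-117", "fill_volume_ml": 250.0}
stored = archive(batch)
assert retrieve(stored) == batch  # round-trips intact
```

Any bit-level change to the stored payload, whether from media corruption or an unauthorized edit, changes the digest and causes retrieval to fail loudly rather than silently return bad data.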