In 2016, the U.S. Food and Drug Administration (FDA) published draft guidance on "Data Integrity and Compliance" with respect to Current Good Manufacturing Practice (CGMP). This guidance was published in response to a string of recent violations: FDA CGMP warning letters rose from 41 in 2013 to 102 in 2016, an increase of nearly 150%.
More than 45% of the warning letters issued in 2016 cited data integrity deficiencies.
Some examples of the data integrity issues that were observed during recent inspections include:
- Altering original data and records
- Manipulating inadequately defined analytical procedures and associated data sets in order to obtain passing results
- Back-dating test results to meet the protocol requirements, thus creating acceptable test results without performing the actual test
- Using test results from previous batches in order to avoid testing of other batches
Through its inspection efforts, the FDA seeks to ensure that GMP data is complete, consistent, and accurate. Specifically, the draft guidance on "Data Integrity and Compliance" outlines a number of key points through a series of 18 questions and answers in Section 3 of the document. Several areas are worth highlighting, as they are frequently raised during initial discussions:
Section 3.1 defines terms such as data integrity, metadata, and audit trails.
Section 3.3 covers the question “Does each workflow on our computer system need to be validated?” (In short, the answer is “yes”. If you validate the computer system but don’t validate it for its intended use, you can’t know if your workflow runs correctly.)
Section 3.17 further examines whether the FDA investigator is allowed to look at your electronic records.
Section 3.18 outlines how the FDA recommends addressing data integrity problems identified during inspections, in Form 483 observations, in warning letters, or in other regulatory actions.
In practical terms, the FDA wants to ensure that data supporting product registrations are reliable and accurate to safeguard the patient. However, maintaining a constant state of data integrity with respect to inspection readiness is difficult due to
- volume and variety of data,
- redundant manual operations, and
- a lack of process standardization and oversight across departments and sites.
It’s time to start thinking about inspection readiness and compliance in terms of data—from governance to maintenance to accessibility.
How can organizations ensure the traceability, auditability, and immutability of their information?
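One common technical pattern behind auditability and immutability is the hash-chained audit trail: each entry records a cryptographic hash of the previous entry, so any retroactive alteration of the record breaks the chain and is detectable on verification. The sketch below is illustrative only (the data model and function names are our own, not part of any regulated system), but it shows the principle:

```python
import hashlib
import json

def add_entry(trail, record):
    """Append a record to the audit trail, chaining it to the previous
    entry's hash so any later alteration becomes detectable."""
    prev_hash = trail[-1]["hash"] if trail else "0" * 64
    payload = json.dumps({"record": record, "prev_hash": prev_hash},
                         sort_keys=True).encode()
    trail.append({
        "record": record,
        "prev_hash": prev_hash,
        "hash": hashlib.sha256(payload).hexdigest(),
    })
    return trail

def verify(trail):
    """Recompute every hash in order; return False if any entry was altered."""
    prev_hash = "0" * 64
    for entry in trail:
        payload = json.dumps({"record": entry["record"],
                              "prev_hash": prev_hash},
                             sort_keys=True).encode()
        if (entry["prev_hash"] != prev_hash
                or entry["hash"] != hashlib.sha256(payload).hexdigest()):
            return False
        prev_hash = entry["hash"]
    return True

trail = []
add_entry(trail, {"user": "analyst1", "action": "create", "batch": "B-001"})
add_entry(trail, {"user": "analyst2", "action": "update", "batch": "B-001"})
assert verify(trail)

# Silently editing an earlier record, as in the warning-letter examples
# above, breaks the chain.
trail[0]["record"]["batch"] = "B-999"
assert not verify(trail)
```

Production systems achieve the same goal with validated audit-trail features in the application or database layer; the point is that integrity must be verifiable, not merely asserted.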
The True Cost of Data Management
The total cost of complying with global health authority regulations varies widely from company to company. However, the following represent typical costs associated with compliance:
- Preparation for an inspection
- Maintenance to prevent a failure
- Corrective actions required in the case of a failure
Systems today create data at an ever-increasing rate, and analysts at IDC expect data volumes to grow by a factor of 50 over the next 10 years. Managing and storing data already carries a significant cost burden, and the more data a company creates and maintains, the more expensive it is to manage.
Organizations have been slow to properly fund data management systems and processes for a variety of reasons: siloed organizations, lack of strategy, immature data tools, and so on. HighPoint's research suggests companies are "paying" for data management through both direct and, more importantly, indirect activities.
In a recent study, highly skilled workers reported that 50% of their wasted time is spent hunting for data, correcting data errors, mining data manually, and searching for confirmatory sources for untrusted data, resulting in indirect inefficiency costs. By improving data access, your analysts, regulatory specialists, and compliance officers, for example, would have more time to focus on their core duties.
We understand the reality of budget constraints. However, many companies already spend wastefully, adding headcount to compensate for inadequate data systems; that extra headcount would be unnecessary with data management improvements.
When data integrity is improved:
- The cost of compliance is reduced
- Operational efficiencies are gained
- Better decisions can be made
Many organizations set spending as a percentage of revenue, and the same should apply to spending on data management: as organizations grow, so does their data.
Companies often don't realize their compliance exposure because they are either not aware of or don't understand their data integrity issues. When justifying the cost of addressing those issues, compare the extreme case, a product distribution stoppage or the cost of extensive remediation, against proactive measures that can be performed at a fraction of the cost and deliver benefits throughout the organization.
From a Client’s Perspective
Our client, a large pharmaceutical organization, uncovered significant findings during a recent audit by a regulatory authority of their drug safety system and processes. The most interesting finding came from reviewing a set of standard safety line listing reports run directly out of the organization's commercial drug safety system. The reports showed a number of blank spaces where medicinal products were meant to be displayed. Clearly, something was wrong with the underlying data.
Essentially, the client's issues stemmed entirely from altering original records, the very first data integrity issue mentioned above. And once that issue was noted, remediating it cost exorbitant time (months) and money (several million dollars).
An internal investigation into the root cause determined that the failure was the result of a simple data integrity issue: members of the organization had deleted product entries from their reference product repository in order to change a product's investigational name to its commercial name. The team members performing these deletions were never trained to change the product name properly. The cost to train them would have been very low, and the effort to develop a controlled, repeatable process for performing the name change would have been minimal.
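The controlled alternative is straightforward: record the name change as a new version of the entry rather than deleting the original, so the investigational name and the full change history survive. The sketch below is hypothetical (the repository structure, product IDs, and names are invented for illustration), but it captures the versioned-update pattern:

```python
from datetime import datetime, timezone

def rename_product(repository, product_id, new_name, user):
    """Record a product name change as a new version appended to the
    entry's history, instead of deleting the original record."""
    history = repository.get(product_id)
    if not history:
        raise KeyError(f"unknown product: {product_id}")
    current = history[-1]
    history.append({
        "name": new_name,
        "previous_name": current["name"],
        "changed_by": user,
        "changed_at": datetime.now(timezone.utc).isoformat(),
    })
    return history[-1]

# Hypothetical repository seeded with an investigational name.
repo = {"P-100": [{"name": "XYZ-123 (investigational)",
                   "previous_name": None,
                   "changed_by": "system",
                   "changed_at": "2016-01-01T00:00:00+00:00"}]}

rename_product(repo, "P-100", "Examplimab (commercial)", "reg_specialist")

assert repo["P-100"][-1]["name"] == "Examplimab (commercial)"
# The original record is still present, so the audit trail is intact.
assert repo["P-100"][0]["name"] == "XYZ-123 (investigational)"
```

In a validated system the same effect comes from effective-dated reference data or database row versioning; the design choice that matters is that an update appends, and nothing is ever destroyed.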
The HighPoint Approach
Whether your company is an emerging biotech preparing for commercial launch or a leading pharmaceutical organization with established marketed products, everyone must prepare for inspection in order to develop, submit, and commercialize product. There are pragmatic measures that your organization can undertake to drive operational efficiencies through inspection readiness.
The development, submission, registration and commercialization of therapeutics is fundamentally related to the robustness and accuracy of the data submitted by the sponsor to the health authority.
The short version of the story: when preparing for an inspection, look at your data integrity. The health authority certainly will.
Such data must be complete, consistent, and accurate and comply with Good Manufacturing Practices (GMP), Good Clinical Practices (GCP), and Good Laboratory Practices (GLP). Ensuring the integrity of your data generated from GxP activities is of critical importance to maintaining a state of regulatory compliance and continuity of your commercial operations.
There are key steps that life sciences organizations can take to modernize operations and improve efficiency and compliance through data integrity.
An effective data integrity strategy will enhance system and process efficiency across the applications that house GxP-related data, improving organizational efficiency and effectiveness in concert with the regulatory, safety, quality, and compliance requirements stipulated by health authorities. Such initiatives will give your organization a competitive advantage over those that choose to delay.