
Data Integrity: A Practical and Risk-Based Approach

How an organization can establish a culture and cost-effective means to meet the various challenges of sustaining data integrity.

By: Robert Marohn

Director of Quality Business Systems, Kite Pharmaceutical

In the recent July/August issue, Contract Pharma looked at the issuance of data integrity guidance documents by four of the world’s leading regulatory agencies. Taken together, these guidelines stipulate an array of data integrity and governance expectations.

All four regulatory agencies’ directions on data integrity make the point that an organization should consider the sum total of measures that ensure data integrity, irrespective of the process, format or technology in which data is generated, recorded, processed, retained, retrieved and used throughout the data life cycle, and that these measures should guarantee complete, consistent and accurate records.

This article looks at some of the ways an organization can establish a culture and cost-effective means to meet the various challenges in sustaining data integrity. Particular emphasis is placed on a practical and risk-based approach toward data integrity governance. Sensibly, data integrity governance can be accomplished through four activities: marshalling the proper expertise and thoughtful planning; training and leveraging cGxP knowledge; assessment and risk analysis; and self-inspection and remediation.

Expertise and planning
According to the Data Integrity Special Interest Group of the ISPE GAMP Community of Practice, “If a company does not have in-house experience for implementing data integrity, or has failed in past implementation attempts, they should bring in outside expertise to help guide the initial stages of implementation.”

Organizations must get a grip on data integrity governance across the data life cycle by initially focusing on expertise and planning. “The big lesson in the industry is that in order to ensure data integrity, quality must be managed at the entry point of data from the beginning of developing an active pharmaceutical ingredient (API) to the information appearing in the product label,” said Dr. Nancy Pire-Smerkanich, assistant professor in the Department of Regulatory and Quality Science in the School of Pharmacy at the University of Southern California. “It is very difficult to build quality back as the process approaches the end of the data life cycle.”

For many organizations, a starting point is to identify a data integrity and governance lead (in larger organizations this will likely be a full-time data governance officer) and to establish a cross-functional team to bring focus on the principles of data integrity across the data life cycle and the organization. Just as an effective quality system must have the support and active involvement of top management, so too must an effective data integrity program have executive commitment.

In the absence of in-house expertise, harnessing external know-how will save time and money in the long run and set the organization on a steady path toward mature data integrity governance. However, the objective must be to transfer outside expertise and knowledge to experts in-house and to move toward a company culture that embraces data integrity governance. Organizations should avoid outsourcing a data integrity governance program entirely to a third-party vendor, which would indicate the company is not taking ownership of its own records.

A cross-functional team’s purpose is to ensure that data integrity approaches are included, and remain a focus, in methods and processes throughout the organization. Such a team can help establish internal expertise by organizing and gaining support for data integrity subject matter experts to be embedded in functional groups and on product lines.

If data governance and integrity are not already built into the quality framework as a whole, the data governance lead and cross-functional team should immediately consider putting in place a data governance policy and plan for the organization. The first draft of such a policy will focus on gaps in the organizational approach, while referencing quality procedures already in place as they relate to existing cGxP data integrity requirements. The long-term objective should be for a holistic data integrity approach to be integral to the overall quality approach.

A data governance policy should call for a data governance plan which lays out, among other things: the goals and objectives of data integrity governance; organization and data ownership; a strategic approach to the organization’s data life cycle; and other important elements such as incident and problem management, access and security management, and a quality risk framework. Other supporting processes to consider are auditing, metrics, inventory classification, validation, and training. An FDA inspector will view a data governance policy and plan as a strong commitment by the organization to sustaining data integrity.

Training and leveraging cGxP knowledge
According to FDA, “Training personnel to detect data integrity issues is consistent with the personnel requirements under §§ 211.25 and 212.10, which state that personnel must have the education, training, and experience, or any combination thereof, to perform their assigned duties.”

Throughout the draft, “Data Integrity and Compliance with cGMP Guidance for Industry,” the FDA refers to “requirements with respect to data integrity” in predicate rules and electronic signature and record-keeping requirements. Mary Lyda, former FDA officer and current vice president, global quality assurance, Accelovance, Inc., said, “It all goes back to training. It is essential that everyone in an organization understands the fundamental principles of data integrity within cGxPs and that the FDA and other regulatory authorities expect data to be reliable and accurate.”

Most organizations will already have a robust training program around these regulations that can be leveraged. Training on the importance of data integrity principles may be a matter of refreshing the work force’s knowledge of concepts such as ALCOA (an acronym for the data integrity elements attributable, legible, contemporaneous, original and accurate), which is referenced in all four guidance documents, and of how to properly report errors, omissions and abnormal results.

“Companies are waking up to the fact that data integrity issues are not isolated to countries like India and China and often have to do with cultural nuances,” said Ajit Simh, adjunct professor in the Regulatory Sciences program at California State University of San Diego. “So management needs to make the effort to understand why individuals may be making data integrity errors. For example, backdating in some cultures is seen as a way of ‘saving face,’ which may be a higher motivator than accurately recording data.”

“General staff training should not be overlooked since it provides the critical foundation to achieve a state of understanding for doing the right things rather than policing and implementing IT barriers to prevent the wrong things,” according to the ISPE GAMP Community of Practice. 

In addition, the data integrity cross-functional team, with concerted help from the quality and training departments, should align training material with the goals and objectives defined in the data integrity organizational plan, including practical examples of how to live up to the organization’s aspirations.

Assessment and risk analysis
According to FDA, “Firms should implement meaningful and effective strategies to manage their data integrity risks based upon their process understanding and knowledge management of technologies and business models.”

Regulated data and records should be identified and documented in preparation for assessment and risk analysis. An excellent way to prepare is a data flow analysis that identifies the role of elements or units in regulated processes (a simple sketch follows below). Completing this as early as possible, such as during the system specification phase, is ideal, but it is more difficult to do retrospectively. In addition, a risk framework commensurate with the company’s sector should be determined.
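
As a minimal sketch, one way to document such a data flow is as a plain inventory of life cycle stages, flagging manual steps as higher risk. The stages, records and systems named below are hypothetical examples, not a prescribed structure:

    # Hypothetical inventory of one regulated data flow; the stage, record
    # and system names are illustrative assumptions only.
    DATA_FLOW = [
        {"stage": "generation", "record": "sample weight", "system": "paper logbook", "format": "paper"},
        {"stage": "generation", "record": "chromatogram", "system": "HPLC (CDS)", "format": "electronic"},
        {"stage": "processing", "record": "assay result", "system": "CDS", "format": "electronic"},
        {"stage": "reporting", "record": "CoA entry", "system": "LIMS", "format": "electronic"},
        {"stage": "retention", "record": "batch record", "system": "EDMS", "format": "electronic"},
    ]

    # Manual (paper/transcription) steps are common weak points for data integrity.
    for step in DATA_FLOW:
        risk = "higher risk" if step["format"] == "paper" else "lower risk"
        print(f'{step["stage"]}: {step["record"]} in {step["system"]} ({risk})')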

Many companies are now going back and completing or updating data integrity reviews by preparing a corporate-level questionnaire template used to evaluate and prioritize systems in areas such as data integrity impact, system inputs for accuracy, implemented controls, system outputs for accuracy, security, and 21 CFR Part 11 compliance, with a focus on audit trails. Questions related to data integrity impact that may appear on such a questionnaire include, but are not limited to:

  • Can misinterpretation of product quality, safety or efficacy result from corruption or loss of records?
  • Can the product be adulterated, or can the release of adulterated or quarantined product result from corruption or loss of records?
  • Can the product be misbranded as a result of corruption or loss of records?
  • Can the ability to recall the product be compromised by the corruption or loss of records?
The uniformity of such a template allows for standardization and consistent implementation of the data integrity governance policy, procedures and plan, as well as better risk evaluation of data for decision making and for prioritizing where resources and money should be spent. One way to turn questionnaire answers into a priority ranking is sketched below.
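
To make that prioritization concrete, here is a minimal sketch of how questionnaire answers might be scored and ranked. The weights, field names and systems are illustrative assumptions, not part of any guidance:

    from dataclasses import dataclass

    @dataclass
    class SystemAssessment:
        """Answers from a (hypothetical) corporate-level data integrity questionnaire."""
        name: str
        impact_answers: list[bool]  # yes/no per impact question: misinterpretation,
                                    # adulteration, misbranding, compromised recall
        part11_compliant: bool      # audit trail and e-record/e-signature controls
        validated_controls: bool    # implemented controls are validated

        def risk_score(self) -> int:
            # Illustrative additive weighting: each "yes" on an impact question
            # weighs heavily; missing controls add to the score.
            score = 3 * sum(self.impact_answers)
            score += 0 if self.part11_compliant else 2
            score += 0 if self.validated_controls else 2
            return score

    systems = [
        SystemAssessment("LIMS", [True, True, False, True], True, True),
        SystemAssessment("Warehouse log", [False, False, False, False], False, True),
        SystemAssessment("Batch record system", [True, True, True, True], False, False),
    ]

    # Rank systems so remediation effort goes to the highest-risk systems first.
    for s in sorted(systems, key=SystemAssessment.risk_score, reverse=True):
        print(f"{s.name}: priority score {s.risk_score()}")

An additive score like this is only one possible choice; the point is that a uniform template makes the ranking repeatable across sites and systems.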

According to TGA, “Processes for the access, generation, control and review of electronically generated data and records, including, but not limited to system validation, configuration and ensuring reviews of source data and audit trails are routinely performed, based on risk.”

The FDA and other regulatory agencies have advocated a risk-based approach for some time now and applying this to data integrity governance is no exception. “Building data integrity into the quality framework, if done earlier in the process, will save time and money and enable better benefit-risk assessments,” said Dr. Nancy Pire-Smerkanich. “Formalized benefit-risk frameworks have value to enhance transparency and support decision making, but are not utilized enough in the industry.”

Selecting a risk framework should be in line with the industry sector and balanced with other quality resource demands. More specifically, said Dr. Pire-Smerkanich, “Manufacturers and analytical laboratories should design and operate systems which provide an acceptable state of control based on the data integrity risk, and which is fully documented with supporting rationale.”

A risk approach can become an important support in balancing decisions influenced by data against the impact of that data on product quality or safety. When carrying out a batch release, for example, a manager may need to decide whether data that determines compliance with critical quality attributes is of greater importance than warehouse cleaning records. Or, a risk framework may show that, for an oral tablet, active substance assay data generally has greater impact on product quality and safety than tablet friability data.

Other considerations when selecting a risk framework include: the nature and complexity of the associated business process, the involvement of automation, manual interference with systems, the subjective nature of outcomes, and other ancillary factors that increase the likelihood of data integrity failures. A worked example of one common scheme follows below.
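
As a worked illustration, a classic impact-by-likelihood-by-detectability scheme captures the tablet example above. All scores and record types here are hypothetical, not a prescribed framework:

    # Record type -> (impact on quality/safety, likelihood of error,
    # detectability gap), each scored 1-5; values are invented for illustration.
    RECORDS = {
        "active substance assay data": (5, 3, 3),
        "tablet friability data":      (3, 3, 2),
        "warehouse cleaning records":  (2, 2, 2),
    }

    def rpn(impact: int, likelihood: int, detectability: int) -> int:
        """Risk priority number: a higher value calls for stricter controls and review."""
        return impact * likelihood * detectability

    for record, scores in sorted(RECORDS.items(), key=lambda kv: rpn(*kv[1]), reverse=True):
        print(f"{record}: RPN {rpn(*scores)}")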

In the end, inspectors and auditors will be assessing whether controls and review procedures achieve their desired outcomes, and risk frameworks play an important role in such procedures. Without doubt, an organization that thinks there is no risk of data integrity failure has likely not made a satisfactory evaluation of inherent risks throughout the data life cycle.

Self-inspection and remediation
According to PIC/S, “The effectiveness of data integrity control measures should be assessed periodically as part of self-inspection (internal audit) or other periodic review processes. This should ensure that controls over the data life cycle are operating as intended.”

MHRA has said, “QA should also review a sample of relevant audit trails, raw data and metadata as part of self-inspection to ensure ongoing compliance with the data governance policy/procedures.”

A continuous self-inspection program is an important element of overall data integrity governance. Aside from the standard data verification checks built into normal controls, organizations should incorporate a wider range of measures: training checks, consistency and risk-based sample checks, and quality metric tracking.

Because people in the “people, process, technology” model are frequently the weakest link, it is important to continually check personnel understanding of data integrity principles and expectations. Along with ongoing training programs with measurable outcomes, this can be done by discussing inspection results in quality review meetings and linking results to the context of product quality, safety and efficacy. Ultimately, organizations must set expectations and verify the understanding of such expectations, before holding people accountable.

Multiple methods to detect errors should also include consistency checks of reported data and outcomes against raw data entries. In cases where data is reviewed via a validated “exception report,” risk-based sampling should be considered, for example of computerized system logs or audit trails. This is important to ensure that information germane to cGxP activity is reported as expected.
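
Here is a minimal sketch of both kinds of checks, assuming hypothetical record layouts and batch identifiers:

    import random

    # Raw instrument values vs. values transcribed into the batch report (invented data).
    raw_data = {"B-1001": 99.2, "B-1002": 98.7, "B-1003": 97.9}
    reported = {"B-1001": 99.2, "B-1002": 99.0, "B-1003": 97.9}

    # 1) Consistency check: flag reported values that diverge from raw data entries.
    mismatches = [batch for batch in raw_data if reported.get(batch) != raw_data[batch]]
    print("Batches needing investigation:", mismatches)  # -> ['B-1002']

    # 2) Risk-based sampling: pull a random subset of audit trail entries for QA review.
    audit_trail = [{"entry": i, "action": "result modified"} for i in range(500)]
    sample_size = max(1, int(0.05 * len(audit_trail)))  # e.g. a 5% sample, adjusted to risk
    for entry in random.sample(audit_trail, sample_size):
        print("Review:", entry)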

Finally, a set of quality metrics related to data integrity governance will provide upper management insight into the extent and frequency of issues the organization may be experiencing.

“Companies really struggle with not just simply detecting data integrity issues, but the complexities of how to detect them,” said Mr. Simh. “Developing good quality metrics to ensure management understands the direction and trends related to data integrity issues is paramount.”

Quality metrics should be designed to review not only data generated from particular systems or locations, but also the quality of data across the entire data life cycle. This introduces concepts such as measurability over a range of dimensions, variability, inconsistency from point to point, and the fitness of data for its intended purpose as it transforms through its life cycle. Utilizing a data historian and analytics program, although difficult to set up, can quantify outcomes and provide a return on investment in the long run.
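
As one small, hypothetical example of such a metric, monthly counts of data-integrity-related deviations can be trended so management sees the direction at a glance (all values below are invented):

    # Monthly counts of data-integrity-related deviations (invented values).
    monthly_deviations = {
        "2018-01": 7, "2018-02": 5, "2018-03": 6,
        "2018-04": 4, "2018-05": 3, "2018-06": 2,
    }

    months = sorted(monthly_deviations)
    half = len(months) // 2
    early = sum(monthly_deviations[m] for m in months[:half]) / half
    late = sum(monthly_deviations[m] for m in months[half:]) / (len(months) - half)
    trend = "improving" if late < early else "flat or worsening"
    print(f"Average deviations: first half {early:.1f}, second half {late:.1f} ({trend})")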

According to FDA, “FDA encourages you to demonstrate that you have effectively remedied your problems by: hiring a third party auditor, determining the scope of the problem, implementing a corrective action plan (globally), and removing at all levels individuals responsible for problems from cGMP positions.”

Much has already been written about how the FDA and other regulatory bodies have stepped up their scrutiny and observations of data integrity issues. A preponderance of recent warning letters and observations has provided a view into regulatory expectations. When remediating data integrity issues, the primary consideration should, of course, be resolving the immediate event and evaluating the risk associated with the problem. A plan of action should include a comprehensive investigation into the degree of inaccuracies in data records and reporting. Actions carried out may include, for example: defining the scope of the incident; interviewing relevant employees; assessing deficiencies; identifying specific products implicated; reporting comprehensively on all parts of the operation affected; undertaking root cause analysis; and conducting further risk assessments on the observed failures. The plan of action should also include corrective and preventive actions to address data integrity vulnerabilities, as well as timeframes for implementing interim and long-term remediation measures.

Over and above a management corrective strategy, regulatory agencies will be looking for organizations to make full disclosure of any and all identified data integrity issues. They will also examine if the scope of corrective action warrants a global perspective to determine if the issue is systemic.

Conclusion
Assimilating regulatory expectations for data integrity throughout the data life cycle makes plain why guaranteeing complete, consistent and accurate records is a challenge for any organization. While a detailed and comprehensive approach to data integrity controls goes beyond the concepts presented in this article, it is imperative that data integrity ultimately be built into the quality processes and routines of the organization.

Ongoing awareness through dissemination of data integrity-related information and continuous training on the latest concepts will strengthen employee understanding of the organization’s expectations. Placing data integrity at the forefront of a quality program is a necessary step toward creating a culture where trustworthy and reliable data is the result of the second-nature behaviors and habits of all employees.


Robert Marohn consults with organizations on all aspects of developing and implementing comprehensive quality information technology policy/procedures, frameworks, computer system validation, and data integrity plans and programs. He can be reached at rmarohn@clinlogic.com.
