Data integrity
A partnership of business processes and technology

By Bill Lydon, InTech

During the 2017 ISA Food and Pharmaceuticals Industries Division Symposium, I attended an informative presentation, "Data Integrity: A Partnership of Business Processes and Technology," by Marcus Massingham, I.Eng. This article describes his presentation.

Massingham is the head of quality, lab, and manufacturing technology at GlaxoSmithKline's Zebulon, N.C., pharmaceutical facility. He has more than 25 years of experience in the design, implementation, commissioning, and validation of automation and information technology (IT) systems across a range of industries. Massingham is an electrical engineer and registered as an Incorporated Engineer in the U.K.

Data integrity is fundamental, because patients and consumers rely upon the pharmaceutical industry to research, innovate, manufacture, and distribute medicines and products at very high standards. All of the industry's supply chains are underpinned by an ever-growing volume of data. Regulatory focus on, and expectations for, the data life cycle are increasing within the broader area of managing and maintaining data integrity. Automation and IT systems can support a number of the technical and business processes in this area.

Importance of data

Organizations create huge amounts of data, typically hundreds and hundreds of gigabytes a day. This includes enterprise systems and factory systems and may include paper data as well as electronic data. A lot of people think of data integrity as purely electronic, but it is important to remember that it includes the paper as well. Manufacturers have the obligation to maintain the integrity of the data all the way through the supply chain to the patient. For example, two years after delivery to the customer there could be a regulatory review. The manufacturer needs to prove that the medicine produced was correct using data genealogy and that the controls delivered a product that met all specifications and quality requirements.

Data provides direct evidence that produced products are safe and effective, telling the story of the medicines long after they have been shipped.

Data integrity

Data integrity is about trust in the data through all the process steps.

Massingham discussed the acronym ALCOA, which has been around since the 1990s and is used as a framework for ensuring data integrity. It is key to good documentation practice (GDP). ALCOA relates to data, whether paper or electronic, and is defined by the U.S. Food and Drug Administration guidance as "attributable, legible, contemporaneous, original, and accurate." These simple principles should be part of the data life cycle, GDP, and data integrity initiatives. He provided a timely refresh on the fundamentals.

Attributable

All data generated or collected must be attributable to the person generating the data. This includes who performed an action and when it was done. This information can be recorded manually by initialing and dating a paper record or by an audit trail in an electronic system. For example:

  • During a validation exercise, test results should be initialed and dated by the person executing the test.
  • Adjustment of a set point on a process or monitoring system should be made by an authorized user, and the details of the change logged in an audit trail.
  • A correction on a lab record should be initialed and dated to show the time and the person who made the adjustment.

Note: It is important to maintain a signature log to identify the signatures, initials, and aliases of people completing paper records.
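
To make this concrete, here is a minimal Python sketch of how an electronic system might capture an attributable audit-trail entry for a set-point change. The record structure, field names, and user ID are illustrative assumptions, not details from the presentation.

    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass(frozen=True)
    class AuditEntry:
        # One attributable record: who performed an action, and when.
        user_id: str     # authenticated user making the change
        action: str      # e.g., "set-point change"
        old_value: float
        new_value: float
        timestamp: str   # UTC stamp captured at the moment of the action

    def log_setpoint_change(trail, user_id, old, new):
        # Append an attributable, time-stamped entry to the audit trail.
        trail.append(AuditEntry(user_id, "set-point change", old, new,
                                datetime.now(timezone.utc).isoformat()))

    # Hypothetical example: an authorized user adjusts a temperature set point.
    trail = []
    log_setpoint_change(trail, "jsmith", 72.0, 68.5)
    print(trail[0])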

Legible

All data recorded must be legible (readable) and permanent. Ensuring records are readable and permanent assists with accessibility throughout the data life cycle. This includes storing human-readable metadata that may be recorded to support an electronic record. For example:

  • GDP will always promote the use of indelible ink when completing records.
  • When making corrections to a record, use a single line to strike out the old record. This ensures the record is still legible.
  • Control your paper records and forms and format them with ample room to record the information.

Contemporaneous

Contemporaneous means to record the result, measurement, or data at the time the work is performed. Date and time stamps should flow in order of execution for the data to be credible. Data should never be backdated. For example:

  • If executing a validation protocol, perform tests and record their results as they happen on the approved protocol. Consider calibrating an instrument 35 feet up in the factory: the record should be signed at that particular moment, where the work is performed. The moment the information is captured on a sticky note, paper notebook, or other place, a primary record has been created, and that primary record is what regulators are interested in. Data that is logged, or testing that is performed electronically, should have a date/time stamp attached to the record.
  • Make sure electronic systems that log data have synchronized system clocks.
  • Consider using a master clock system that synchronizes to the IT network, so wall clocks within labs and processing areas are synchronized.

As an industry, these are challenges we need to spend more time thinking about: how we manage raw data and metadata, and how we efficiently capture it into computer systems.
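
To make the time-stamping and clock points above concrete, here is a minimal Python sketch that attaches a UTC date/time stamp at the moment a result is captured and checks that stamps flow in order of execution. The function and field names are illustrative assumptions, not from the presentation.

    from datetime import datetime, timezone

    def record_result(log, value):
        # Stamp the result at the moment the work is performed.
        log.append({"value": value,
                    "recorded_at": datetime.now(timezone.utc)})

    def timestamps_in_order(log):
        # Credible contemporaneous data flows in order of execution.
        times = [entry["recorded_at"] for entry in log]
        return all(a <= b for a, b in zip(times, times[1:]))

    log = []
    record_result(log, 21.4)
    record_result(log, 21.7)
    print(timestamps_in_order(log))  # False would suggest backdating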

Original

Original data, sometimes referred to as source data or primary data, is the medium in which the data point is recorded for the first time. This could be a database, an approved protocol or form, or a dedicated notebook. It is important to understand where your original data will be generated, so that its content and meaning are preserved. For example:

  • Ensure validation test results are recorded on the approved protocol. Recording results in a notebook for transcription later can introduce errors.
  • If your original data is handwritten and needs to be stored electronically, make sure a "true copy" is generated, the copy is verified for completeness, and then it is migrated into the electronic system.
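
One way to support the "true copy" step is to fingerprint the verified scan with a checksum and confirm the migrated file is identical. The sketch below assumes two hypothetical file paths; the verification-for-completeness step itself remains a human review.

    import hashlib
    from pathlib import Path

    def sha256_of(path):
        # Fingerprint the file so later copies can be compared to it.
        return hashlib.sha256(Path(path).read_bytes()).hexdigest()

    def migration_intact(verified_scan, migrated_copy):
        # True when the migrated copy is bit-for-bit identical to the
        # verified true copy.
        return sha256_of(verified_scan) == sha256_of(migrated_copy)

    # Hypothetical paths for illustration:
    # print(migration_intact("scan_verified.pdf", "archive/scan.pdf"))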

Accurate

For data and records to be accurate, they should be error free, complete, truthful, and reflective of the observation. Editing should not be performed without documenting and annotating the amendments. For example:

  • Use a witness check for critical record collection to confirm data accuracy.
  • Consider how to capture data electronically and verify its accuracy. Build accuracy checks into the design of the electronic system.
  • Place controls and verification on manual data entry; for example, temperature results can only be entered within a predefined range of 0-100°C.
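
A minimal sketch of the last point: a manual temperature entry is rejected at entry time unless it falls within the predefined range. The values and messages are illustrative.

    def validate_temperature(raw, low=0.0, high=100.0):
        # Reject manual entries outside the predefined 0-100 degC range.
        value = float(raw)  # raises ValueError for non-numeric input
        if not (low <= value <= high):
            raise ValueError(f"{value} degC is outside {low}-{high} degC")
        return value

    print(validate_temperature("37.2"))  # accepted
    try:
        validate_temperature("150")      # rejected at entry time
    except ValueError as err:
        print(err)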

Common issues

Here are a couple of common examples where ALCOA is not followed, resulting in poor documentation and data integrity issues:

  • Frequently, data is quickly jotted down on a sticky note or on a notepad during testing. This data is then transferred to the approved protocol or form. Doing this, whether for lab results or a validation exercise, means the data is no longer original or contemporaneous, and it is potentially inaccurate.
  • When making a correction to information, it is common to see the old data scribbled out, overwritten, or removed using correction fluid, sometimes without the initials and date of the person who made the correction. This means the data is no longer legible or original, and the correction is not attributable.

The most efficient approach is to catch deviations and problems early, mitigate the root causes, and eliminate the problems permanently. This is easy to state, but sometimes quite challenging to accomplish, particularly with the complexity around some products.

Information protection

Information protection is a growing concern with more attacks on systems, requiring many good controls to be in place. It only takes one email and one click to compromise the system. If the system is compromised, will this also compromise the data? If it does compromise the data, will people know?

All these programs have to operate in harmony. Ideally, you prevent data integrity from being compromised. Prevention includes training, controls, computer systems that enforce the workflow, and other measures. The goal of training is for every employee to understand the meaning of data integrity and to take personal accountability for the data being recorded in any given action. Ultimately, if you cannot prevent attacks, you need mechanisms in place to identify issues early, coupled with action plans.


Approach to data integrity


Figure: Data integrity program hierarchy

In Massingham's company, there is very senior management sponsorship from the top of the company throughout the organization. There is a formal quality council that consists of the quality group heads, steering team, and distinct prevention and detection groups, as illustrated in the functional organization chart.

Massingham emphasized that users should make sure data integrity is defined in purchase specifications for automation, controls, and equipment. He noted, "It is amazing how many installed systems today have challenges to meet the latest regulations in place." Data and documentation have two distinct groups: one that deals with electronic data and one with paper documentation. Electronic data includes IT, lab systems, and manufacturing systems, and the scale is enormous. In his company, it is within the range of 50,000 systems.


Figure: Data integrity hierarchy

Data integrity applied to systems

System review category themes to consider:

  • Access/security: Do you have the right access and security controls in place for your manufacturing systems? Are users assigned the correct roles, and are system administrators correct?
  • Audit trail review: Catch errors and mistakes.
  • Repeat testing: Make sure to capture all the raw product testing data and that audit trail information is stored in a system for later review.
  • PC clock/time stamp: Make sure the time of all devices is synchronized to the master clock for accurate time stamps.
  • Event/alarm: Much can be learned from alarms, so consistency and good alarm management practices are important. ISA standards, most notably ANSI/ISA-18.2-2016, Management of Alarm Systems for the Process Industries, can really help end users achieve a strong program.
  • Backups/archives: Backups and archives are a critical part of all automation systems and need to be checked regularly to ensure they are current.
  • Change control: Make sure changes are documented.

For example, electronic systems are designed to perform processes consistently, and data can be checked for accuracy and completeness from the time of initial entry through the retention period.

Questions to consider

  • At what point are pass/fail results known by an analyst? Can they stop the process before anything is recorded?
  • How is the data backed up and archived? Does it include all raw data or only a printout or PDF?
  • How is data reviewed (on paper only, or by going back to the instrument)?
  • Are there any system folders other than the one the instrument is routinely using?
  • Can data be erased from the system? Can data be renamed? Can times or dates on the computer be changed?
  • Look at configuration settings: Is overwriting allowed?
  • How is access to the system controlled? What are the access levels, and who has them?
  • How frequently are access levels reviewed?
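
As a sketch of the last two questions, a periodic review can compare granted roles against an approved list and flag anything unexpected. All names and roles below are hypothetical.

    # Hypothetical periodic access review: compare granted roles against
    # an approved list and flag unexpected administrators.
    granted = {"jsmith": "analyst", "rlee": "admin", "temp01": "admin"}
    approved_admins = {"rlee"}

    for user, role in granted.items():
        if role == "admin" and user not in approved_admins:
            print(f"Flag for review: {user} holds admin access without approval")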

Data mapping

Data mapping is important for understanding the path information takes, because raw data exists in many formats. For example, a chromatogram could be printed as a PDF and used within the master batch record and lab records. That printout is static and does not indicate the history or record of what occurred. That information needs to be fed back electronically to the laboratory information management system (LIMS) and the chromatography system, and the system needs to demonstrate that all the metadata exists to illustrate how the testing was performed from start through release.
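
A data map can be as simple as a record, per piece of raw data, of where it originates, what metadata must travel with it, and which systems consume it. The entry below is a hypothetical illustration for the chromatography example.

    # Hypothetical data-map entry for one chromatography test result.
    chromatogram_map = {
        "raw_data": "chromatography data system acquisition file",
        "metadata": ["instrument ID", "method version", "analyst",
                     "acquisition date/time"],
        "static_outputs": ["PDF printout in the master batch record"],
        "electronic_consumers": ["LIMS", "chromatography system archive"],
    }

    for field, value in chromatogram_map.items():
        print(f"{field}: {value}")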

Paper documents can be scanned and stored electronically so they can be easily retrieved later as records for regulators to prove product was made within specification.


Standardization and simplification

Standardizing and understanding data flows and workflows before implementing a manufacturing execution system gives a much higher probability of a successful implementation.


Audit trails

Audit trail information can reside on many disparate systems, and it can take time to pull together an analysis. You need the right subject-matter experts, which can include the system administrator, system owner, process owner, and technical experts, to do a complete review. Then look for any kind of pattern that shows some type of abnormality: a trigger point that is a clue to dig deeper. Audit trails encompass a wide range of systems, including control systems, execution systems, operational systems, analytical systems, and instrumentation.
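
To illustrate the trigger-point idea, the sketch below scans merged audit-trail entries for activity outside normal working hours. The records, systems, and threshold hours are hypothetical.

    from datetime import datetime

    # Hypothetical merged audit-trail entries from disparate systems.
    entries = [
        {"system": "LIMS", "user": "jsmith", "action": "result entered",
         "at": datetime(2017, 6, 5, 14, 2)},
        {"system": "LIMS", "user": "jsmith", "action": "result modified",
         "at": datetime(2017, 6, 5, 3, 11)},  # 3 a.m. edit: a clue to dig deeper
    ]

    def after_hours(entries, start=6, end=20):
        # Surface entries outside normal working hours as trigger points.
        return [e for e in entries if not (start <= e["at"].hour < end)]

    for e in after_hours(entries):
        print(f"Review: {e['action']} by {e['user']} on {e['system']} at {e['at']}")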


Data is fundamental

Data provides direct evidence that our products are safe and effective. Data tells the story of our medicines long after they have been shipped. Proper practices, systems, procedures, and training are important, because regulatory expectations and focus on data have increased. Business process simplification and standardization is an important step. Technology complexity is increasing, making investment in training and people important to ensure the necessary skills exist to meet the regulators' requirements. Automation and IT systems, configured, operated, and maintained correctly, can address many data integrity challenges. Automated audit trail reports as part of the standard application software are a real opportunity to help industry.

Reader Feedback


We want to hear from you! Please send us your comments and questions about this topic to InTechmagazine@isa.org.




About the Author


Bill Lydon is an InTech contributing editor with more than 25 years of industry experience. He regularly provides news reports, observations, and insights here and on Automation.com.