A record which is complete, readable, contemporaneous, original, accurate, and attributable must be maintained to ensure the integrity of critical data generated in production throughout the entire data life cycle.
Maintaining data integrity is especially crucial in sectors such as the medical device industry, where the data life cycle can span a patient's lifetime, up to 90 years.
The automated systems industry has incorporated tools into its systems to help the pharmaceutical and medical device industries meet regulatory requirements and prove compliance during a system validation process.
The industry requirement to retain all GxP-relevant data (data with an impact on quality) throughout the product life cycle, in electronic and, especially, in paper form, can be problematic.
In some cases, where data is stored on paper for long periods of time, it is common to keep that data in specially prepared rooms to prevent damage. However, this approach hinders the digital transformation of data and makes storage expensive and inefficient.
In the case of electronic data storage, it is important to consider issues related to storage over long periods, such as choosing the medium on which the data will be recorded.
Ideally, data should be saved in a cloud infrastructure that provides a redundancy structure and backup to ensure data security and availability.
In some cases, it is common to find automated systems in factories that were implemented over 20 years ago.
These systems may have been manufactured by vendors that no longer exist and may use different industrial communication standards and protocols for data storage.
In addition, such data stores are often protected by passwords created by professionals who no longer work for the company, which becomes an obstacle when trying to access, view, and certify the integrity of the data.
Often, these systems were developed without sufficient attention to the flow of generated data, to planning its origin and destination, or to how the data would be accessed in the future.
As a result, it is common to find critical data stored in different ways, sometimes in two different databases. This can lead to problems identifying which data is correct, which data was saved in real time, and which is just a copy.
The lack of a single data repository can be problematic.
If someone gains improper access to a database and is allowed to view data or files, such as a text file or a CSV file, they can likely compromise the data.
For example, if someone manages to gain improper access to a password-protected Excel file, the password can be cracked in a short amount of time using tools commonly available on the Internet. When data must be entered into the system manually, the more data entered this way, the greater the possibility of human error.
This, in turn, makes the process of validating systems and all related tasks more time-consuming, including checking for associated risks and implementing manual mitigating measures to ensure data integrity. Discover how FIVE Validation can help your company on this journey to achieve accurate data integrity: CLICK HERE
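One stronger file-level control than a spreadsheet password is an independent integrity check: the system records a cryptographic checksum of each export and compares it before the data is used again. The sketch below is a minimal illustration in Python, not a description of any specific product; the file name is hypothetical.

```python
import hashlib

def file_sha256(path: str) -> str:
    """Compute the SHA-256 digest of a file in streaming fashion."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Record the digest when the export is created, store it separately
# from the file, and compare it later: any modification to the file,
# however small, produces a different digest.
# baseline = file_sha256("batch_42_export.csv")   # hypothetical file
# assert file_sha256("batch_42_export.csv") == baseline
```

Unlike a password, the checksum does not prevent access, but it makes any alteration of the record detectable, which is what integrity verification requires.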
Storing data in a supervisory system or in software that integrates data from different types of industrial automation hardware, without proper data protection at the instrumentation level, can result in the loss of that data in certain events, such as a network failure.
If data loss occurs at the instrumentation level while production is ongoing, the entire batch will be ruined, requiring a series of mitigating measures, such as opening and investigating deviations, before the batch can be released to the market.
During the design process, it is essential to establish a redundancy structure to avoid such situations.
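One common way to provide such redundancy at the data-acquisition layer is store-and-forward: readings that cannot be transmitted during a network failure are buffered locally and re-sent, in order, once the link recovers. The sketch below is a minimal illustration; the `transmit` callable is a hypothetical stand-in for the real network layer.

```python
from collections import deque

class BufferedSender:
    """Store-and-forward sketch: readings that fail to transmit are
    buffered locally and re-sent, in order, when the link recovers."""

    def __init__(self, transmit):
        self.transmit = transmit   # callable; raises ConnectionError on failure
        self.buffer = deque()

    def send(self, reading):
        self.buffer.append(reading)
        self.flush()

    def flush(self) -> bool:
        while self.buffer:
            try:
                self.transmit(self.buffer[0])
            except ConnectionError:
                return False       # link down: keep the data for later
            self.buffer.popleft()  # remove only after a confirmed send
        return True
```

The key design point is that a reading is removed from the buffer only after the send is confirmed, so a network outage never discards production data.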
A crucial aspect to always consider is reporting, which must have access to all necessary data.
For example, to release a batch of medicines, Quality Assurance must base the release on production data, analyses, and events related to the batch.
Reports should be designed for simplicity, such as the ability to report only exceptions, with quick and simplified access.
Therefore, it is important that reports are flexible and adaptable. The best way to meet these requirements is by developing and implementing a new system from scratch.
As regulations become more stringent and cover more areas, it is essential to follow a set of standards to design an optimal automated system. The first step in this process is to standardize communication protocols, which consist of a set of rules that allow information to be transmitted between two or more entities.
When deciding which protocol to use, it is necessary to ensure that data is encrypted and secure. Data flow must be planned efficiently, including controlling access to the repository and recording who accessed it and from where.
In addition, the ideal system should cover all phases of production and be fully integrated. This allows different types of devices to be connected to a single platform and enables a single viewing tool with appropriate audit trail and access control capabilities.
For example, a professional responsible for the company's packaging area should only have access to information related to their area of work, without needing access to data on the raw materials used in production.
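An audit trail of this kind can also be made tamper-evident. One well-known approach, sketched below with illustrative class and field names (not any specific product's API), is to chain each log entry to the digest of the previous one, so that a retroactive edit anywhere breaks the chain.

```python
import hashlib
import json
import time

class AuditTrail:
    """Hash-chained audit log sketch: each entry stores the digest of
    the previous entry, so any retroactive edit breaks verification."""

    def __init__(self):
        self.entries = []

    def record(self, user: str, origin: str, action: str) -> None:
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        entry = {"user": user, "origin": origin, "action": action,
                 "ts": time.time(), "prev": prev}
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["hash"] = hashlib.sha256(payload).hexdigest()
        self.entries.append(entry)

    def verify(self) -> bool:
        prev = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            payload = json.dumps(body, sort_keys=True).encode()
            if e["prev"] != prev:
                return False
            if hashlib.sha256(payload).hexdigest() != e["hash"]:
                return False
            prev = e["hash"]
        return True
```

A production audit trail would also persist the entries and sign them, but the chaining principle is the same: who did what, from where, and when, in a record that cannot be silently rewritten.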
In case of an error at any stage of the process, the ideal system should allow one to go back to the exact point where the error occurred and identify it. If the problem occurs in distribution, one needs to know how the product was stored, where, and at what temperature.
If it is a batch-related problem, the system must be able to identify what went wrong and when, including whether the problem was caused by the raw material used or by the inadequate training of an individual.
Ideally, all data generated during production should be entered and accessed electronically over a network, minimizing the need for manual entry.
However, this process can become challenging due to the large number of devices installed in a plant, which in some cases can number in the hundreds, requiring significant investment.
In some situations, the systems in use are old and lack ports for connecting devices to a network, or even a wireless option.
Furthermore, there are cases where vendors do not allow clients to access their devices for data retrieval. Some vendors provide no connection ports on their devices, and others allow connections but use different communication protocols, making it very challenging to capture data.
To deal with the problems mentioned above, an initial approach would be to divide the project into smaller parts and conduct a pilot in a specific area of the manufacturing plant that is not crucial to the production process. This would allow the system to evolve gradually, with the possibility of adjustments and corrections along the way.
Scalable software that can bridge different protocols and hardware needs to be installed, along with a database that does not necessarily need to reside on a single server.
One will probably need to implement redundancy, automatic backup, and constant system monitoring to prevent data loss in case of failure. Sometimes, additional hardware must be installed, as the PLC (Programmable Logic Controller) or controller available in the plant may be too old to secure the data generated in production.
Although this may seem to require large investments, it is arguably less damaging than the risks of a non-conformity in production.
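The constant monitoring mentioned above can start as something as simple as a heartbeat check: each device or PLC reports periodically, and any source that falls silent for too long is flagged before an entire batch is affected. The sketch below is illustrative; the source names and timeout are assumptions, not values from any real plant.

```python
import time

class HeartbeatMonitor:
    """Heartbeat sketch: each data source reports periodically; sources
    silent for longer than `timeout_s` seconds are flagged as stale."""

    def __init__(self, timeout_s: float):
        self.timeout_s = timeout_s
        self.last_seen = {}

    def beat(self, source: str, now: float = None) -> None:
        """Record that `source` reported at time `now` (default: now)."""
        self.last_seen[source] = time.time() if now is None else now

    def stale_sources(self, now: float = None):
        """Return the sources whose last report is older than the timeout."""
        now = time.time() if now is None else now
        return sorted(s for s, t in self.last_seen.items()
                      if now - t > self.timeout_s)
```

In practice the stale list would trigger an alarm and an investigation, but even this small mechanism turns silent data loss into a visible event.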
The importance of data security should never be underestimated in a project.
From day one of a system implementation, security should be a priority and considered a part of risk analysis.
Leaving the security strategy to a later stage of development can force a return to the start of the project and a rethink of the entire system engineering.
It is important to protect the database, especially the parts of the system that cannot run an antivirus program, with measures such as: properly configuring the login directory; removing USB ports and other interfaces that allow connection to the database; and keeping public and private connections to the system within the IT and manufacturing plant infrastructure.
All these actions are part of a larger security strategy and should be considered at the beginning of any project.
FIVE Validation is a company specialized in computerized system validation, especially in industrial automation.
It has over 15 years of experience and more than 1,000 validation projects executed. Its team of 40+ employees is at your disposal, ready to apply all their knowledge and expertise to ensure that your automation project is executed successfully.
Would you like to schedule a meeting with one of our specialists? If so, CLICK HERE and check which date and time best fits your schedule!
FIVE not only provides services in the validation area but has also created a highly efficient digital validation system that can make processes 6x faster.
GO!FIVE® system comes with pre-prepared validation libraries that make compliance simpler.
Among the many libraries available, it offers those that help with the validation of industrial automation systems, such as cartoning and/or filling machines; Environmental Monitoring Systems (EMS); heating, ventilation, and air conditioning systems (HVAC or BMS); OT infrastructure qualification; filter integrity testing; dataloggers; laminar flows; purified water generators; reactors; Purified Water (PW) and Water for Injection (WFI) distribution; autoclaves; FDA 21 CFR Part 11 compliance; and others.
Two other advantages of using the GO!FIVE® tool to validate automation systems can be highlighted:
For more information, CLICK HERE!
SILVIA MARTINS is an electrical engineer with over 20 years of experience in systems validation. She was trained in England in GAMP 5 and FDA 21 CFR Part 11, in SAP validation in Germany, and in Data Integrity and Data Governance in Denmark. She coordinated the group that developed the first Guide for Computerized Systems Validation together with ANVISA, as well as the Data Integrity Manual and the Cloud Qualification of Suppliers Manual (both at Sindusfarma). She has also provided training courses for VISA and ANVISA inspectors in Brazil. Currently, she is the CEO and co-founder of FIVE Validation.