What Bugs You About Validation?

Hi all,

We are currently writing an article on the aspects of validation you find most frustrating.

If you contribute to the list, you will get a reference in the article when it goes live.

I’ll get the ball rolling!

1. Duration of Validation Projects
Why does it always take so long to complete a validation project? Is it bad management, a lack of understanding of what needs to be validated, or are validation projects typical of most projects in that they always run over time and over budget?

2. The Validation Review Process
How many reviewers does it take to review a validation protocol? Is it really necessary to have numerous managerial signatures on each validation document?

3. Poorly Written Protocols
One of the pet hates here at AAV is when we come across hugely complicated protocols that are difficult to understand. Why, in general, do people write protocols that are a nightmare to execute?

4. Senior Validation People
Why do senior people always believe their way is the best, and why are they unable to listen to other rational ways of performing validation?

5. No Time for Dry Runs
Another pet hate is when you hear "We don’t have time for a dry run, let’s go straight into execution." This really is a bad move: you end up with numerous deviations and re-executions, slowing down the whole process.

6. Risk Assessments - What the hell are they?
How many projects are actually risk assessed correctly before proceeding? The norm in this industry is that people do not understand what needs to be validated, so they end up validating everything!

So now is your chance to add your pet hates!

Please respond to this post with at least one thing that really annoys you about validation, and how you think we can address these problems.

Best Regards

What bugs me about validation is that it is sometimes treated as an after-thought. I recently held a go-live event on a major project that introduced changes to a validated environment. My team got notification of the go-live activities 3 days before the cut-over activities were to start. A quick review of the documents and the project activities raised major concerns causing the go-live events to be pushed to a date 4 weeks later.

I spend a lot of time as an “evangelist” selling the benefits of including our team as early as possible in any project. We are not here to make it harder, but to figure out what people want to do and find a way that is compliant with regulations and our own business practices.

As for validation activities themselves, what bugs me most is when people go overboard testing every click and entry during a validation. The team I currently work with has just gone through a major validation effort of 3 enterprise applications: ERP, PLM, and Inspections. These were all configurable off-the-shelf applications. Aside from the functionality, behavior and data resulting from the configurations required to meet intended use, the team tested - as in instruction, expected result, actual result, initials/date - standard application functionality. That is, there were steps to confirm standard logins, screen navigations, data entries. Granted, if these are considered important to intended use then some level of testing is warranted, but limits, positive/negative, field length?

I guess my point is that a good rationale for including or excluding something from the validation tests goes a long way toward focusing the validation on the things that are really important.
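That rationale-driven scoping could be captured in something as simple as a risk filter: every candidate test carries an explicit risk rating and a documented reason, and only items whose risk to intended use crosses a threshold get full scripted testing. This is a minimal sketch, not any particular methodology; the function names, risk levels, and rationales are hypothetical.

```python
# Risk-based test scoping sketch: candidates below the risk threshold
# are excluded from scripted testing, but the rationale is kept so the
# exclusion decision stays auditable.

RISK_LEVELS = {"low": 1, "medium": 2, "high": 3}

candidates = [
    # (function, risk to intended use, rationale)
    ("batch record approval", "high", "directly affects product release"),
    ("inspection result entry", "high", "GMP data capture"),
    ("screen navigation", "low", "standard COTS behaviour, vendor-tested"),
    ("login field length", "low", "standard COTS behaviour, vendor-tested"),
]

def scripted_tests(items, threshold="medium"):
    """Return only the functions whose risk meets the threshold,
    paired with the rationale for including them."""
    cut = RISK_LEVELS[threshold]
    return [(fn, why) for fn, risk, why in items if RISK_LEVELS[risk] >= cut]

for fn, why in scripted_tests(candidates):
    print(f"script: {fn} ({why})")
```

With the threshold at "medium", only the two high-risk functions get scripted instruction/expected-result/actual-result testing; the standard COTS behaviour is documented as excluded rather than re-tested click by click.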

I hope this helps. ** Mike **

[quote=mromeu]What bugs me about validation is that it is sometimes treated as an after-thought. I recently held a go-live event on a major project that introduced changes to a validated environment. My team got notification of the go-live activities 3 days before the cut-over activities were to start. A quick review of the documents and the project activities raised major concerns causing the go-live events to be pushed to a date 4 weeks later.

I spend a lot of time as an “evangelist” selling the benefits of including our team as early as possible in any project. We are not here to make it harder, but to figure out what people want to do and find a way that is compliant with regulations and our own business practices.

As for validation activities themselves, what bugs me most is when people go overboard testing every click and entry during a validation. The team I currently work with has just gone through a major validation effort of 3 enterprise applications: ERP, PLM, and Inspections. These were all configurable off-the-shelf applications. Aside from the functionality, behavior and data resulting from the configurations required to meet intended use, the team tested - as in instruction, expected result, actual result, initials/date - standard application functionality. That is, there were steps to confirm standard logins, screen navigations, data entries. Granted, if these are considered important to intended use then some level of testing is warranted, but limits, positive/negative, field length?

I guess my point is that a good rationale for including or excluding something from the validation tests goes a long way toward focusing the validation on the things that are really important.

I hope this helps. ** Mike **[/quote]

Great points, Mike. I totally agree about getting involved early on - why is validation always seen as an afterthought?!

The in-depth testing of some software applications also amazes me; it’s a sure sign of rushing into testing without having a robust risk assessment performed.

Nice post.

What bugs me is the traditional approach of the senior folks who are supposed to lead the validation team. Most of the equipment we use these days is custom built, and there will be some extra steps to validate on such new instruments. These instruments may need an extended validation, or some parameters may be changed during validation to reconfirm the suitability of the device for the long-term verification process. The old guard are stubborn and often tend to resist these changes. As more subject matter experts join validation groups in an organization, team leaders must be flexible and geared up for such changes. No instrument is identical to any other in the company, or to our earlier experience. A fixed mindset is always bad for the validation process.

Nice point, Mr Druga. Yes, it seems to happen everywhere: “We always do it this way - why do we need to change?”

Comments from David Stokes from LinkedIn
http://www.linkedin.com/groupItem?view=&gid=51823&type=member&item=48884982&commentID=35441985&report.success=8ULbKyXO6NDvmoK7o030UNOYGZKrvdhBhypZ_w8EpQrrQI-BBjkmxwkEOwBjLE28YyDIxcyEO7_TA_giuRN#commentID_35441985

What I find most frustrating is that many Regulated Companies still don’t understand how important validation planning is for cost effective, risk-based validation. Many have seen ‘risk-based’ as being synonymous with ‘cheap’.

I would be the first to argue that risk-based validation has some good opportunities for cost savings, but only if companies use the right resourcing model, i.e. use their own validation SMEs (or experienced consultants) to:

  • Apply a risk-based approach to the validation planning, define the validation activities, roles and responsibilities and validation deliverables
  • Have input to key documents such as the Test Strategy
  • Prepare the Validation Report
  • Provide guidance and support

Lower cost resources can then be used for the day-to-day activities such as:

  • Verifying requirements
  • Reviewing design documents for consistency and completeness
  • Maintaining traceability
  • Reviewing test scripts

This is a real, cost effective, risk-based approach.

However, I see too many Regulated Companies trying to use the wrong resource for the wrong activities (i.e. I need one FTE for the duration of the project) - either roles for which lower cost resources are inexperienced and under qualified, or roles for which resources are over experienced and over qualified (and therefore too expensive).

Comments from John Wall LinkedIn
http://www.linkedin.com/groupAnswers?viewQuestionAndAnswers=&discussionID=48884982&gid=51823&commentID=35442417&trk=view_disc

  1. Applying GMP/GLP documentation standards to CSV, so that a test script ends up looking like a lab notebook. Not that there is anything wrong with that approach; it is just too much, and not required.

  2. Being forced to provide enough evidence that someone who didn’t watch you execute the script can see that somebody did it is also overkill. All of this defeats the purpose of testing.

  3. The real testing and the real validation happen when the test scripts are being created - running them over and over until all the descriptions are correct.

One of the issues that I have had in the past with validation programs is that sometimes they are not organized very well. The validation of a manufacturing process should tell a story. It should have a beginning and an end.

The validation records should flow with the process. The first station in a production cell should be the first to be validated. Within that station the flow should be IQ, OQ then PQ. Then each subsequent station in the cell should be validated in the order the material flows with the same sequence: IQ, OQ & PQ.

I have audited processes where the “story” of the validation process does not flow and the general logic is hard to follow. Validation records were not organized in a way to easily show the flow of the validation “story” within the production cell. The output of one station is the input to the next. This should be a logical validation flow in the records/reports.
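The ordering Doug describes is mechanical enough to sketch: list the stations in material-flow order, and within each station run IQ, then OQ, then PQ. This is just an illustrative sketch of that record sequence; the station names are hypothetical.

```python
# Sketch of the validation "story" ordering for a production cell:
# records follow material flow station by station, and within each
# station the IQ -> OQ -> PQ sequence.

PHASES = ("IQ", "OQ", "PQ")

def validation_sequence(stations):
    """Expected order of validation records for a cell whose
    stations are listed in material-flow order."""
    return [f"{station} {phase}" for station in stations for phase in PHASES]

cell = ["Fill", "Cap", "Label"]  # material flows Fill -> Cap -> Label
for record in validation_sequence(cell):
    print(record)
```

An auditor checking the records against this expected sequence would immediately see where the "story" breaks - for example, a PQ filed before its station's OQ, or a downstream station validated before its upstream feed.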

Another issue that I find some people have difficulty with is validating that the equipment can process acceptable variation within the material.

Comments from Juan Oscar Perez

One of the things that really bugs me is that in most places it is still a brick-and-mortar process (a.k.a. a paper-based process). Even though there are very good solutions for managing validations that can measure and control the approval and execution of validation tests, companies are still using paper.

[quote=Doug Phelps]One of the issues that I have had in the past with validation programs is that sometimes they are not organized very well. The validation of a manufacturing process should tell a story. It should have a beginning and an end.

The validation records should flow with the process. The first station in a production cell should be the first to be validated. Within that station the flow should be IQ, OQ then PQ. Then each subsequent station in the cell should be validated in the order the material flows with the same sequence: IQ, OQ & PQ.

I have audited processes where the “story” of the validation process does not flow and the general logic is hard to follow. Validation records were not organized in a way to easily show the flow of the validation “story” within the production cell. The output of one station is the input to the next. This should be a logical validation flow in the records/reports.

Another issue that I find some people have difficulty with is validating that the equipment can process acceptable variation within the material.[/quote]

Well said, Doug - couldn’t agree more. A lot of the time it seems like a paper exercise!

The complete failure of senior planning managers and project managers to understand that equipment and processes can sometimes fail validation, that contingency plans need to take this possibility into account, and that it is not the tester’s role to ensure that the equipment passes (and thus not his fault if it fails), but merely to test the equipment or process to see whether it meets the required standard.

What bugs me mainly is that senior management do not seem to realise that doing the job properly takes time. And as we know, a validation study is no two-minute job in the first place.

Why are validation studies always an afterthought?

I struggle with validation team members who seem to be unable to determine what is truly important with respect to validation/qualification. Frankly, I do not care whether someone uses a gel pen or not (to cite one example). I want to know if the system operated in accordance with specifications and is fit for its intended use.

I am also frustrated with working on projects with no or poorly defined user requirements, especially when the user determines mid-project that the solution does not suit their business or quality needs. User requirements before we buy the solution, please!

[quote=R Paules]
I am also frustrated with working on projects with no or poorly defined user requirements, especially when the user determines mid-project that the solution does not suit their business or quality needs. User requirements before we buy the solution, please![/quote]

Clear User Requirements - Does that ever happen!!!

Great point R Paules

Comments from Chris Whalley

Regarding electronic validation systems: they may be good solutions, but they are more complicated for the average user than paper. I work with brilliant people with PhDs in all kinds of disciplines, and they can BARELY create Word documents using styles. This is a tool they use every day, for their entire career, and they still don’t know its features. Now you want to add an entirely new system? Good luck. I know this is a sweeping statement, but most non-IT folks I’ve met don’t see the value of electronic systems, and most validations aren’t IT. I wish it weren’t so.
In terms of what bugs me about validation: poor user requirements! When user requirements are poorly written, the benefits of validation decrease and the costs increase. I tend to see user requirements that define solutions before the problem is fully understood. Sigh.

Everything man…

Comment from Jim Agalloco:

That the science behind our processes is often tossed aside in order to satisfy an arbitrary compliance expectation.

Posted by Paul Derbyshire:

What bugs me about validation? That people expect the act of validation to prove that a system or process works, but they don’t tell you that it doesn’t work before you get to it.

Posted by Melanie Cooper

The problem is that everybody on the team, including top management, should understand what validation is.

For example, PMs do not understand that validation involves documentation (no matter what) and a process to be followed.

Of course, PMs complain about overkill while doing validation, but it is a very small percentage of all the tasks involved in implementing a project.

It is the responsibility of the PM to understand and allocate time and resource for these activities.

In almost all the projects I have worked on, it looks as though the projects are delayed because of the validation folks and their silly idea of following a process.

This is a very bad mentality, because the PMs, although they claim to understand validation, are more concerned about delivering the projects on time and saving their own tails.

Posted by Juan Crespo

It can cause difficulties when the PM does not understand the validation requirements and validation is not planned into the project. In many cases the resource owner also forgets about the validation, and the system is already installed by the time the validation process starts.

One more bothersome thing is that validation is the last step before go-live. There can be many delays during the previous steps, yet the validation is still expected to be finalized on time.