Calibration vs. Qualification / Validation

Hi All,

Just wondering if anyone could help me with a conundrum I have encountered. I’m not sure if there has been a previous thread on this, but I couldn’t find one.

The following is an example of the kind of issue that may be encountered: a piece of equipment could have a calibration error of 4 PSI while its validated range is 93 +/- 3 PSI. If the pressure on this equipment is adjusted to read 90 PSI, the actual pressure could be as low as 86 PSI, which is outside the validated range of the equipment.
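
To put rough numbers on that (a quick sketch using only the figures from the example above; purely illustrative):

```python
# Purely illustrative figures from the example above.
setpoint = 90                              # PSI, what the gauge is adjusted to read
cal_error = 4                              # PSI, maximum calibration error of the instrument
validated_low, validated_high = 90, 96     # PSI, the validated range (93 +/- 3)

worst_case_actual = setpoint - cal_error   # display shows 90, true pressure could be 86
print(worst_case_actual)                                      # 86
print(validated_low <= worst_case_actual <= validated_high)   # False: outside the validated range
```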

Does anyone know of a standard approach taken in industry to prevent such a situation?

Or, failing that, is there an approach to be taken when establishing process conditions/parameters during process characterisation and process validation?

Appreciate any input you may have.

Regards,
Enda.

Hi Enda,

I’m a little unclear on the question.

If the calibrated range of the equipment is 93 +/- 3 PSI, and you are getting a reading outside this, for example 97 or 86, surely that means you have a problem with either the equipment or perhaps the characterisation that was performed.

Was this part of a validation study or a manufacturing run?

Regards

Hi G,

Thanks for the reply. I probably wasn’t that clear in my initial post.

This is a hypothetical situation.

I’m trying to get my head around the relationship between calibration and validation, and whether there is an approach that can be taken when establishing process conditions/parameters during process characterisation and process validation to take calibration error into account.

1.) The validated range of the equipment is 93 +/- 3 PSI, i.e. 90 PSI - 96 PSI. This is established at PC and verified at OQ.

The calibration error on the equipment is +/- 4 PSI, so at a setpoint of 93 PSI the true pressure could be anywhere from 89 PSI to 97 PSI.

Therefore it is possible to set the equipment to nominal (93 PSI) but run outside the validated range of the process, e.g. at 97 PSI.

2.) Another example: the validated range of a piece of equipment is 90 +/- 10 PSI, i.e. 80 PSI - 100 PSI.

The calibration error on this equipment is +/- 2 PSI, so at a setpoint of 90 PSI the true pressure could be anywhere from 88 PSI to 92 PSI.

If the equipment is set to the upper limit, 100 PSI, then on paper it is running within the validated range, but due to the calibration error it could actually be running at the limit (100 PSI) plus the error (2 PSI), which is 102 PSI.
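
Putting both of the examples above into a small sketch (the helper names and numbers are just for illustration):

```python
def worst_case_actual_range(setpoint, cal_tolerance):
    """Range the true pressure could be in when the display reads 'setpoint'."""
    return setpoint - cal_tolerance, setpoint + cal_tolerance

def stays_within(validated_low, validated_high, actual_range):
    """True only if the whole worst-case range sits inside the validated range."""
    low, high = actual_range
    return validated_low <= low and high <= validated_high

# Example 1: validated range 93 +/- 3 PSI (90-96), calibration tolerance +/- 4 PSI, set to nominal
print(worst_case_actual_range(93, 4))                          # (89, 97)
print(stays_within(90, 96, worst_case_actual_range(93, 4)))    # False: could drift to 97

# Example 2: validated range 90 +/- 10 PSI (80-100), calibration tolerance +/- 2 PSI, set to upper limit
print(worst_case_actual_range(100, 2))                         # (98, 102)
print(stays_within(80, 100, worst_case_actual_range(100, 2)))  # False: could be at 102
```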

Is there an approach that can be taken to take this error into account during PC?

Hope this is a bit clearer.

Regards,
Enda.

Let’s take this example first.

I would imagine that, yes, it would be possible to set the equipment at nominal but run outside the validated range; however, if you have performed a robust characterisation study, you should be pretty confident of your ranges and your limits.

Do you have data to suggest that this has happened before?

This seems like a CPP (critical process parameter), so I imagine it has been tested fully at the characterisation phase.

To answer your question: it is possible, but if you have a well-defined process that has been fully tested, then the chances should be minimal.

I recently worked on a validation where 2 of the 5 CQAs failed the first validation run; this was mainly due to a lack of characterisation being performed.

Hope this helps

Thanks for your input.

Enda.

This is just my opinion and I’d like some feedback on this if possible…

The more correct approach in this situation is to ensure that the calibration tolerance is tighter than the allowed process operating range.

E.g. for a process requiring 93 +/- 3 PSI (90 PSI - 96 PSI), the calibration needs to have a tolerance tighter than the process spec. With a calibration tolerance of +/- 4 PSI (89 PSI - 97 PSI), you have no certainty that you are within your spec, even if the display says 93.0!

Generally, if you have a tolerance of +/- 4 on the process, you need a calibration tolerance of +/- 1, i.e. a 4:1 rule on the tolerance.
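
As a rough sketch of that check (the helper name and the 4:1 default are just for illustration):

```python
def meets_tar(process_tolerance, cal_tolerance, ratio=4):
    """Test accuracy ratio check: calibration tolerance should be no more
    than 1/ratio of the process tolerance (here a 4:1 rule)."""
    return cal_tolerance <= process_tolerance / ratio

print(meets_tar(process_tolerance=3, cal_tolerance=4))   # False: the 93 +/- 3 PSI example above
print(meets_tar(process_tolerance=4, cal_tolerance=1))   # True:  +/- 4 process with +/- 1 calibration
```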

Also, the validation should check somewhere at IQ that the calibration tolerance is less than the process tolerance. Although the final process tolerance may not be known at IQ, estimates will need to be available so that calibration can be performed to allow process optimisation to take place.

I hope I haven’t misunderstood the problem and I hope I’ve added something to the understanding.

[quote=chris.obrien]The more correct approach in this situation is to ensure that the calibration tolerance is tighter than the allowed process operating range. … Generally, if you have a tolerance of +/- 4 on the process, you need a calibration tolerance of +/- 1, i.e. a 4:1 rule on the tolerance.[/quote]

Chris had it correct on all counts. Ideally, the equipment user requirements document will specify the equipment capability, i.e. the calibration tolerance, based on process requirements.

Thanks for your comments, everyone.

Brian, is it not the case that process requirements are established subsequent to equipment development?