I am looking for guidance on setting acceptable tolerances for the tests performed in a spectrophotometer's OQ (Operational Qualification) procedure.
I will be testing wavelength and photometric accuracy using certified calibration filters (Hellma, Starna, etc.), but I need advice on the acceptance limits I should set in the procedure.
The instrument I am qualifying has a published specification, but I am struggling to work out whether I need to take into account the uncertainties given for the calibration filter's certified values, and if I do, how I should adjust my acceptance limits to account for them.
e.g. Wavelength Accuracy:
Instrument specification: +/-0.3 nm
Calibration filter uncertainty: +/-0.1 nm
Should the acceptance tolerance be +/-0.4 nm, or should it remain +/-0.3 nm?
I have read many articles that say the filter uncertainties should be small compared to the instrument specification (and can then be ignored), which makes sense, and others that give statistical methods for combining the individual uncertainties into an overall error budget.
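To make the two approaches concrete, here is a minimal sketch using the wavelength accuracy numbers from my example. The first approach simply adds the two tolerances (worst case); the second combines them as independent uncertainties by root-sum-square, which is the usual statistical method I have seen described. The variable names are just illustrative, not from any standard.

```python
import math

# Values from the wavelength accuracy example above (illustrative)
instrument_spec_nm = 0.3     # published instrument wavelength accuracy, +/- nm
filter_uncertainty_nm = 0.1  # certified uncertainty of the calibration filter, +/- nm

# Approach 1: linear (worst-case) addition of the two tolerances
linear_limit = instrument_spec_nm + filter_uncertainty_nm

# Approach 2: root-sum-square (RSS) combination, treating the two
# contributions as independent uncertainties
rss_limit = math.sqrt(instrument_spec_nm**2 + filter_uncertainty_nm**2)

print(f"Linear addition: +/-{linear_limit:.2f} nm")  # prints +/-0.40 nm
print(f"RSS combination: +/-{rss_limit:.2f} nm")     # prints +/-0.32 nm
```

As the output shows, the two methods give different widened limits (0.40 nm vs about 0.32 nm), which is part of my confusion about which, if either, is appropriate for the OQ protocol.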
Can anyone please help to clarify how you should set the tolerances in these equipment qualification protocols?
Thanks