UV/Vis Spectrophotometer Qualification

I am looking for guidance on setting acceptable tolerances for the tests performed in a spectrophotometer's OQ procedure.

I will be testing wavelength and photometric accuracy using certified calibration filters (Hellma, Starna, etc.), but I need advice on the acceptance limits I should set in the procedure.

The instrument I am qualifying has a published specification, but I am struggling to work out whether I need to take into account the errors given for the calibration filter's certified values, and if I do, how I should adjust my acceptance limits to allow for them.

e.g. Wavelength Accuracy

Instrument specification: ±0.3 nm
Calibration filter error: ±0.1 nm

Is the acceptable tolerance ±0.4 nm, or should it remain ±0.3 nm?

I have read many articles saying that the errors in the filters should be small compared with the instrument's (and can then be ignored), which makes sense, and others that give statistical methods for combining the individual experimental errors into an overall error.
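To make the two options concrete, here is a minimal sketch of the root-sum-of-squares (quadrature) combination that the statistical methods I have read usually come down to, using the figures from my example above; the function name is just illustrative:

```python
import math

def combined_tolerance(spec_nm, filter_nm):
    """Root-sum-of-squares (quadrature) combination of two
    independent error contributions."""
    return math.sqrt(spec_nm**2 + filter_nm**2)

# Figures from the wavelength accuracy example above
spec = 0.3   # instrument specification, +/- nm
filt = 0.1   # calibration filter certified error, +/- nm

print(f"Linear (worst-case) sum: +/-{spec + filt:.2f} nm")                     # 0.40 nm
print(f"Quadrature (RSS):        +/-{combined_tolerance(spec, filt):.2f} nm")  # 0.32 nm
```

So the simple linear sum gives ±0.40 nm, while the quadrature combination (which assumes the two errors are independent) gives about ±0.32 nm; my question is which, if either, belongs in the protocol.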

Can anyone please help to clarify how you should set the tolerances in these equipment qualification protocols?

Thanks

Take a look at the EP (European Pharmacopoeia); it lists limits for these tests.

I'd like to resurrect this topic. I am trying to put together an SOP for the calibration of spectrophotometers from various manufacturers. What we would like to do is standardize our tolerances to cover ALL the calibration tests we perform on these instruments. Are there any industry guides on this subject, or is this another one of those things where everyone does it differently?

The EP details specs and methods for doing this; simply incorporate them into your SOP. You decide on the frequency given the usage and the criticality of the data. I'd go monthly initially, until you have sufficient data to justify moving to, say, 3-monthly.
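As a rough illustration of the sort of trend review that would justify extending the interval, here is a minimal sketch; the deviation values and the half-the-limit criterion are placeholders I have made up, and the acceptance limit should come from the EP or your own SOP, not from this snippet:

```python
# Hypothetical monthly wavelength-accuracy results for one instrument:
# measured minus certified filter value, in nm (placeholder data only).
monthly_deviations_nm = [0.12, -0.08, 0.05, 0.10, -0.04, 0.07]

acceptance_limit_nm = 0.3  # placeholder; use the limit from the EP / your SOP

# Worst-case deviation observed over the review period
worst_case = max(abs(d) for d in monthly_deviations_nm)

print(f"Worst case: {worst_case:.2f} nm against a ±{acceptance_limit_nm} nm limit")

# Illustrative criterion only: results consistently inside half the
# limit might support a move from monthly to 3-monthly checks.
if worst_case <= 0.5 * acceptance_limit_nm:
    print("Comfortably within limits -> consider extending to 3-monthly")
else:
    print("Too close to the limit -> keep the monthly frequency")
```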