Validation requirements when moving validated software from Windows XP to Windows 7

I am currently working on a change plan to capture the migration of software from Windows XP to Windows 7. The software was initially validated on Windows XP, but the company is moving the operating system to Windows 7. It is file server software. What is the right approach so that the system maintains its compliant state without losing data integrity? What documentation needs to be revised?

Hello @Pratima

What documentation currently exists from the initial validation?


Currently, we have a GxP/risk assessment, 21 CFR assessment, validation plan, IQ/OQ plan, IQ/OQ report, PQ plan, PQ report, and validation summary report. In short, the complete set of validation deliverables. I don't think I need to execute all the test cases again or revise all the documents for the current operating system update?
Appreciate your response.

There’s no hard and fast rule here; the level of testing required will be dictated by the results of the risk assessment on this update…that should detail to some degree the risk involved.

Is this a general Microsoft update or something more substantial?

@yodon perhaps you have some experience here too?

Indeed, the testing should be based on your assessment of risk (due to the change). That said, going from XP to 7 is a pretty big leap, so I wouldn’t take it too casually. Is there any supporting software (e.g., database back-ends, etc.)? If so, you may need to revisit the IQ and ensure that all the supporting software is still expected to play well with the upgrade (the supporting components may also require an upgrade and/or may have use limits with the new operating system).

Does this also involve data migration? If so, you may want to consider
some effort to ensure complete and accurate migration.
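If you did need to demonstrate complete and accurate migration (or simply wanted evidence that the flat-file store is byte-identical before and after the OS change), one lightweight approach is to checksum every file under both roots and compare. The sketch below is illustrative only; the function names and the use of SHA-256 are my assumptions, not anything prescribed in a validation standard.

```python
# Hedged sketch: verify a flat-file store by comparing per-file SHA-256
# digests between a source tree and a target tree. Function names and
# hash choice are illustrative assumptions.
import hashlib
import os


def checksum_tree(root):
    """Map each file's path (relative to root) to its SHA-256 digest."""
    digests = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            rel = os.path.relpath(full, root)
            with open(full, "rb") as fh:
                digests[rel] = hashlib.sha256(fh.read()).hexdigest()
    return digests


def compare_trees(source_root, target_root):
    """Return (missing, extra, mismatched) relative paths between trees."""
    src = checksum_tree(source_root)
    dst = checksum_tree(target_root)
    missing = sorted(set(src) - set(dst))        # in source, not in target
    extra = sorted(set(dst) - set(src))          # in target, not in source
    mismatched = sorted(p for p in src
                        if p in dst and src[p] != dst[p])
    return missing, extra, mismatched
```

Three empty lists would be your objective evidence that nothing was lost or altered; anything else points you straight at the files to investigate.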

Sounds like you have a good foundation (congratulations!). Consider
something along the lines of an FMEA to identify what might go wrong and
then you can structure any testing (or justify limited testing) on that.

Thank you guys. Appreciate your comments on the above.

@gokeeffe This would be a general Microsoft update.

@yodon As I mentioned, it is a file transfer system; it does not connect to a standard database such as SQL Server or Oracle. Instead, the back-end is an open-source flat-file store where data is kept in structured folders accessed only by the application.
However, the risk is that the system is built on very old technology (Delphi) and has serious issues in terms of memory usage. Do you see this as a risk point? I still have to analyze how compatible the technology is with the OS migration.
Also, data is maintained on the server, so there won't be any data migration involved.

Thanks for suggesting FMEA. I would love to explore that option with regard to limited testing.

I think you are close to creating the FMEA, in that you have already identified memory as a potential issue here.

Other potential failure modes:

  1. Is the open source back-end compatible with this upgrade?
  2. Is Delphi compatible with this upgrade?
  3. The memory usage issue.

I think identifying all of the potential risks and using an FMEA to analyze the potential effects is the way to go.

You could then map the outcomes to testing in a protocol.
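To make the mapping from failure modes to testing concrete, here is a minimal sketch using the classic FMEA Risk Priority Number (RPN = severity × occurrence × detection, each rated 1–10). The three failure modes come from the list above; the numeric scores are placeholders I invented for illustration, and your team would replace them with its own ratings.

```python
# Hedged sketch: rank the thread's candidate failure modes by RPN
# (severity x occurrence x detection). Scores are placeholder values,
# not real assessments.
failure_modes = [
    {"mode": "Open-source back-end incompatible with Windows 7",
     "sev": 8, "occ": 3, "det": 4},
    {"mode": "Delphi runtime incompatible with Windows 7",
     "sev": 9, "occ": 4, "det": 3},
    {"mode": "Excessive memory usage under the new OS",
     "sev": 7, "occ": 6, "det": 5},
]

# Compute the Risk Priority Number for each failure mode.
for fm in failure_modes:
    fm["rpn"] = fm["sev"] * fm["occ"] * fm["det"]

# Highest RPN first: these drive where to focus testing in the protocol
# (or where limited testing can be justified).
for fm in sorted(failure_modes, key=lambda f: f["rpn"], reverse=True):
    print(f"RPN {fm['rpn']:3d}  {fm['mode']}")
```

The ranked output then maps directly to your protocol: high-RPN items get dedicated test cases, and low-RPN items give you a documented justification for reduced testing.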

Hope that helps somewhat,

Yes, since you bring it up, old technology / memory usage would be a risk
point you should consider!

FMEA is just one tool; a Fault Tree Analysis is also a way to organize the analysis.

Sounds like you’re on the right path. Good luck.