Thread: Issue with Running VACUUM on Database with Large Tables
Hello,
While executing a vacuum analyze on our database containing large tables (approximately 200k), I encountered an issue. If a table gets dropped while the vacuum is in progress, the vacuum job fails at that point with an error stating "OID relation is not found" and exits. The entire run is interrupted without any warning or graceful handling of the error.
Because our database design includes objects that are created and dropped dynamically, this abrupt termination makes it difficult to complete a full vacuum pass. The issue has persisted across multiple versions, including the one we currently run (14.8).
Is this behavior expected, or could it be a bug? It would be helpful to have a mechanism that handles such cases gracefully, for example by emitting a notice or warning when a dropped table is encountered, skipping it, and continuing with the rest of the vacuum analyze.
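As a stopgap on our side, one workaround we are considering is vacuuming tables one at a time from psql, so a failure on a single table aborts only that statement rather than the whole run. A minimal sketch, assuming default psql settings (ON_ERROR_STOP off, so errors are reported but the script continues):

```sql
-- Generate one VACUUM (ANALYZE) statement per user table and execute
-- each result row as its own command via psql's \gexec. A table dropped
-- mid-run fails only its own statement; the loop continues.
SELECT format('VACUUM (ANALYZE) %I.%I', schemaname, relname)
FROM pg_stat_user_tables
ORDER BY schemaname, relname
\gexec
```

This still raises an error for each vanished table, but it no longer terminates the overall process.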
Your support in resolving this matter is greatly appreciated.
Thanks,
Rj
Nagaraj Raj <nagaraj.sf@yahoo.com> writes:
> While executing a vacuum analyze on our database containing large tables (approximately 200k), I encountered an issue. If a table gets dropped during the vacuum process, the vacuum job fails at that point with an error message stating "OID relation is not found" and exits.

I can't replicate that. I get either

ERROR:  relation "foo" does not exist

if you specifically name a nonexistent table, or

WARNING:  skipping vacuum of "foo" --- relation no longer exists

if the table existed at the start but doesn't exist by the time vacuum gets to it. There may be some code path that results in the error you cite, but you'll need to provide more detail about how to duplicate it.

			regards, tom lane
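For reference, the WARNING path Tom describes can be seen with a two-session sketch like the following (`some_table` is a placeholder name; timing matters, since the drop must happen before the vacuum reaches the table):

```sql
-- Session 1: start a database-wide vacuum.
VACUUM (ANALYZE);

-- Session 2, while session 1 is still running: drop a table the vacuum
-- has not yet reached.
DROP TABLE some_table;

-- Session 1 then emits a warning and moves on, rather than erroring out:
-- WARNING:  skipping vacuum of "some_table" --- relation no longer exists
```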