Issue with Running VACUUM on Database with Large Tables - Mailing list pgsql-bugs

From Nagaraj Raj
Subject Issue with Running VACUUM on Database with Large Tables
Date
Msg-id 1237927313.5086260.1703506240368@mail.yahoo.com
Responses Re: Issue with Running VACUUM on Database with Large Tables
List pgsql-bugs

Hello,

While running VACUUM ANALYZE on our database, which contains a large number of tables (approximately 200k), I encountered an issue. If a table is dropped while the vacuum is in progress, the vacuum job fails at that point with an error stating that the relation with the given OID was not found, and it exits. This interrupts the entire run without any warning and without handling the error gracefully.

Because our database design involves dynamically created and dropped objects, this abrupt termination makes it challenging to complete a vacuum pass successfully. The issue has persisted across multiple versions, including the one we are currently running (14.8).

Is this behavior expected, or could it be a bug? It would be helpful to have a mechanism for handling such cases, for example issuing a notice or warning when a dropped table is encountered, skipping that table, and continuing with the rest of the vacuum analyze.
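
For what it's worth, the skip-and-continue behavior can be approximated on the client side by vacuuming one table at a time. The following is only a sketch, assuming Python with psycopg2; the connection string and printed messages are placeholders, not a tested solution:

    # Sketch: vacuum each table in its own statement and skip any
    # table that was dropped after the list was built, instead of
    # letting one failure abort the entire pass.
    import psycopg2

    conn = psycopg2.connect("dbname=mydb")  # placeholder connection string
    conn.autocommit = True  # VACUUM cannot run inside a transaction block

    with conn.cursor() as cur:
        cur.execute("""
            SELECT c.oid::regclass
            FROM pg_class c
            JOIN pg_namespace n ON n.oid = c.relnamespace
            WHERE c.relkind = 'r'
              AND n.nspname NOT IN ('pg_catalog', 'information_schema')
        """)
        tables = [row[0] for row in cur.fetchall()]

    for table in tables:
        try:
            with conn.cursor() as cur:
                # regclass output comes back schema-qualified and quoted
                # if needed, so interpolation is safe here (VACUUM does
                # not accept bind parameters)
                cur.execute(f"VACUUM ANALYZE {table}")
        except psycopg2.errors.UndefinedTable:
            print(f"skipped {table}: dropped during the run")
        except psycopg2.Error as e:
            print(f"skipped {table}: {e}")

    conn.close()

Because each table is vacuumed in its own statement, a dropped table only costs a skipped entry, and everything listed after it is still processed.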

Your support in resolving this matter is greatly appreciated.


Thanks,

Rj
