Re: Issue with Running VACUUM on Database with Large Tables - Mailing list pgsql-bugs

From: Tom Lane
Subject: Re: Issue with Running VACUUM on Database with Large Tables
Date:
Msg-id: 1843284.1703516005@sss.pgh.pa.us
In response to: Issue with Running VACUUM on Database with Large Tables (Nagaraj Raj <nagaraj.sf@yahoo.com>)
List: pgsql-bugs
Nagaraj Raj <nagaraj.sf@yahoo.com> writes:
> While executing a vacuum analyze on our database containing large tables
> (approximately 200k), I encountered an issue. If a table gets dropped
> during the vacuum process, the vacuum job fails at that point with an
> error message stating "OID relation is not found" and exits.

I can't replicate that.  I get either

ERROR:  relation "foo" does not exist

if you specifically name a nonexistent table, or

WARNING:  skipping vacuum of "foo" --- relation no longer exists

if the table existed at the start but doesn't exist by the time
vacuum gets to it.  There may be some code path that results in
the error you cite, but you'll need to provide more detail about
how to duplicate it.
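
For reference, a minimal two-session sketch that produces the two messages
above (the table name "foo" is just a placeholder for illustration):

    -- Session 1: name a table that does not exist; the vacuum errors out.
    VACUUM ANALYZE foo;
    -- ERROR:  relation "foo" does not exist

    -- Session 1: create the table, then start a database-wide vacuum.
    CREATE TABLE foo (id int);
    VACUUM ANALYZE;

    -- Session 2: drop the table while session 1's vacuum is still running.
    DROP TABLE foo;

    -- Session 1 then reports the warning and continues with the remaining
    -- tables rather than exiting:
    -- WARNING:  skipping vacuum of "foo" --- relation no longer exists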

            regards, tom lane


