Re: Dump/restore with bad data and large objects - Mailing list pgsql-general

From John T. Dow
Subject Re: Dump/restore with bad data and large objects
Date
Msg-id 200808251658.m7PGwX45088951@web2.nidhog.com
Whole thread Raw
In response to Re: Dump/restore with bad data and large objects  (Tom Lane <tgl@sss.pgh.pa.us>)
List pgsql-general
Tom

My mistake in not realizing that 8.1 and later can dump large objects in the plain text format. I guess when searching for answers to a problem, the posted information doesn't always specify the version. So, sorry about that.

But the plain text format still has serious problems: the generated file is large for byte arrays and large objects, there is no way to selectively restore a single table, and bad data still isn't detected until you try to restore.

Or did I miss something else?
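For comparison, the selective-restore point is what the custom format addresses. A minimal sketch (database name `mydb` and table name `customers` are placeholders):

```shell
# Dump in custom format, which is compressed and supports
# selective restore via pg_restore.
pg_dump -Fc -f mydb.dump mydb

# Restore just one table from the archive, something the plain
# text format cannot do without editing the file by hand.
pg_restore -d mydb -t customers mydb.dump

# List the archive's contents without restoring anything.
pg_restore -l mydb.dump
```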

John

PS: Yes, I know you can pipe the output from pg_dumpall into an archiver, but it's my understanding that the binary data is output in an inefficient format, so even if zipped the resulting file would be significantly larger than the custom format.
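A sketch of the two approaches being compared (the filenames are placeholders):

```shell
# Plain text output piped through gzip, as mentioned above; bytea and
# large object data are emitted as escaped text, so even compressed
# the result tends to be larger than a custom-format dump.
pg_dumpall | gzip > cluster.sql.gz

# For a single database, the custom format compresses internally.
pg_dump -Fc mydb > mydb.dump

# pg_dumpall can dump only the global objects (roles, tablespaces)
# to complement per-database custom-format dumps.
pg_dumpall -g > globals.sql
```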



On Mon, 25 Aug 2008 12:14:41 -0400, Tom Lane wrote:

>"John T. Dow" <john@johntdow.com> writes:
>> If you dump in plain text format, you can at least inspect the dumped
>> data and fix it manually or with iconv. But the plain text
>> format doesn't support large objects (again, not nice).
>
>It does in 8.1 and later ...
>
>> Also, neither of these methods gets information such as the roles,
>
>Use pg_dumpall.
>
>            regards, tom lane


