Re: Backup Large Tables - Mailing list pgsql-general

From Michael Nolan
Subject Re: Backup Large Tables
Date
Msg-id 4abad0eb0609212006p5d9c037bhf2110ce3a8c3f0f1@mail.gmail.com
In response to Backup Large Tables  ("Charles Ambrose" <jamjam360@gmail.com>)
Responses Re: Backup Large Tables
List pgsql-general
I have a table with over 6 million rows that I dump every night. It takes less than 2 minutes to create a file that is around 650 MB.

Are you perhaps dumping this table in 'insert' mode?
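
(For reference, a rough sketch of the difference on the command line; the database name mydb and the table name bigtable are placeholders, not from the original post. The default plain-text dump emits data as COPY, which is much faster to create and to restore than per-row INSERT statements; the custom format adds compression and is restored with pg_restore.)

    # Default plain-text dump: data is emitted as COPY, fast to create and reload
    pg_dump -t bigtable mydb > bigtable.sql

    # With --inserts, every row becomes a separate INSERT statement,
    # which is far slower to generate and to restore
    pg_dump --inserts -t bigtable mydb > bigtable_inserts.sql

    # Custom (compressed) format, restored with pg_restore
    pg_dump -Fc -t bigtable mydb > bigtable.dump
    pg_restore -d mydb bigtable.dump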
--
Mike Nolan

On 9/21/06, Charles Ambrose <jamjam360@gmail.com> wrote:
Hi!

I have fairly large database tables (say an average of 3 million to 4 million records). Using the pg_dump utility takes forever to dump the database tables. As an alternative, I wrote a program that fetches all the data from the table and writes it to a text file. This alternative was also unsuccessful in dumping the database.
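
(As an aside, a minimal sketch of dumping a single table straight to a text file with COPY, which is typically much faster than fetching rows one at a time from a client program; mydb and bigtable are placeholder names:)

    # Stream the table contents out through psql; COPY TO STDOUT writes
    # tab-delimited text that the shell redirects into a local file
    psql -d mydb -c "COPY bigtable TO STDOUT" > bigtable.txt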


