Re: Best practice to load a huge table from ORACLE to PG - Mailing list pgsql-performance

From: Dimitri Fontaine
Subject: Re: Best practice to load a huge table from ORACLE to PG
Date: 2008-04-28
Msg-id: 200804280949.40101.dfontaine@hi-media.com
In response to: Re: Best practice to load a huge table from ORACLE to PG (Greg Smith <gsmith@gregsmith.com>)
List: pgsql-performance
Hi,

On Sunday, April 27, 2008, Greg Smith wrote:
> than SQL*PLUS.  Then on the PostgreSQL side, you could run multiple COPY
> sessions importing at once to read this data all back in, because COPY
> will bottleneck at the CPU level before the disks will if you've got
> reasonable storage hardware.
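
For reference, that multiple-COPY-sessions approach can be as small as a
shell loop; in this sketch the file, table, and database names
(bigtable.dat, bigtable, targetdb) are placeholders:

  # split the exported data into chunks, one per COPY session
  split -l 1000000 bigtable.dat chunk_
  # start one COPY per chunk; they all run concurrently
  for f in chunk_*; do
      psql -d targetdb -c "\copy bigtable from '$f'" &
  done
  wait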

The latest pgloader release was built to handle exactly this case, so if you
want to take this route, please consider pgloader 2.3.0:
  http://pgloader.projects.postgresql.org/#_parallel_loading
  http://pgfoundry.org/projects/pgloader/
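
As a rough illustration of what a parallel section looks like in the 2.x
INI-style configuration (the option names section_threads and
split_file_reading are as I recall them from the 2.3.0 parallel loading page
above, and the table, file, and column names are placeholders -- check the
docs before copying this):

  # write a minimal pgloader 2.x configuration, then run it
  cat > pgloader.conf <<'EOF'
  [pgsql]
  host = localhost
  base = targetdb
  user = loader

  [bigtable]
  table = bigtable
  filename = bigtable.dat
  format = text
  field_sep = |
  columns = id, payload
  # four threads, each one COPYing its own part of the data file
  section_threads = 4
  split_file_reading = True
  EOF
  pgloader -c pgloader.conf bigtable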

Another good reason to consider pgloader is when the data file contains
erroneous input lines and you don't want the whole COPY transaction to abort:
pgloader rejects the bad lines into a separate file while the correct ones
get COPYed in.
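
Something along these lines in the table's section enables that behaviour
(a sketch again; reject_log and reject_data are the option names as I
remember them from the 2.x docs):

  # append reject handling to the [bigtable] section of the sketch above;
  # bad lines land in bigtable.rej, ready to be fixed and reloaded
  cat >> pgloader.conf <<'EOF'
  reject_log = bigtable.rej.log
  reject_data = bigtable.rej
  EOF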

Regards,
--
dim

