Re: Need help doing a CSV import - Mailing list pgsql-general

From: Tim Landscheidt
Subject: Re: Need help doing a CSV import
Msg-id: m3y6degpye.fsf@passepartout.tim-landscheidt.de
In response to: Need help doing a CSV import (tony@exquisiteimages.com)
Responses: Re: Need help doing a CSV import
List: pgsql-general
Craig Ringer <craig@postnewspapers.com.au> wrote:

>> I am in the process of moving a FoxPro based system to PostgreSQL.

>> We have several tables that have memo fields which contain carriage
>> returns and line feeds that I need to preserve. I thought if I converted
>> these into the appropriate \r and \n codes that they would be imported as
>> carriage returns and line feeds, but instead they are stored in the
>> database as \r and \n.

> PostgreSQL doesn't process escapes in CSV import mode.

> You can reformat the data into the non-csv COPY format,
> which WILL process escapes. Or you can post-process it after
> import to expand them. Unfortunately PostgreSQL doesn't
> offer an option to process escapes when "CSV" mode COPY is
> requested.

> I posted a little Python script that reads CSV data and
> spits out COPY-friendly output a few days ago. It should be
> trivially adaptable to your needs, you'd just need to change
> the input dialect options. See the archives for the script.

Another option is a small Perl script (or something similar)
that connects to both the FoxPro and the PostgreSQL databases
and transfers the data with parameterized "INSERT"s. The
advantage is that you have tight control over character sets,
date formats, EOL conventions and so on, and you never have to
wonder which file is at which stage of the conversion process;
the disadvantage, obviously, is that you lose the speed
benefit of bulk "COPY".
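A minimal sketch of that approach (in Python rather than Perl): because the values travel as query parameters, embedded CR/LF in memo fields need no escaping at all. The table and column names, and the `dbfread`/`psycopg2` glue at the bottom, are assumptions for illustration, not anything from the thread.

```python
def insert_statement(table, columns):
    """Build a parameterized INSERT (%s placeholders, psycopg2 style)."""
    cols = ", ".join(columns)
    params = ", ".join(["%s"] * len(columns))
    return f"INSERT INTO {table} ({cols}) VALUES ({params})"


def transfer(rows, cursor, table, columns):
    """Push each source row through one parameterized INSERT.
    Memo fields with carriage returns/line feeds pass through as-is."""
    sql = insert_statement(table, columns)
    for row in rows:
        cursor.execute(sql, [row[c] for c in columns])


if __name__ == "__main__":
    # Hypothetical glue: dbfread for the FoxPro side, psycopg2 for
    # PostgreSQL -- both library choices and all names are assumptions.
    from dbfread import DBF
    import psycopg2

    conn = psycopg2.connect("dbname=target")
    with conn, conn.cursor() as cur:
        transfer(DBF("notes.dbf"), cur, "notes", ["id", "memo"])
```

Row-by-row INSERTs are far slower than COPY, as noted above; batching commits (or psycopg2's `executemany`) narrows the gap somewhat.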

Tim
