Thread: Copy command-Out of memory error
Hi,
I am trying to copy a 13G file to a Postgres DB using the 'copy from' command on PGSQL 8.4. After loading 8 million records, I got the error below.
ERROR: out of memory
DETAIL: Cannot enlarge string buffer containing 1073726116 bytes by 65536 more bytes.
CONTEXT: COPY teststorage, line 8479575
My table has no foreign keys or triggers; I simply want to load the file into the database. I was able to run COPY FROM with 10,000 records first, just to make sure it works.
Please advise.
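For reference, what I'm running is essentially this (the database name and file path below are placeholders, and I've left out any format options):

    psql -d mydb -c "COPY teststorage FROM '/path/to/datafile'"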
nipuna <nipunajoset@gmail.com> writes:
> I am trying to copy a 13G file to a Postgres DB using the 'copy from' command on PGSQL 8.4. After loading 8 million records, I got the error below.
> ERROR: out of memory
> DETAIL: Cannot enlarge string buffer containing 1073726116 bytes by 65536 more bytes.
> CONTEXT: COPY teststorage, line 8479575

This makes it look like line 8479575 is, all by itself, wider than 1GB.
If that's really what your data is, you'll need to find some other
representation --- Postgres can't cope with fields wider than 1GB.
(And in practice you don't want to be coming anywhere close to that
limit anyway.)

If you didn't think you had individual rows that wide, maybe this is
due to some sort of quoting or escaping problem in your data file.

			regards, tom lane
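One quick check is to measure that physical line directly, e.g. (data.csv here is a stand-in for the actual dump file):

    # byte length of physical line 8479575 (LC_ALL=C makes awk count bytes, not characters)
    LC_ALL=C awk 'NR == 8479575 { print length($0); exit }' data.csv

If that prints something modest, the problem is a logical line running across many physical lines, not a genuinely huge record.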
Hi Tom,
Thanks for the response. I did retrieve that particular line to see if that record is wider, but it's the same as the other records. Any more suggestions?
On Tue, Oct 9, 2012 at 2:43 PM, Tom Lane <tgl@sss.pgh.pa.us> wrote:
> nipuna <nipunajoset@gmail.com> writes:
> > I am trying to copy a 13G file to a Postgres DB using the 'copy from' command on PGSQL 8.4. After loading 8 million records, I got the error below.
> > ERROR: out of memory
> > DETAIL: Cannot enlarge string buffer containing 1073726116 bytes by 65536 more bytes.
> > CONTEXT: COPY teststorage, line 8479575
>
> This makes it look like line 8479575 is, all by itself, wider than 1GB.
> If that's really what your data is, you'll need to find some other
> representation --- Postgres can't cope with fields wider than 1GB.
> (And in practice you don't want to be coming anywhere close to that
> limit anyway.)
>
> If you didn't think you had individual rows that wide, maybe this is
> due to some sort of quoting or escaping problem in your data file.
>
> regards, tom lane
--
Regards,
Nipuna
On Tue, Oct 09, 2012 at 03:03:39PM -0400, Nipuna wrote:
> Hi Tom,
>
> Thanks for the response. I did retrieve that particular line to see if that
> record is wider, but it's the same as the other records. Any more suggestions?

Did you check for quoting or escaping problems as Tom suggested, as well?

Cheers,
Ken
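For example (data.csv again standing in for the dump file), unbalanced double quotes and stray NUL bytes can be spotted with something like:

    # lines with an odd number of double quotes -- a possible unterminated CSV field
    LC_ALL=C awk -F'"' 'NF % 2 == 0 { print NR }' data.csv | head
    # count lines containing NUL bytes (GNU grep; -a forces text mode)
    LC_ALL=C grep -acP '\x00' data.csv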
Yes, I did; there seem to be no problems.
On Tue, Oct 9, 2012 at 3:08 PM, ktm@rice.edu <ktm@rice.edu> wrote:
> On Tue, Oct 09, 2012 at 03:03:39PM -0400, Nipuna wrote:
> > Thanks for the response. I did retrieve that particular line to see if that
> > record is wider, but it's the same as the other records. Any more suggestions?
>
> Did you check for quoting or escaping problems as Tom suggested, as well?
>
> Cheers,
> Ken
--
Regards,
Nipuna
On Tue, Oct 9, 2012 at 12:39 PM, Nipuna <nipunajoset@gmail.com> wrote:
> Yes, I did; there seem to be no problems.

Look out for file corruption. Use vim and/or a hex editor. I have seen that error on files with NULL bytes, unterminated delimiters, and unterminated lines. You should split the file into chunks, then load each chunk and see which one fails.
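A minimal sketch of that approach (the database name mydb and the one-million-line chunk size are arbitrary choices here):

    split -l 1000000 data.csv chunk_
    for f in chunk_*; do
        # client-side \copy, so the file only has to be readable where psql runs
        psql -d mydb -c "\copy teststorage from '$PWD/$f'" || echo "FAILED: $f"
    done

One caveat: if rows legitimately contain embedded newlines (quoted CSV fields), split can cut a row in half, so a failure exactly at a chunk boundary isn't conclusive on its own.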
Nipuna <nipunajoset@gmail.com> writes:
> On Tue, Oct 9, 2012 at 3:08 PM, ktm@rice.edu <ktm@rice.edu> wrote:
>> Did you check for quoting or escaping problems as Tom suggested, as well?
> Yes, I did; there seem to be no problems.

Well, you've clearly got *some* problem, because COPY is evidently thinking that the logical line continues across many physical lines. If you're using CSV format then this would be easily explained by a missing closing quote in some field. In traditional Postgres COPY format it's harder to have a runaway line. But you've not given us any details about the command or the data, so we really can't do more than guess.

			regards, tom lane
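If the file is CSV, one way to find where a runaway quoted field begins is to track the running double-quote parity; the first line after which the cumulative count goes odd is the likely culprit (data.csv is again a placeholder, and this assumes quotes otherwise balance within each logical row):

    # print the first line after which the cumulative double-quote count is odd
    LC_ALL=C awk -F'"' '{ q += NF - 1 } q % 2 { print NR; exit }' data.csv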