Thread: BUG #4527: Prepare of large multirow insert fails without error
The following bug has been logged online:

Bug reference:      4527
Logged by:          Vincent Kessler
Email address:      vincent.kessler@quantec-networks.de
PostgreSQL version: 8.3.4
Operating system:   Debian Linux 2.6.18-6-686-bigmem
Description:        Prepare of large multirow insert fails without error

Details:

Hi,

I am trying to do large multirow inserts using PQsendPrepare. I have not
found a limit on the number of parameters or the size of the query string,
so I assume memory is the limit.

When executing PQsendPrepare with a query string of about 100 kB and about
10000 parameters, the function returns only after a timeout. A tcpdump
shows a "parse" message with a length of 100 kB, but the transfer stops
after roughly 30 kB. The server log shows:

LOG: incomplete message from client
LOG: unexpected EOF on client connection

A pcap file for analysis with tcpdump or wireshark is available here:
http://www.quantec-networks.de/downloads/dump.pcap

I have no quick way to upgrade to 8.3.5, but according to the changelog it
should not be relevant in this case. What can I do to narrow down the
problem? Any help is greatly appreciated.

Regards,
Vincent
"Vincent Kessler" <vincent.kessler@quantec-networks.de> writes:
> i am trying to do large multirow inserts using PQsendPrepare. I have not
> found a limit in the number of parameters or the size of the querystring,
> so i assume memory is the limit.
> When executing the PQsendPrepare function using a querystring of about
> 100kb in size and about 10000 parameters the function returns after
> timeout. A tcpdump shows a "parse" message with a length of 100kb but the
> transfer stops after roughly 30kb.

With such a large statement it's unlikely that the PQsendPrepare call
would have been able to push all the data out immediately. Since it's
intended to not block the application, it would return with some data
still unsent. You need to call PQflush periodically until the data is
all transmitted, if you want to run in nonblocking mode.

> The server log shows:
> LOG: incomplete message from client
> LOG: unexpected EOF on client connection

Although this explanation doesn't say why the client apparently dropped
the connection. I think you need to show us a complete example of what
you're doing, if the above hint isn't sufficient.

			regards, tom lane
Tom Lane wrote, on 13.11.2008 16:28:
> "Vincent Kessler" <vincent.kessler@quantec-networks.de> writes:
>> i am trying to do large multirow inserts using PQsendPrepare. I have not
>> found a limit in the number of parameters or the size of the querystring,
>> so i assume memory is the limit.
>> When executing the PQsendPrepare function using a querystring of about
>> 100kb in size and about 10000 parameters the function returns after
>> timeout. A tcpdump shows a "parse" message with a length of 100kb but the
>> transfer stops after roughly 30kb.
>
> With such a large statement it's unlikely that the PQsendPrepare call
> would have been able to push all the data out immediately. Since it's
> intended to not block the application, it would return with some data
> still unsent. You need to call PQflush periodically until the data is
> all transmitted, if you want to run in nonblocking mode.

Thank you very much, that was exactly the problem. Everything works
perfectly now.

Regards,
Vincent