Inserting large object in Postgresql using jackc/pgx returns "out of memory (SQLSTATE 54000)" #1283
-
I am using the jackc/pgx library to insert large objects into Postgres. It works fine when the large objects are small, but in one case the large object was almost 1.8 GB in size, and the write operation failed with an "out of memory (SQLSTATE 54000)" error. Here is the code snippet showing how I am inserting blobs.
I get an error on this line
How do I prevent the error and successfully import the large object? I read the post "Inserting Large Object into Postgresql returns 53200 Out of Memory error", which writes the bytes in chunks. Is something similar possible with jackc/pgx? I did not get any response to the same question I posted elsewhere (Inserting large object in Postgresql using jackc/pgx returns "out of memory (SQLSTATE 54000)"), so I am asking it here.
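A minimal sketch of the kind of code being described (assuming pgx v4's LargeObjects API; the function name, connection handling, and file path are illustrative), with the call that fails marked:

```go
package main

import (
	"context"
	"os"

	"github.com/jackc/pgx/v4"
)

// insertBlob stores the whole file as a single large object write.
func insertBlob(ctx context.Context, conn *pgx.Conn, path string) (uint32, error) {
	// Reads the entire file into memory -- ~1.8 GB in the failing case.
	data, err := os.ReadFile(path)
	if err != nil {
		return 0, err
	}

	tx, err := conn.Begin(ctx)
	if err != nil {
		return 0, err
	}
	defer tx.Rollback(ctx)

	los := tx.LargeObjects()
	oid, err := los.Create(ctx, 0) // 0 lets the server pick the OID
	if err != nil {
		return 0, err
	}

	obj, err := los.Open(ctx, oid, pgx.LargeObjectModeWrite)
	if err != nil {
		return 0, err
	}

	// Single Write of the entire buffer -- this is the call that fails with
	// "out of memory (SQLSTATE 54000)" for very large payloads.
	if _, err := obj.Write(data); err != nil {
		return 0, err
	}

	if err := obj.Close(); err != nil {
		return 0, err
	}
	return oid, tx.Commit(ctx)
}
```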
Replies: 2 comments
-
Presumably you could call Write multiple times, passing a smaller chunk of the data on each call instead of the entire payload at once.
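One way to apply that suggestion (a sketch, not jackc's literal code): because *pgx.LargeObject implements io.Writer, the file can be streamed into it with io.Copy, which performs many small writes instead of a single huge one.

```go
package main

import (
	"context"
	"io"
	"os"

	"github.com/jackc/pgx/v4"
)

// streamBlob copies the file into an existing large object via io.Copy,
// so the data goes over the wire in many small writes rather than one.
func streamBlob(ctx context.Context, tx pgx.Tx, oid uint32, path string) error {
	f, err := os.Open(path)
	if err != nil {
		return err
	}
	defer f.Close()

	los := tx.LargeObjects()
	obj, err := los.Open(ctx, oid, pgx.LargeObjectModeWrite)
	if err != nil {
		return err
	}

	// *pgx.LargeObject implements io.Writer, so io.Copy streams the file
	// through a small buffer instead of issuing one huge Write.
	if _, err := io.Copy(obj, f); err != nil {
		return err
	}
	return obj.Close()
}
```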
-
Thanks @jackc.
The obj maintains its current position at the end of the previous write. So now I read the file in chunks and write one chunk at a time with the same obj, repeating until EOF is reached.
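A minimal sketch of that loop (again assuming pgx v4; the helper name and chunk size are illustrative):

```go
package main

import (
	"context"
	"io"
	"os"

	"github.com/jackc/pgx/v4"
)

const chunkSize = 1 << 20 // 1 MiB per write; illustrative value

// copyInChunks reads the file in fixed-size chunks and writes each chunk
// with the same *pgx.LargeObject, whose write position advances after
// every Write.
func copyInChunks(ctx context.Context, tx pgx.Tx, oid uint32, path string) error {
	f, err := os.Open(path)
	if err != nil {
		return err
	}
	defer f.Close()

	los := tx.LargeObjects()
	obj, err := los.Open(ctx, oid, pgx.LargeObjectModeWrite)
	if err != nil {
		return err
	}

	buf := make([]byte, chunkSize)
	for {
		n, rerr := f.Read(buf)
		if n > 0 {
			// Each Write continues from where the previous one ended.
			if _, werr := obj.Write(buf[:n]); werr != nil {
				return werr
			}
		}
		if rerr == io.EOF {
			break
		}
		if rerr != nil {
			return rerr
		}
	}
	return obj.Close()
}
```

Any chunk size comfortably below the server's per-message limits works here; something on the order of 1 MiB keeps both client memory use and the number of round trips modest.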