I am hitting a hang condition every time I try to retrieve large
records' (bytea) data from a table.
The OS is Solaris (5.11 snv_134 i86pc i386 i86pc) with 4 GB of memory,
running PostgreSQL 8.4.3 with a standard postgresql.conf file (nothing
has been changed).
I have the following table called doc_table
You could take a backup of this table, then search the dump for the
offending documents with UltraEdit and remove them.
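
If the cleanup has to stay in the database, a similar thing can be done
in plain SQL (a sketch, assuming the table is doc_table with the bytea
column doc_data as mentioned below, and that anything over 100 MB should go):

-- Keep a copy of the table before touching anything.
CREATE TABLE doc_table_backup AS SELECT * FROM doc_table;

-- Remove the oversized rows; length() on a bytea value is its size in bytes.
DELETE FROM doc_table WHERE length(doc_data) > 100 * 1024 * 1024;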
2011/7/5, jtke...@verizon.net:
> I am hitting a hang condition every time I try to retrieve large
> records' (bytea) data from a table.
> The OS is Solaris (5.11 snv_134 i86pc i386 i86pc) with 4 GB of memory
jtke...@verizon.net, 05.07.2011 18:44:
A while ago some developers inserted several records, each with a
document (stored in doc_Data) of around 400-450 MB. Now when you do a
SELECT * from this table, the query hangs and the system becomes
unresponsive.
What application/program is being used to retrieve the data?
He could use something like this to get a feel for the sizes:

SELECT count(*),
       CASE WHEN length(doc_data) <  50 * 1024 * 1024 THEN '<=50 MB'
            WHEN length(doc_data) < 100 * 1024 * 1024 THEN '<=100 MB'
            ELSE '>100 MB'
       END
FROM doc_table
GROUP BY 2;

and then, based on the result, run finer queries to find the large rows.
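
For the finer pass, something like this lists the heaviest rows first
(a sketch; the id column is hypothetical, substitute the table's real
primary key):

SELECT id,
       pg_size_pretty(length(doc_data)::bigint) AS doc_size  -- bytes, printed human-readably
FROM doc_table
ORDER BY length(doc_data) DESC
LIMIT 10;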
However, I don
All the rows of an ordinary query are accumulated before the result set
reaches the client application. It is probably spooling those several
400-450 MB docs, plus all the other attributes, to a temporary file prior
to sending the results back. With just three such documents stored in the
database, that is already well over a gigabyte to buffer on a machine
with 4 GB of memory.
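
One standard way around that is to declare a cursor and fetch the rows in
small batches, so the whole result set is never materialized at once (a
minimal sketch, assuming the doc_table name from above; doc_cur is just
an illustrative cursor name):

BEGIN;
DECLARE doc_cur CURSOR FOR SELECT * FROM doc_table;
FETCH 1 FROM doc_cur;   -- repeat until no row comes back
CLOSE doc_cur;
COMMIT;

Each FETCH pulls one row across the wire, so even the 400 MB documents
are handled one at a time instead of all together.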