Hello, this is a report of a problem in an extreme situation.

When the parser encounters a very long token, it returns an error
message that is likely to be mysterious to users.

> $ perl -e "print \"select '\" . \"x\"x(512*1024*1024) . \"'\"" | psql postgres
> ERROR:  invalid memory alloc request size 1073741824

Since the parser grows its buffer by repeated repallocs, doubling
the size starting from 1024 bytes, it practically fails for tokens
longer than 512MiB. Most tokens never get near the limit, but a
text literal may.
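
Just to illustrate, here is a minimal standalone model of that
growth pattern (not the actual scanner code; MAX_ALLOC_SIZE below
mirrors PostgreSQL's MaxAllocSize, 0x3fffffff; the scanner's real
bookkeeping is slightly stricter, which is why the exactly-512MiB
literal above already fails):

#include <stdio.h>
#include <stddef.h>

#define MAX_ALLOC_SIZE ((size_t) 0x3fffffff)    /* as in memutils.h */

int
main(void)
{
    /* a literal just over 512MiB */
    size_t token_len = (size_t) 512 * 1024 * 1024 + 1;
    size_t buf_size = 1024;     /* the scanner's initial buffer */

    while (buf_size < token_len)
    {
        buf_size *= 2;          /* repalloc() to the doubled size */
        if (buf_size > MAX_ALLOC_SIZE)
        {
            /* this is the "invalid memory alloc request size" case */
            printf("invalid memory alloc request size %zu\n", buf_size);
            return 1;
        }
    }
    printf("a %zu-byte token fits in a %zu-byte buffer\n",
           token_len, buf_size);
    return 0;
}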

If we don't regard this as a problem, the following discussion is
pointless.

========
1. Supplying a context message or something similar would be
   needed anyway, for example:

  > ERROR:  invalid memory alloc request size 1073741824
  > DETAIL: possibly encountered a too long token while parsing the query
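
  One way to get something like that would be for the scanner to
  complain by itself before it hands an oversized request to the
  allocator (so it raises its own message rather than decorating
  the allocator's one).  A rough sketch of such a check, to go
  where scan.l grows its literal buffer; the variable name
  newalloc is hypothetical:

    /* hypothetical guard before repalloc()ing the literal buffer */
    if (newalloc > MaxAllocSize)
        ereport(ERROR,
                (errcode(ERRCODE_PROGRAM_LIMIT_EXCEEDED),
                 errmsg("string constant too long"),
                 errdetail("String constants are limited to %d bytes during parsing.",
                           (int) MaxAllocSize)));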


2. Edit the documentation, or modify the behavior?

 http://www.postgresql.org/docs/devel/static/datatype-character.html 

 >  In any case, the longest possible character string that can be
 >  stored is about 1 GB.

 The maximum possible text length is about 1GB, but the effective
 limit is far shorter when parsing a text literal.

 I thought it better to avoid the 512MiB limit than to add a
 description of it to the documentation.  That could easily be done
 by capping the growth at MaxAllocSize (I don't like this, but
 enlargeStringInfo already does it..), or by starting from
 (1024 - 1) and repallocing to ((last_size + 1) * 2 - 1), which
 works assuming MaxAllocSize is 2^n - 1 (this looks needlessly
 complex), or simply starting from 1023 works well enough (see the
 sketch at the end of this item).

 On the other hand, with any of these fixes applied I then got an
 "out of memory" failure in the receive buffer on my environment,
 so such a fix might be useless anyway.
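
 Here is the sketch mentioned above: a standalone model (again not
 the scanner code itself) of the 1023-based growth, showing that
 the doubling sequence then tops out exactly at MaxAllocSize
 instead of overshooting it:

#include <stdio.h>
#include <stddef.h>

#define MAX_ALLOC_SIZE ((size_t) 0x3fffffff)    /* PostgreSQL's MaxAllocSize */

int
main(void)
{
    size_t buf_size = 1023;     /* start from 1024 - 1 instead of 1024 */

    /* grow with (last + 1) * 2 - 1: 1023, 2047, 4095, ..., 0x3fffffff */
    while (buf_size < MAX_ALLOC_SIZE)
        buf_size = (buf_size + 1) * 2 - 1;

    /* the sequence lands exactly on the limit, never above it */
    printf("final buffer size = %zu (MaxAllocSize = %zu)\n",
           buf_size, (size_t) MAX_ALLOC_SIZE);
    return 0;
}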

Thoughts? Suggestions? Or is this totally useless?


regards,

-- 
Kyotaro Horiguchi
NTT Open Source Software Center


