Re: [HACKERS] large object regression tests

2006-09-09 Thread Lamar Owen
On Tuesday 05 September 2006 02:59, Jeremy Drake wrote:
> I am considering, and I think that in order to get a real test of the
> large objects, I would need to load data into a large object which would
> be sufficient to be loaded into more than one block (large object blocks
> were 1 or 2K IIRC) so that the block boundary case could be tested.  Is
> there any precedent on where to grab such a large chunk of data from?  I
> was thinking about using an excerpt from a public domain text such as Moby
> Dick, but on second thought binary data may be better to test things with.

A 5 or 6 megapixel JPEG image.  Maybe a photograph of an elephant.
-- 
Lamar Owen
Director of Information Technology
Pisgah Astronomical Research Institute
1 PARI Drive
Rosman, NC  28772
(828)862-5554
www.pari.edu



Re: [HACKERS] large object regression tests

2006-09-08 Thread Markus Schaber

Hi, Jeremy,
Jeremy Drake wrote:

> I am considering, and I think that in order to get a real test of the
> large objects, I would need to load data into a large object which would
> be sufficient to be loaded into more than one block (large object blocks
> were 1 or 2K IIRC) so that the block boundary case could be tested.  Is
> there any precedent on where to grab such a large chunk of data from?

You could generate such data on the fly, as part of the test scripts.

E.g. a blob of zero bytes, a blob of 0xff bytes, a blob of pseudo-random
data...
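
For instance, a minimal sketch along those lines (hedged: it assumes the
server-side lo_creat/lo_open/lowrite functions, with 131072 = INV_WRITE
from libpq-fs.h; the eventual test could of course differ):

    -- Build blobs on the fly that exceed the ~2K LO block size, so that
    -- reads and writes cross a block boundary.
    BEGIN;
    CREATE TEMP TABLE lotest (name text, loid oid);
    INSERT INTO lotest VALUES ('zeros', lo_creat(-1));
    INSERT INTO lotest VALUES ('ones',  lo_creat(-1));
    -- 5000 bytes of 0x00 and 0xff; decode(..., 'hex') builds the bytea
    SELECT lowrite(lo_open(loid, 131072), decode(repeat('00', 5000), 'hex'))
      FROM lotest WHERE name = 'zeros';
    SELECT lowrite(lo_open(loid, 131072), decode(repeat('ff', 5000), 'hex'))
      FROM lotest WHERE name = 'ones';
    COMMIT;
    -- for pseudo-random data, call setseed() first so the output (and the
    -- expected file) stays reproducible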

Markus
-- 
Markus Schaber | Logical Tracking&Tracing International AG
Dipl. Inf. | Software Development GIS

Fight against software patents in EU! www.ffii.org www.nosoftwarepatents.org





Re: [HACKERS] large object regression tests

2006-09-07 Thread Tom Lane
Jeremy Drake <[EMAIL PROTECTED]> writes:
> I noticed when I was working on a patch quite a while back that there are
> no regression tests for large object support.

Yeah, this is bad :-(

> I am considering, and I think that in order to get a real test of the
> large objects, I would need to load data into a large object which would
> be sufficient to be loaded into more than one block (large object blocks
> were 1 or 2K IIRC) so that the block boundary case could be tested.  Is
> there any precedent on where to grab such a large chunk of data from?

There's always plain old junk data, e.g., repeat('xyzzy', 100000).
I doubt that Moby Dick would expose any unexpected bugs ...
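
To make that concrete (an illustrative sketch, not from the thread: the
repeat count must be large enough to cross the ~2K block boundary, and
131072 is INV_WRITE):

    -- 'xyzzy' is 5 bytes, so this writes ~500 kB into a fresh large
    -- object, spanning many 2K blocks; decode(..., 'escape') yields bytea
    BEGIN;
    SELECT lowrite(lo_open(lo_creat(-1), 131072),
                   decode(repeat('xyzzy', 100000), 'escape'));
    COMMIT;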

> ... I find that it is necessary to stash certain values across
> statements (large object ids, large object 'handles'), and so far I am
> using a temporary table to store these.  Is this reasonable, or is there a
> cleaner way to do that?

I think it's supposed to be possible to use psql variables for that;
if you can manage to test psql variables as well as large objects,
that'd be a double bonus.
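
A minimal sketch of the psql-variable approach (hedged: \gset only appeared
in psql 9.3, well after this thread; the era-appropriate mechanism is
\lo_import, which sets the :LASTOID variable):

    -- stash the new large object's OID in a psql variable, not a temp table
    SELECT lo_creat(-1) AS loid \gset
    BEGIN;
    SELECT lowrite(lo_open(:loid, 131072), decode(repeat('ab', 3000), 'hex'));
    COMMIT;
    SELECT lo_unlink(:loid);
    -- older alternative (file name is hypothetical):
    -- \lo_import 'testdata.bin'
    -- SELECT loread(lo_open(:LASTOID, 262144), 100);  -- 262144 = INV_READ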

regards, tom lane
