On Fri, Nov 2, 2018 at 10:53 AM Tom Lane <t...@sss.pgh.pa.us> wrote:
> Merlin Moncure <mmonc...@gmail.com> writes:
> > On Wed, Oct 31, 2018 at 10:23 AM Andres Freund <and...@anarazel.de> wrote:
> >> It's entirely unacceptable afaict. Besides the whole "exposing
> >> internals" issue, it's also at least not endianness-safe; it depends on
> >> the local alignment requirements (which differ both between platforms and
> >> 32/64 bit), on numeric's internal encoding, and probably more.
>
> > Binary format consuming applications already have to deal with these
> > kinds of issues. We already expose internal structures in the other
> > functions -- not sure why jsonb is held to a different standard.
>
> I don't think it's being held to a different standard at all.  Even for
> data as simple as integers/floats, we convert to uniform endianness on the
> wire.  Moreover, we do not expose the exact bits for anything more complex
> than machine floats.  Numeric, for instance, gets disassembled into fields
> rather than showing the exact header format (let alone several different
> header formats, as actually found on disk).
>
> Andres' point about alignment is a pretty good one as well, if it applies
> here --- I don't recall just what internal alignment requirements jsonb
> has.  We have not historically expected clients to have to deal with that.
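To make Tom's numeric example concrete: the documented binary send format is four big-endian int16 header fields (ndigits, weight, sign, dscale) followed by ndigits base-10000 digits, also big-endian int16s. A minimal client-side decoder might look like this (a sketch only; it ignores NaN and the other special sign values, and `parse_numeric_binary` is a hypothetical helper name, not a libpq function):

```python
import struct
from decimal import Decimal

def parse_numeric_binary(buf):
    """Decode PostgreSQL's binary wire format for numeric.

    The wire form is not the on-disk header: it is four big-endian
    int16 fields (ndigits, weight, sign, dscale) followed by ndigits
    base-10000 digits, each a big-endian int16.  Sketch only: NaN
    (sign 0xC000) and other special values are not handled.
    """
    ndigits, weight, sign, dscale = struct.unpack_from("!hhhh", buf, 0)
    digits = struct.unpack_from("!%dh" % ndigits, buf, 8)
    value = Decimal(0)
    for i, d in enumerate(digits):
        # weight is the base-10000 exponent of the first digit
        value += Decimal(d) * Decimal(10000) ** (weight - i)
    if sign == 0x4000:  # documented value for negative
        value = -value
    # dscale gives the number of decimal digits after the point
    return value.quantize(Decimal(1).scaleb(-dscale)) if dscale else value
```

The point is that a client can reassemble the value from well-defined, endianness-normalized fields without knowing anything about the server's in-memory or on-disk layout.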

I see your (and Andres') point; the binary wire format ought to lie on
top of the basic contracts established by the other types.  It can be
binary; it just can't be a straight memcpy out of the server.  The array
and composite type serializers should give some inspiration there.
I'll still stand by the other point I made, though: I'd really want to
see some benchmarks demonstrating a benefit over competing approaches
that work with the current formats.  That should frame the argument as
to whether this is a good idea.
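For illustration, the array serializer Merlin points to follows the same pattern: a portable big-endian header (ndim, has-null flag, element type OID, then a length and lower bound per dimension) followed by length-prefixed element values. A sketch of encoding a 1-D int4 array (`encode_int4_array` is a hypothetical helper, not part of any client library):

```python
import struct

INT4OID = 23  # pg_type OID for int4

def encode_int4_array(values):
    """Encode a 1-D int4 array in PostgreSQL's binary wire format:
    ndim, has-null flag, element OID, then one (length, lower bound)
    pair per dimension, then each element as a 4-byte length prefix
    plus its binary value -- all big-endian, nothing memcpy'd from
    server memory.  Sketch only: no NULL elements, one dimension."""
    out = struct.pack("!iiIii", 1, 0, INT4OID, len(values), 1)
    for v in values:
        out += struct.pack("!ii", 4, v)  # element length, then int4 value
    return out
```

A jsonb binary format built in this style would carry a similar self-describing structure rather than exposing the server's internal jsonb representation.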

merlin
