No problem, sir; hopefully I can help. I failed to mention that I've
discovered some bugs in the PostgreSQL 8.4 XML implementation that forced me
to pause and ultimately forgo XML with Postgres. I haven't looked at 9
yet, but considering the current lack of interest and/or disdain so many h
Hi,
While hashing is certainly a good idea, you really should consider some
issues well before you get to that point. Trust me, this could save you some
headaches. First, though you're probably already aware, two XML documents
can represent the same content with very different literal representations.
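As a concrete illustration (not from the original message; the values are
invented), these two strings describe the same XML document but hash
differently when treated as plain text:

    -- Attribute order and spacing differ, so the md5 of the raw text differs
    -- even though both values describe the same XML document.
    SELECT md5('<doc a="1" b="2"/>') AS h1,
           md5('<doc b="2" a="1" />') AS h2;

So an exact-match or hash-based scheme only works if you control (or
canonicalize) how the XML is serialized before it is stored.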
You can use a hash index for this. Its drawback is that it is not
yet WAL-logged, so if your DB crashes you will need to rebuild the
index to fix the corruption. It works well (and only) with equality
searches. If your scenario requires WAL, use a functional index based
on the hash of the column.
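A minimal sketch of both options, assuming a hypothetical table
docs(xml_text text):

    -- Hash index: equality-only, and (as of 8.4/9.0) not WAL-logged, so it
    -- must be rebuilt with REINDEX after a crash.
    CREATE INDEX docs_xml_hash_idx ON docs USING hash (xml_text);

    -- WAL-safe alternative: a btree index on the md5 of the text.
    CREATE INDEX docs_xml_md5_idx ON docs (md5(xml_text));

    -- Equality lookup that can use the md5 index; the second condition
    -- guards against md5 collisions.
    SELECT *
      FROM docs
     WHERE md5(xml_text) = md5('<doc a="1"/>')
       AND xml_text = '<doc a="1"/>';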
If the performance against an index doesn't cut it, we would be forced
to choose just such an implementation, but if pg can do it straight up
that would be less work for us. A good thing, to be sure.
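The "straight up" version would just be a plain btree index on the text
column itself (again using the hypothetical docs table sketched above);
values in the 100-1000 character range are comfortably under btree's
per-entry size limit:

    -- Plain btree on the text column; exact-match queries can use it directly.
    CREATE INDEX docs_xml_text_idx ON docs (xml_text);

    SELECT * FROM docs WHERE xml_text = '<doc a="1"/>';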
On 11/30/2010 10:50 AM, jose wrote:
> Why don't you use some type of hash like md5 for indexing?
Why don't you use some type of hash like md5 for indexing?
2010/11/30 Rob Sargent :
> Were we to create a table which included a text field for a small block
> of xml (100-1000 chars worth), would an index on that field be useful
> against exact match queries?
>
> We're wondering if a criterion s
Were we to create a table which included a text field for a small block
of xml (100-1000 chars worth), would an index on that field be useful
against exact match queries?
We're wondering if a criterion such as "where 'a string expected to be
of size range 100 to 500' = tabelWithStrings.stringSearc