On Tue, May 14, 2013 at 9:48 AM, kristian kvilekval <k...@cs.ucsb.edu> wrote:
>
> I was really hoping to be able to store several tens of millions of XML
> documents in postgres, but I would also like to use XQuery to retrieve
> results.  Back in 2010 there was some mailing list discussion about
> integrating the XQuery processor of Zorba into postgres.  I was trying to
> gauge the interest level and whether anybody had attempted it.  As you say,
> JSON has stolen all the thunder, and in fact the Zorba people have worked on
> JSONiq (an XQuery processor for JSON data structures), but our project uses
> XML.  We like the flexibility you get with XQuery and I am looking around
> for some solutions.

I'd advise refining your requirements so that you can search your xml
with xpath and then do transformations in sql -- this is more powerful
than it looks on the surface.  If you must use xquery, the shortest
path to get there is probably through a backend pl (say pl/perl or
pl/python) so that you can tap into the fancier xml support there.  If
not xpath, then xslt -- xquery is just a mess IMNSHO and should be
avoided, especially for new programming efforts.
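For example, assuming a table like docs(id int, body xml) -- the table
and element names here are made up -- the xpath-then-sql approach looks
something like:

```sql
-- hypothetical table: docs(id int, body xml)
-- filter with an xpath predicate, then extract a value in plain sql
SELECT id,
       (xpath('/book/title/text()', body))[1]::text AS title
FROM docs
WHERE xpath_exists('/book[author/text()="Smith"]', body);
```

xpath() returns an array of xml nodes, hence the [1] and the cast -- a
little clunky, but it keeps the whole thing in one sql statement and
the planner can still use predicates on other columns.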

JSON is superior in just about every way -- less verbose, more regular
syntax, faster parsing.  JSON is also very tightly integrated into
postgres (especially with the coming 9.3) so that serialization
to/from database data is automagical and essentially eliminates the
need for the querying/construction steps that make dealing with xml
such a bear (especially when serializing from the database).  So much
so that if I were tasked with complex xml transformations in the
database, I'd consider converting the documents to json, doing the
transformation there, then converting back to xml.
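To illustrate the "automagical" serialization (9.3-era functions; the
table and field names are hypothetical):

```sql
-- serialize whole rows straight to json -- no construction step needed
SELECT row_to_json(d) FROM docs d;

-- and with the 9.3 json operators, pull fields back out directly
SELECT doc->>'title' AS title
FROM json_docs
WHERE doc->>'author' = 'Smith';
```

Compare that one-liner row_to_json() call to the xmlelement/xmlforest
construction you'd need to emit equivalent xml.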

merlin


-- 
Sent via pgsql-general mailing list (pgsql-general@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general