FYI:
The problem lies with "IMMUTABLE STRICT", which should NOT be there after
the function body: without it, the function defaults to VOLATILE, which is
what allows the INSERT to run. Now the function works as expected.
quote_literal() should be used for strings containing escape characters.
sSql := 'INSERT INTO ' || schemaName || '.' || blob_table || '( ' ||
quote_ident('file_ext') || '
Hi there, thank you all for the responses to the problem. The problem was
that I missed a ')' for the VALUES (..). Thank you for pointing that out.
Now, another problem is raised: ERROR: INSERT is not allowed in a
non-volatile function.
The language I am using is LANGUAGE 'plpgsql' IMMUTABLE STRICT.
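Putting the thread's two fixes together, a minimal sketch of a working version might look like this (the function, table, and column names here are illustrative, not the original poster's):

```sql
CREATE OR REPLACE FUNCTION insert_blob_meta(schemaName text,
                                            blob_table text,
                                            file_ext   text)
RETURNS void AS $$
DECLARE
    sSql text;
BEGIN
    -- quote_ident() protects identifiers; quote_literal() protects
    -- string values that may contain quotes or escape characters.
    sSql := 'INSERT INTO ' || quote_ident(schemaName) || '.'
         || quote_ident(blob_table) || ' (file_ext) VALUES ('
         || quote_literal(file_ext) || ')';
    EXECUTE sSql;
END;
$$ LANGUAGE plpgsql VOLATILE;  -- not IMMUTABLE: the function modifies data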
Dear all,
I found an article about using PostgreSQL-PostGIS-ArcSDE at
http://support.esri.com/index.cfm?fa=knowledgebase.techarticles.articleShow&d=35385
A few days ago I found the exact same article but with additional steps (the
above article has 5 steps; the one that I found has 7 steps).
I w
Ben,
> I'm aggregating data from shapefiles into a table to cover a larger
> region.
>
> My question is, is there any benefit to be gained by ordering the data
> by some sort of spatial representation, i.e. the centroid, when retrieving
> the data? Or does the creation of the GiST index obviate the need
Ben,
> INSERT INTO gis_roads (country, origin, name, roadtype, the_geom)
> SELECT 'KH', 'MU', userid, code, wkb_geometry FROM gis_roadsk order by
> centroid(wkb_geometry);
>
> and got this message back :
>
> ERROR: new row for relation "gis_roads" violates check constraint
> "enforce_geotype_the_
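For what it's worth, this kind of enforce_geotype failure is usually a geometry-type mismatch between source and target rather than anything to do with the ORDER BY. A sketch of one common diagnosis and fix, assuming the target column enforces MULTILINESTRING (the table and column names are taken from the message above; the fix itself is an assumption):

```sql
-- Inspect which geometry types the source table actually contains:
SELECT GeometryType(wkb_geometry), count(*)
FROM gis_roadsk
GROUP BY 1;

-- If the target enforces MULTILINESTRING but the source holds plain
-- LINESTRINGs, coercing with ST_Multi often resolves the violation:
INSERT INTO gis_roads (country, origin, name, roadtype, the_geom)
SELECT 'KH', 'MU', userid, code, ST_Multi(wkb_geometry)
FROM gis_roadsk;
```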
G'day all,
I'm aggregating data from shapefiles into a table to cover a larger
region.
My question is, is there any benefit to be gained by ordering the data
by some sort of spatial representation, i.e. the centroid, when retrieving
the data? Or does the creation of the GiST index obviate the need
G'day all,
I'm merging a number of data sources, and have endured a fair bit of
learning!
The basic data I had was in shapefiles, so I used the loader
(shp2pgsql), loaded one for each country, and merged them
into one table:
shp2pgsql -s 4326 /Users/19022662/Geodata//LAO
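The usual pattern for merging several shapefiles into one table is to create the table from the first file and append the rest; a sketch (file and database names are illustrative):

```shell
# Create the table from the first shapefile (-c), then append (-a):
shp2pgsql -s 4326 -c laos_roads.shp gis_roadsk | psql -d mydb
shp2pgsql -s 4326 -a cambodia_roads.shp gis_roadsk | psql -d mydb
```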
On Tue, Jan 06, pcr...@pcreso.com wrote:
> I disagree that it is not pertinent to use scale with vector data. I
> think it can be quite relevant:
>
> Much vector data has been digitised from hard copy. Hard copy always
> has a scale, which it is useful to record with the digitised data.
Agreed, b
We're specifically recruiting PostGIS users and developers for talks!
If you have any questions, please feel free to contact me directly.
--
Hello folks,
PGCon 2009 will be held 21-22 May 2009, in Ottawa at the University of
Ottawa. It will be preceded by two days of tutorials on 19-20 May
2009
tla...@gwdg.de wrote:
Error (translated from German): could not load library
»/usr/lib/pgsql/liblwgeom.so«: libgeos_c.so.1: cannot open shared object
file: No such file or directory.
However, /usr/lib/pgsql/liblwgeom.so exists and libgeos_c.so.1 is in
/usr/local/lib. Postgis configure script has found the pa
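When a library in /usr/local/lib is not found at load time, one common fix is to tell the dynamic linker about that directory; a sketch (run as root, and note the exact config layout varies by distribution):

```shell
# Make /usr/local/lib visible to the dynamic linker, then rebuild the cache:
echo '/usr/local/lib' >> /etc/ld.so.conf   # or a file under /etc/ld.so.conf.d/
ldconfig
```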
One comment,
I disagree that it is not pertinent to use scale with vector data. I think it
can be quite relevant:
Much vector data has been digitised from hard copy. Hard copy always has a
scale, which it is useful to record with the digitised data.
Also, Postgis is frequently used as a data s
Hello,
I installed geos-3.0.3, proj-4.6.1, and postgis-1.3.5 on CentOS 5.2.
The PostgreSQL version is 8.1 (which comes with CentOS).
Loading the lwpostgis.sql file gives a lot of errors, so I tried to create
the first function in lwpostgis.sql by hand. I hope the translation from
the German output is
On Tue, Jan 06, erik wrote:
> Unfortunately I don't know how to set a criterion of quality for these
> features that our data providers could respect. Maybe indeed accuracy in
> meters is feasible. Then arises the issue on what margin could be considered
> acceptable.
You can also specify confiden
Thanks for your replies.
Yep, the problem here is that we do perform area calculations on these
polygons, so their precision does have a significant impact.
Unfortunately I don't know how to set a criterion of quality for these
features that our data providers could respect. Maybe indeed accuracy in
meters is feasible. Then arises the issue of what margin could be
considered acceptable.
It might be helpful to think about intended uses of your data. You mention
different digitization methods - if you're concerned that these will impact
data use, then the methods could go in the metadata or somehow be encoded
with the data. If you digitized pond boundaries from aerial photos versus
On Tue, Jan 06, erik wrote:
> Sorry for my ignorance (I'm an eternal newbie) but is there a way to store
> scale information in PostGIS?
If you are talking about metadata, not out of the box. You can of course
just add a column to the table to hold that information.
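A one-line sketch of that suggestion (the table and column names are made up for illustration):

```sql
-- Record the source scale denominator alongside each feature:
ALTER TABLE my_layer ADD COLUMN source_scale integer;  -- e.g. 50000 for 1:50,000
```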
> This leads to my second question
Hello,
Sorry for my ignorance (I'm an eternal newbie) but is there a way to store
scale information in PostGIS?
This leads to my second question: is it pertinent to talk about scale with
electronic GIS vector data and is *scale* the best way to express the
precision of the data given the different
By buffering, do you mean drawing a buffer around a geometry, or
counting things within a buffer?
Look at:
1) ST_Buffer for drawing a buffer --
http://postgis.refractions.net/documentation/manual-svn/ST_Buffer.html
2) ST_DWithin for finding all geometries that fall within a buffer --
http://postgis.refractio
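A quick sketch of both functions in use; the table and column names are illustrative, and this assumes a projected SRID whose unit is metres:

```sql
-- Count points within 100 m of one road (ST_DWithin can use the
-- spatial index, so prefer it over intersecting an ST_Buffer result):
SELECT count(*)
FROM points p, roads r
WHERE r.id = 42
  AND ST_DWithin(p.geom, r.geom, 100);

-- Or materialise the buffer geometry itself:
SELECT ST_Buffer(geom, 100) FROM roads WHERE id = 42;
```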