Re: [gdal-dev] How do I add a projection to proj 8?

2024-04-13 Thread Stephen Woodbridge via gdal-dev

On 4/13/2024 1:26 PM, Javier Jimenez Shaw wrote:


On Sat, 13 Apr 2024 at 17:35, Stephen Woodbridge via gdal-dev 
 wrote:


Thanks, this is NOT the standard Web Mercator projection. I am
aware of EPSG:900913 and EPSG:3857. This projection is used with
HYCOM data that I have extracted into GeoTIFF files so that I can
accurately project it onto EPSG:3857. It took some fiddling with
the values to get it to overlay correctly. HYCOM data is
weird in that it uses two different projections depending on
whether the data is above or below a certain latitude.

I found something like that on the Internet, but I was not sure it was 
the right one: https://polar.ncep.noaa.gov/global/about/ It doesn't 
specify the projections, just a short description: an Arctic bi-polar 
patch north of 47N, and Mercator south of it.


Yes, this is what is happening with the HYCOM data. The projection 
definition in PROJ.4 worked quite well for my case of web mapping in 
MapServer.


I do not know if you can specify a projection "in parts" with respect 
to parallel 47N. Maybe in WKT2.
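Since a single PROJ string cannot express a latitude-dependent projection, in practice the choice has to be made in application code. A minimal sketch (the Mercator string is the one used in this thread; the 47N cutoff comes from the NOAA description; the polar definition is a placeholder, since the thread never gives it):

```python
# Sketch: pick the HYCOM CRS by latitude. The Mercator string is from this
# thread; ARCTIC_PATCH is a placeholder (the real bi-polar patch definition
# is not given anywhere in the thread).
HYCOM_MERCATOR = ("+proj=merc +a=6371001 +b=6371001 +lat_ts=0.0 +lon_0=0.0 "
                  "+x_0=-4448 +y_0=-4448 +k=1.0 +units=m +over "
                  "+nadgrids=@null +no_defs")
ARCTIC_PATCH = "+proj=stere +lat_0=90 ..."  # placeholder, NOT the real definition

def crs_for_latitude(lat_deg):
    """South of 47N HYCOM uses Mercator; at and above it, the Arctic patch."""
    return HYCOM_MERCATOR if lat_deg < 47.0 else ARCTIC_PATCH
```

Anything consuming HYCOM grids would then split its data at 47N and reproject each part with the matching definition.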


Where did you find those parameters for the datum and projection? 
They are quite strange.


For the ellipsoid there are a few already with a similar radius:

SELECT * from ellipsoid where semi_major_axis BETWEEN 6371000 and 6371010

EPSG  7035    Sphere                                                    PROJ  EARTH  6371000.0     EPSG  9001  6371000.0     1
EPSG  7048    GRS 1980 Authalic Sphere                                  PROJ  EARTH  6371007.0     EPSG  9001  6371007.0     0
ESRI  107047  Sphere_GRS_1980_Mean_Radius (mean radius based on GRS80)  PROJ  EARTH  6371008.7714  EPSG  9001  0.0           0


And datums

SELECT * from geodetic_datum where ellipsoid_code in (7035, 7048, 107047)

EPSG  6035    Not specified (based on Authalic Sphere)                      EPSG  7035    EPSG  8901  1
EPSG  6047    Not specified (based on GRS 1980 Authalic Sphere)             EPSG  7048    EPSG  8901  1
ESRI  106047  D_Sphere_GRS_1980_Mean_Radius (GRS 1980 Mean Radius Sphere)   ESRI  107047  EPSG  8901  0


and geographic crs

SELECT * from geodetic_crs where datum_code in (6035, 6047, 106047)

EPSG  4035    Unknown datum based upon the Authalic Sphere               geographic 2D  EPSG  6402  EPSG  6035    1
EPSG  4047    Unspecified datum based upon the GRS 1980 Authalic Sphere  geographic 2D  EPSG  6422  EPSG  6047    1
ESRI  104047  GCS_Sphere_GRS_1980_Mean_Radius                            geographic 2D  EPSG  6422  ESRI  106047  0

See that the EPSG ones are deprecated (the trailing 1). Surprisingly, 
the ellipsoid EPSG:7048 is not deprecated, but the datum that uses it is.


On 4/13/2024 4:19 AM, Javier Jimenez Shaw via gdal-dev wrote:

If what you need is really EPSG:3857, yes, use it.

However, I have seen strange parameters in your projection. The
radius of the sphere is the "average" 6371 km, and you set a
false easting and northing of just 4.4 km. Is that trying to
correct the radius of the sphere? I do not know why you need that.

Bas, are they really equivalent?

In proj you can convert to WKT1 (see that I added +type=crs):

projinfo "+proj=merc +a=6371001 +b=6371001 +lat_ts=0.0 +lon_0=0.0
+x_0=-4448 +y_0=-4448 +k=1.0 +units=m +over +nadgrids=@null
+no_defs  +type=crs" -o wkt1_gdal
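For intuition, the spherical Mercator math encoded by that PROJ string can be written out in a few lines of pure Python (a sketch only: it uses the radius and false offsets from the string, ignores +over and +nadgrids=@null, and is not a substitute for PROJ itself):

```python
import math

R = 6371001.0      # sphere radius: +a=6371001 +b=6371001
X0 = Y0 = -4448.0  # false easting/northing: +x_0=-4448 +y_0=-4448

def merc_forward(lon_deg, lat_deg):
    """Forward spherical Mercator (+proj=merc, k=1, lat_ts=0) plus offsets."""
    lam = math.radians(lon_deg)
    phi = math.radians(lat_deg)
    x = R * lam + X0
    y = R * math.log(math.tan(math.pi / 4.0 + phi / 2.0)) + Y0
    return x, y
```

At (0, 0) this yields (-4448, -4448): the false offsets shift the whole grid by about 4.4 km relative to a plain spherical Mercator, which is presumably the "fiddling with the values" mentioned earlier in the thread.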


OK, I get this by adding +type=crs, but how do I add it to the PROJ
database so I can access it by referencing something like
EPSG:900914?


Is a WKT string enough?

PROJCS["unknown",
    GEOGCS["unknown",
        DATUM["unknown using nadgrids=@null",
            SPHEROID["unknown",6371001,0]],
        PRIMEM["Greenwich",0,
            AUTHORITY["EPSG","8901"]],
        UNIT["degree",0.0174532925199433,
            AUTHORITY["EPSG","9122"]]],
    PROJECTION["Mercator_1SP"],
    PARAMETER["central_meridian",0],
    PARAMETER["scale_factor",1],
    PARAMETER["false_easting",-4448],
    PARAMETER["false_northing",-4448],
    UNIT["metre",1,
        AUTHORITY["EPSG","9001"]],
    AXIS["Easting",EAST],
    AXIS["Northing",NORTH],
    EXTENSION["PROJ4","+proj=merc +a=6371001 +b=6371001 +lat_ts=0.0 
+lon_0=0.0 +x_0=-4448 +y_0=-4448 +k=1.0 +units=m +over +nadgrids=@null 
+no_defs"]]


(Generated with the projinfo line from the previous email.) Using a 
geographic CRS as described above will make it nicer, and probably more 
compatible.


You can use it in QGIS, for instance. I am not sure how it behaves 
in a GeoTIFF, as GeoTIFF has some special tags. You can try to generate 
the GeoTIFF.

Re: [gdal-dev] How do I add a projection to proj 8?

2024-04-13 Thread Stephen Woodbridge via gdal-dev
Thanks, this is NOT the standard Web Mercator projection. I am aware of 
EPSG:900913 and EPSG:3857. This projection is used with HYCOM data that I 
have extracted into GeoTIFF files so that I can accurately project it 
onto EPSG:3857. It took some fiddling with the values to get it to overlay 
correctly. HYCOM data is weird in that it uses two different 
projections depending on whether the data is above or below a certain latitude.


On 4/13/2024 4:19 AM, Javier Jimenez Shaw via gdal-dev wrote:

If what you need is really EPSG:3857, yes, use it.

However, I have seen strange parameters in your projection. The radius 
of the sphere is the "average" 6371 km, and you set a false easting 
and northing of just 4.4 km. Is that trying to correct the radius of 
the sphere? I do not know why you need that.


Bas, are they really equivalent?

In proj you can convert to WKT1 (see that I added +type=crs):

projinfo "+proj=merc +a=6371001 +b=6371001 +lat_ts=0.0 +lon_0=0.0 
+x_0=-4448 +y_0=-4448 +k=1.0 +units=m +over +nadgrids=@null +no_defs 
 +type=crs" -o wkt1_gdal


OK, I get this by adding +type=crs, but how do I add it to the PROJ 
database so I can access it by referencing something like EPSG:900914?


Thanks,
  -Steve


On Sat, 13 Apr 2024 at 06:17, Sebastiaan Couwenberg via gdal-dev 
 wrote:


On 4/12/24 11:24 PM, Stephen Woodbridge via gdal-dev wrote:
> and was able to access it in gdal, mapserver, postgis, etc with
> "EPSG:900914"

I used to do that too, but switched to EPSG:3857, its non-deprecated
equivalent. I would recommend that instead of trying to keep using a
non-standard projection.

Kind Regards,

Bas

-- 
  GPG Key ID: 4096R/6750F10AE88D4AF1

Fingerprint: 8182 DE41 7056 408D 6146  50D1 6750 F10A E88D 4AF1

___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
https://lists.osgeo.org/mailman/listinfo/gdal-dev







[gdal-dev] How do I add a projection to proj 8?

2024-04-12 Thread Stephen Woodbridge via gdal-dev

Hi all,

I've been gone for a while, but got called back to update a site I built 
and need to move from PROJ.4 to PROJ 8 on Ubuntu 22.04. In the past I 
just added the following to /usr/share/proj/epsg:


# HYCOM Mercator projection
<900914> +proj=merc +a=6371001 +b=6371001 +lat_ts=0.0 +lon_0=0.0 
+x_0=-4448 +y_0=-4448 +k=1.0 +units=m +over +nadgrids=@null +no_defs  <>


and was able to access it in GDAL, MapServer, PostGIS, etc. with 
"EPSG:900914".


How does one do that with the new system?

Thanks,
  -Steve W



Re: [gdal-dev] Considering drivers removal ?

2021-01-27 Thread Stephen Woodbridge
I think a 4th option is a hybrid approach: moving to a more modular plug-in 
architecture allows the core more flexibility to evolve, while at the same time 
plug-in drivers allow more independent development, testing, and releases, and 
let the community participate. This also relieves the core of maintaining some 
of the drivers. I don't see value in maintaining obsolete drivers, or drivers 
that only a few people use, if it costs us a lot to maintain them.

That said, figuring out the long-term funding for the project core is critical. 

Unfortunately I'm only in a position to offer an opinion and not much more, so 
feel free to ignore it. 

Best regards,
-Steve W

Sent from my iPhone

> On Jan 27, 2021, at 12:28 PM, Howard Butler  wrote:
> 
> 
>> 
>> but from my point of view supporting a lot of formats is part of GDAL's 
>> success,
> 
> GDAL is a 22 year old software project. It's not just that GDAL supports lots 
> of formats. It is also that the code supporting all of those formats is 
> meticulously maintained, and it maintains *good* support for all of those 
> formats. The bulk of that meticulous maintenance has not been evenly carried 
> by the individuals, organizations, and companies that have been enjoying the 
> benefits from it. GDAL maintenance as currently happen(ed) is unsustainable 
> in all of the ways you read about in handwringing think pieces on the 
> internet about open source software projects.
> 
>> so maybe the real focus should be on implementing a plugin mechanism that 
>> would allow driver development independently from core GDAL.
> 
> As I see it, the project has three potential futures:
> 
> 1) Continue the current architectural and niche trajectory. A one-stop-shop 
> for geospatial formats that is conveniently distributed in all relevant 
> platforms.
> 2) Split GDAL/OGR core from the drivers so that each can evolve and be 
> maintained at their own pace according to the attention they can attract.
> 3) Let GDAL rot as-is with low wattage community maintenance and exist as a 
> zombie gut pile of useful code that organizations continue to pull from and 
> incorporate into their own software. 
> 
> I think we as a community want the status quo – #1 – all of the goodness that 
> GDAL provides by a complete implementation of the geospatial format universe 
> all in one spot. As should be becoming clear from these threads, this scenario 
> is not likely to continue due to the three-jobs problem [1] I described 
> earlier in the thread. Our options to maintain this status quo are for the 
> community to provide the revenue stream for someone to do just the maintainer 
> job, to effectively split the maintenance activities, or to find another Even 
> that wants three jobs :)
> 
> The second scenario has the potential to make it easier to share the 
> maintenance burden, but it cleaves off what many see as GDAL's best feature – 
> universality – by making support for specific formats be a packager's or a 
> user's burden. It would limit the GDAL platform leverage that vendors 
> currently get by injecting support for their proprietary SDKs for the project 
> to carry, and the impact and station of GDAL is likely to be reduced by this 
> approach. Maybe that could be a good thing.
> 
> The third scenario is a common one. Organizations with the need and resources 
> to internally spend will continue to maintain GDAL in their (closed) 
> codebases. The software-based interoperability that GDAL provides the 
> industry will diminish, and the existing tree will reach a kind of stasis 
> with open source distributors until the bugs accumulate in frequency and 
> scope to cause it to get dropped. 
> 
> 
> Howard
> 
> [1] https://lists.osgeo.org/pipermail/gdal-dev/2021-January/053302.html


Re: [gdal-dev] Driver maintenance - long-term solution ?

2021-01-15 Thread Stephen Woodbridge
So reading through this thread, my cynical side agrees with the "this 
project is dying from lack of funding, ..." approach, but I'm not sure 
that works in the long run, as people get tired of hearing "the sky is 
falling" and ignore it over time.


A different approach could be something along the lines of what AARP 
does to get its members to email congressional representatives to lobby 
for support. OK, this might sound crazy and it would require some 
coordination, but it might work like this:


1) we get government agencies that use GDAL, and more broadly OSGeo, to push 
up the chain of command the need to support open-source projects and the 
current issues in doing so
2) we put up a support web page that allows community members and the public 
to email a form letter to their congresspeople explaining that a large 
number of government agencies (we'd probably need a list of them) need and 
use open-source software but are not able to support its development or 
maintenance. This is a critical issue for the US remaining a leader in 
open-source development. If this software becomes unavailable or 
unsupported, it puts these agencies at risk, blah, blah, blah


Apologies to readers in other countries who disagree with the statements 
above; this is an appeal to the US Congress, so it's slanted in that 
direction. No offense is meant here.


Likewise, this could be done in other regions/countries that rely on 
this software.


Commercial companies spend money on lobbying and sales calls, so maybe we 
need to find a strategy that works for open source.


Anyway, it's an idea, maybe it's not workable because we don't have 
funding to put something like this in place or people don't think it is 
of value. I don't think this should be done at the GDAL level, but maybe 
at OSGeo and maybe in coordination with other opensource organizations. 
So just putting another idea out there.


-Steve W


Re: [gdal-dev] Considering drivers removal ?

2021-01-12 Thread Stephen Woodbridge

On 1/12/2021 5:36 PM, Even Rouault wrote:

On Tuesday 12 January 2021 12:56:13 CET Frank Warmerdam wrote:

On Tue, Jan 12, 2021 at 12:38 PM Howard Butler  wrote:

The only question that matters here is "Who is going to maintain it?" and
if the answer to that is "no one", it should be removed. There don't need
to be any meetings, because the only criterion that matters is whether
someone is willing to maintain it. We should provide the list of drivers
and assign the GitHub handles that step forward to be responsible for each.
If obscure government one-off formats have an audience of downtrodden
government users forced to use them, they need to put their handle in and
take ownership. They then need to find the time, money, or attention to
carry things forward.

Howard / Even,

I'd be willing to commit to maintaining some of the archaic drivers that
meet the conditions I mentioned (buildable, testable in core build). If
Even would like, I can provide a sublist of those he proposed that I'd be
willing to be responsible for.

NTv1: this is the perfect example of a driver of absolutely no use in 2021.
Unless I'm wrong, there was only one single public dataset for that format,
ntv1_can.dat, and it is now available as GeoTIFF in
https://github.com/OSGeo/PROJ-data/blob/master/ca_nrc/ca_nrc_ntv1_can.tif

My current plan is:

- for BPG, E00GRID, EPSILON, IGNFHeightASCIIGrid, ISG, Aeronav FAA, BNA, HTF,
OpenAir, SEG-P1, SEG-Y, SUA, X-Plane, move driver code, documentation
and tests *now* to https://github.com/OSGeo/gdal-extra-drivers which is a slightly
improved version of a cemetery repository, since it includes a build script to
create a plugin. I have no plan to maintain that repository after that initial
move (that means I won't merge pull requests unless someone else steps up for
the role) and it will likely break in the future. I wish we would agree to
move more drivers there, and probably most future drivers for esoteric formats
should go there.
Those drivers are ones I've authored, that received no significant
contribution from anyone else AFAIR, that no one paid development for, and that
I suspect are close to unused. So hopefully no one should have bad feelings about
them going away. It was bad taste on my part to have put them in GDAL to start with.
Why did I pick the extra repository after all? A tiny fraction of them
might be useful, like ISG or IGNFHeightASCIIGrid in some contexts (to create
grids for PROJ), but definitely not for general-purpose use. So as far as I'm
concerned, I'll go through the extra step of building the extra repository, or
a subset of it, if I have an occasional need for them.

- proceed as I mentioned initially for the other drivers I listed where no one
steps up to maintain them, with the variation of moving the code to gdal-extra-
drivers instead of just removing it (but potentially not including a build
recipe for them in the build script, if that proves to be too complex).

The issue with esoteric/legacy drivers is not so much maintenance of the
actual driver code, in the sense of dealing with bug reports,
questions, etc. (pretty sure there are none for the ones I listed). Most of
them must work reasonably well and be feature complete, and most
vulnerabilities have now been fixed. My concern is that this legacy code has
indirect costs for other GDAL developers and users: the psychological cost I
mentioned. Let's say someone wants to turn on higher warning levels, and that
this breaks in tens of drivers. Would he have to ping every maintainer and
wait for them to address the issue? Or maybe he will just give up. Similarly
for breaking changes in the driver API. As Sean mentioned, this is probably a
serious obstacle to growing the core development team.


Even


Even,

Just want to say this sounds like a good plan to me, not that my input 
means a lot. I also want to say thank you for all your hard work 
supporting this and other projects, and for answering my questions through 
the years. I've had a lot of roles in my career in open source and 
industry, and I can appreciate the difficult balance between compatibility 
with legacy code and the need to break free of it to move forward. It's 
hard, and I never enjoyed having to make those decisions, but you have my 
respect and support whatever you decide.


Thank You! again for your efforts and support!

-Steve W


Re: [gdal-dev] Considering drivers removal ?

2021-01-10 Thread Stephen Woodbridge

Even,

This makes a lot of sense to me. How would you handle this in Python?
Would it make sense to create a GDAL-removed repository and move stuff 
into it, just so it is available if someone wants it? It would not be 
supported or updated by GDAL; it would just be available if someone 
wants or needs to fork it. Or something else like this?


-Steve W

On 1/10/2021 6:02 PM, Even Rouault wrote:

Hi,

It's not spring yet, but I'm in a mood lately of axing useless things, and we
probably have tons of candidates for that in GDAL, especially in drivers.
I was going to just axe the DB2 driver
(https://github.com/OSGeo/gdal/pull/3366), but the issue is more general.

Any idea how we can know what is used and what isn't? A "call-home"
functionality where we would track driver usage would only be acceptable if
people enable it and have network connectivity, so we probably won't get lots
of feedback. Having a spreadsheet with the driver list and asking people to
fill it in would probably also receive little feedback. So the idea I had was to
do something like the following in the Open() method of a candidate for
removal:

GDALDataset* FooDriver::Open( GDALOpenInfo* poOpenInfo )
{
    if( !Identify(poOpenInfo) )
        return nullptr;

    if( !CPLTestBool(CPLGetConfigOption("GDAL_ENABLE_DRIVER_FOO", "NO")) )
    {
        CPLError(CE_Failure, CPLE_AppDefined,
            "Driver FOO is considered for removal in GDAL 3.5. You are invited "
            "to convert any dataset in that format to another more common one. "
            "If you need this driver in future GDAL versions, create a ticket at "
            "https://github.com/OSGeo/gdal (look for an existing one first) to "
            "explain how critical it is for you (but the GDAL project may still "
            "remove it). To enable it now, set the GDAL_ENABLE_DRIVER_FOO "
            "configuration option / environment variable to YES");
        return nullptr;
    }
    ...
}

That is, when we detect a file to be handled by the driver, emit the above
error message and do not open the dataset, unless the user defines the
environment variable.
Similarly in the Create()/CreateCopy() methods.
If we ship this in 3.3, with a 3.5 milestone for removal, this would offer a
feedback period of one year / 2 feature versions.

Here's my own list of candidates for retirement (probably over-conservative),
mostly based on gut feeling. None of them are particularly bad citizens, but I
have no indication that they are still used, which doesn't mean they aren't.

* Raster side:
BPG
DB2Raster
DOQ1
DOQ2
E00GRID
Epsilon
FujiBAS
GS7BG
GSAG
IDA
JDEM
JPEG2000 (Jasper): JP2OpenJPEG is a better replacement
JPEGLS
LAN
MFF
MG4Lidar ?
NDF
NTv1
SDTS Raster
SGI
XPM
ZMap

* Vector side:
AERONAVFAA
ESRI ArcObjects
ARCGEN
BNA
Cloudant
CouchDB
DB2
DODS
FMEObjects Gateway
Geomedia MDB
GMT ASCII Vectors
GTM
HTF
INGRES
MongoDB (the old one, superseded by MongoDBv3)
OpenAIR
REC
SDTS
SUA
SVG
TIGER
WALK


Anything you'd add / remove ?

What is not obvious is what the criterion for keeping a driver would be: 1,
10, 100 users asking for the driver to be kept?
If a GDAL developer contributing to the overall good of the project needs the
preservation of a driver to justify their continued involvement, I'd
tend to think that is enough to keep it.


Even





Re: [gdal-dev] Contour Line Thinning

2021-01-01 Thread Stephen Woodbridge

On 1/1/2021 9:47 PM, Richard Greenwood wrote:
On Fri, Jan 1, 2021 at 2:36 PM Stephen Woodbridge 
<stephenwoodbridg...@gmail.com> wrote:


Hi all,

I'm contouring bathymetry data using gdal_contour and it works really
great. The problem I have is that when depth falls off rapidly,
like at the continental shelf or into a canyon, I get too many
contour lines that all bunch up. If I change the contour step size
to fix this, then the flatter areas don't get enough lines.

I wonder if anyone has any ideas on some way to thin these lines,
or some way to do adaptive contouring, perhaps by scanning the
image first to build a mask that represents these rapid changes
in depth and then changing the contour levels in those masked areas.

I currently contour into a PostGIS database, then render the
contours using MapServer into a tile cache, since they are static
once they are computed.

I would be interested in any ideas you might have on how to tackle
this
problem.

-Steve W


Hey Steve,

Interesting problem and this isn't an answer, just my opinion. I live 
and play in a mountainous area and frequently use USGS topo maps with 
contour intervals of 20, 40 and 80 feet. Each map's contour interval 
was chosen with criteria like yours - flatter land needs smaller 
contour intervals, but steeper land can become too cluttered with a 
small contour interval. But as a map user it drives me crazy when I 
stitch together adjoining maps with different intervals and try to get 
a sense of the landscape. Like this for example 
<https://greenwoodmap.com/tetonwy/mapserver/map#zcr=7.279815109511815/2448564.5062904786/1516712.6778719614/0=DRG,Roads,ownership> 
where 20 foot contours adjoin 80 foot. The western half of the map is 
much steeper than the eastern, but that's not obvious from a quick 
look. I'd just let the bunched up contours tell the reader that hey, 
it's really steep here!


Best regards,
Rich

--
Richard W. Greenwood, PLS
www.greenwoodmap.com <http://www.greenwoodmap.com>


Hi Rich,

Yeah, I get your point, and the engineer in me agrees, but users of the 
map have complained, so I have to at least look into the issue.


One thought I had that might work, because I'm dealing with ocean-bottom 
contours, is to do something like:


a) take all contours above X
b) take all contours below Y
c) take every Nth contour between X and Y
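The a/b/c selection above can be sketched as a small filter over the list of contour depths (a sketch under assumptions: depths are negative-down and sorted ascending; the function name and the every-Nth bookkeeping are mine):

```python
def thin_levels(levels, shallow, deep, n):
    """Keep every contour level shallower than `shallow` or deeper than
    `deep`; in the steep band between them, keep only every n-th level.
    Depths are negative-down, and `levels` is assumed sorted ascending."""
    kept = []
    in_band = 0
    for z in levels:
        if z > shallow or z < deep:      # (a) and (b): keep everything
            kept.append(z)
        else:                            # (c): thin the band
            if in_band % n == 0:
                kept.append(z)
            in_band += 1
    return kept
```

For example, thin_levels(range(-5000, 1, 100), shallow=-200, deep=-3000, n=4) would keep full detail on the shelf and the abyssal plain but only every 4th line on the slope.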

This would probably work OK for the drop-off on the continental shelf, at 
least for the East Coast; I'd have to look at other areas since this is 
a global map, but 98% of the users are on the East Coast currently, 
though that is expanding.


Anyway, it is an interesting problem. I'd like to find a simple solution 
that I can build into the PostGIS database where I have all the contour 
lines stored, or find a solution that handles the generation of the 
contour lines with some kind of adaptive thinning. My guess is that it 
will not be easy to do at the generation level, so I'll probably only 
be able to do the thinning during the rendering of the tiles.


Thank you for your thoughts on this,
-Steve


[gdal-dev] Contour Line Thinning

2021-01-01 Thread Stephen Woodbridge

Hi all,

I'm contouring bathymetry data using gdal_contour and it works really 
great. The problem I have is that when depth falls off rapidly, like at 
the continental shelf or into a canyon, I get too many contour lines 
that all bunch up. If I change the contour step size to fix this, then 
the flatter areas don't get enough lines.


I wonder if anyone has any ideas on some way to thin these lines, or some 
way to do adaptive contouring, perhaps by scanning the image first to 
build a mask that represents these rapid changes in depth and then 
changing the contour levels in those masked areas.


I currently contour into a PostGIS database, then render the contours using 
MapServer into a tile cache, since they are static once they are computed.


I would be interested in any ideas you might have on how to tackle this 
problem.


-Steve W


Re: [gdal-dev] Problem writing with ODS and XLSX drivers

2020-09-12 Thread Stephen Woodbridge
I'm wondering if there are Unicode characters that are not properly 
formatted when output to the xlsx or ods files. If the CSV export takes a 
different export/import path that handles the conversion correctly, maybe 
that is not the case for the binary files.
I would try to locate 1-2 records from the shp file that cause this 
problem and then submit them with a bug report.

You should be able to apply a where clause to the export to select only 
a subset of the records when you export.


-Steve W

On 9/12/2020 1:21 PM, Hernán De Angelis wrote:
That was a very good piece of advice. Unfortunately, that is apparently 
not the problem. The CSV imports cleanly into LibreOffice, with the same 
number of rows (minus the title row) as there are features in the shp. 
No trace of misplaced linefeed or carriage return characters. I will 
have to keep looking for solutions. Thanks anyway!


/H.

On 2020-09-12 18:56, Stephen Woodbridge wrote:
Try exporting to CSV and check that the records do not have any 
embedded CR or LF chars. One simple way to do this is to compare 
the record count of the shp file to the line count of the CSV file. 
Try importing the CSV file into Excel or LibreOffice and see if it 
reports a problem. This may give you a better idea of what the 
problem is.


-Steve W

On 9/12/2020 11:20 AM, Hernán De Angelis wrote:

Hi everyone

I am experiencing an odd and stubborn problem when trying to export 
attribute tables from both shp and spatialite to ods or xlsx. 
ogr2ogr finishes work silently with no reported errors but the 
generated output files cannot be opened with Libreoffice, which says 
the file is corrupted and is even unable to repair it.


The input files are good from what I can see using QGIS and OGR, but 
it is clear that something isn't right in my installation, my 
procedure or somewhere else. Am I missing something? Have other 
users experienced this?


I am using GDAL/OGR 3.1.2 in openSUSE Tumbleweed, compiled against 
expat 2.2.9-1.12 (devel packages installed too, of course).


Any hint is appreciated!

Hernán



Re: [gdal-dev] Problem writing with ODS and XLSX drivers

2020-09-12 Thread Stephen Woodbridge
Try exporting to CSV and check that the records do not have any 
embedded CR or LF chars. One simple way to do this is to compare the 
record count of the shp file to the line count of the CSV file. Try 
importing the CSV file into Excel or LibreOffice and see if it reports 
a problem. This may give you a better idea of what the problem is.
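The line-count-versus-record-count check can be sketched in a few lines of Python: the csv module treats a quoted embedded newline as part of a field, so any record containing one is exactly what makes the raw line count disagree with the record count (the function name is mine):

```python
import csv
import io

def embedded_newline_records(csv_text):
    """Return indices of CSV records containing an embedded newline in a
    quoted field -- the rows that make line count != record count."""
    bad = []
    for i, row in enumerate(csv.reader(io.StringIO(csv_text))):
        if any("\n" in field or "\r" in field for field in row):
            bad.append(i)
    return bad
```

Running this on the exported CSV points at the exact records worth attaching to a bug report.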


-Steve W

On 9/12/2020 11:20 AM, Hernán De Angelis wrote:

Hi everyone

I am experiencing an odd and stubborn problem when trying to export 
attribute tables from both shp and spatialite to ods or xlsx. ogr2ogr 
finishes work silently with no reported errors but the generated 
output files cannot be opened with Libreoffice, which says the file is 
corrupted and is even unable to repair it.


The input files are good from what I can see using QGIS and OGR, but it 
is clear that something isn't right in my installation, my procedure 
or somewhere else. Am I missing something? Have other users 
experienced this?


I am using GDAL/OGR 3.1.2 in openSUSE Tumbleweed, compiled against 
expat 2.2.9-1.12 (devel packages installed too, of course).


Any hint is appreciated!

Hernán



Re: [gdal-dev] I have a problem getting higher resolution in a VRT file

2020-08-20 Thread Stephen Woodbridge

Jukka,

Thanks those are great threads and very educational.
I solved my problem. I was using upsample="bilinear" rather than 
upsampling="bilinear" and fixing this gave the expected results. But if 
not for my stupid typo, I would not have had a chance to read those 
threads and google others regarding upsampling.


Thanks for the assist,
-Steve

On 8/20/2020 10:13 AM, Rahkonen Jukka (MML) wrote:

Hi,

Resampling is a science that would be nice to understand. Unfortunately I don't, 
but I think that this thread is worth reading: 
osgeo-org.1560.x6.nabble.com/gdal-dev-downsampling-geotiff-with-a-low-pass-filter-td5385890.html
In this gis.SE question cubic spline was considered good: 
https://gis.stackexchange.com/questions/30627/smoothing-reinterpolating-raster-with-gdal

-Jukka-

-Original Message-
From: Stephen Woodbridge 
Sent: Thursday 20 August 2020 16:58
To: Rahkonen Jukka (MML) 
Subject: Re: [gdal-dev] I have a problem getting higher resolution in a VRT file

Hi Jukka,

Any thoughts on what would be a better way to upsample an image using GDAL?

I assumed, maybe incorrectly, that if I'm upsampling between two pixels with 
values of, say, 1 and 2, a new pixel 50% of the way between them would get a 
value of 1.5, at 25% it would get 1.25, etc., using linear interpolation; 
bilinear would also account for the rows above and below the current row.
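That intuition matches textbook bilinear interpolation, which can be checked in a few lines of pure Python (a sketch of the math only, not of GDAL's actual resampling kernel):

```python
def bilinear(p00, p10, p01, p11, tx, ty):
    """Bilinear interpolation of four neighbouring pixel values.
    p00/p10 are the upper pair, p01/p11 the lower pair; tx, ty in [0, 1]."""
    top = p00 + (p10 - p00) * tx        # lerp along the upper row
    bottom = p01 + (p11 - p01) * tx     # lerp along the lower row
    return top + (bottom - top) * ty    # lerp between the two rows
```

With neighbours 1 and 2 on both rows, tx=0.5 gives 1.5 and tx=0.25 gives 1.25, exactly as described above.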

Even, I specified resampling="bilinear" in the vrt file; is this ignored?

Thanks, Steve

Sent from my iPhone


On Aug 20, 2020, at 8:23 AM, jratike80  
wrote:

Even Rouault-2 wrote

...
By default, VRT uses nearest resampling. You must specify something else.
Look for "A resampling attribute can be specified" in
https://gdal.org/drivers/raster/vrt.html
...

--

Actually in the provided .vrt file I can see 

But isn't bilinear unsuitable for upsampling? Some more blurring
method might work better.

-Jukka Rahkonen-



--
Sent from: http://osgeo-org.1560.x6.nabble.com/GDAL-Dev-f3742093.html

[gdal-dev] I have a problem getting higher resolution in a VRT file

2020-08-19 Thread Stephen Woodbridge

Hi all,

I've been puzzling over a problem with trying to get higher resolution 
images using a VRT file, so low-res data looks smoother in mapserver.


This is my source tif image:

gdalinfo  /maps/wms/data/HYCOM/HYCOM_today_mlt.tif
Driver: GTiff/GeoTIFF
Files: /maps/wms/data/HYCOM/HYCOM_today_mlt.tif
Size is 4500, 4251
Coordinate System is:
GEOGCS["WGS 84",
    DATUM["WGS_1984",
    SPHEROID["WGS 84",6378137,298.257223563,
    AUTHORITY["EPSG","7030"]],
    AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0],
    UNIT["degree",0.0174532925199433],
    AUTHORITY["EPSG","4326"]]
Origin = (-180.000,90.000)
Pixel Size = (0.080,-0.040)
Metadata:
  AREA_OR_POINT=Area
  [snip more metadata]
Image Structure Metadata:
  INTERLEAVE=BAND
Corner Coordinates:
Upper Left  (-180.000,  90.000) (180d 0' 0.00"W, 90d 0' 0.00"N)
Lower Left  (-180.000, -80.040) (180d 0' 0.00"W, 80d 2'24.00"S)
Upper Right ( 180.000,  90.000) (180d 0' 0.00"E, 90d 0' 0.00"N)
Lower Right ( 180.000, -80.040) (180d 0' 0.00"E, 80d 2'24.00"S)
Center  (   0.000,   4.980) (  0d 0' 0.01"E, 4d58'48.00"N)
Band 1 Block=4500x1 Type=Float32, ColorInterp=Gray
  NoData Value=1.2676506002282294e+30
  Overviews: 2250x2126, 1125x1063, 563x532, 282x266, 141x133, 71x67


I generate a VRT file as part of my processing and add a color table and 
increase the resolution. It looks like this:


gdalinfo  /maps/wms/data/HYCOM/HYCOM_today_mlt.vrt
Driver: VRT/Virtual Raster
Files: /maps/wms/data/HYCOM/HYCOM_today_mlt.vrt
   /maps/wms/data/HYCOM/HYCOM_today_mlt.tif
Size is 72000, 68016
Coordinate System is:
GEOGCS["WGS 84",
    DATUM["WGS_1984",
    SPHEROID["WGS 84",6378137,298.257223563,
    AUTHORITY["EPSG","7030"]],
    AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0],
    UNIT["degree",0.0174532925199433],
    AUTHORITY["EPSG","4326"]]
Origin = (-180.000,90.000)
Pixel Size = (0.005,-0.0025000)
Corner Coordinates:
Upper Left  (-180.000,  90.000) (180d 0' 0.00"W, 90d 0' 0.00"N)
Lower Left  (-180.000, -80.040) (180d 0' 0.00"W, 80d 2'24.00"S)
Upper Right ( 180.000,  90.000) (180d 0' 0.00"E, 90d 0' 0.00"N)
Lower Right ( 180.000, -80.040) (180d 0' 0.00"E, 80d 2'24.00"S)
Center  (   0.000,   4.980) (  0d 0' 0.01"E, 4d58'48.00"N)
Band 1 Block=128x128 Type=Byte, ColorInterp=Palette
  Overviews: 36000x34016, 18000x17008, 9008x8512, 4512x4256, 2256x2128, 
1136x1072

  Color Table (RGB with 256 entries)
[snip color table]


And in my mapfile I have a layer defined like:

    LAYER
    NAME "mlt"
    STATUS ON
    TYPE RASTER
    PROJECTION "init=epsg:4326" END
    DATA "/maps/wms/data/HYCOM/HYCOM_today_mlt.vrt"
    PROCESSING "NODATA=1.2676506002282294e+30"
    PROCESSING "SCALE=0,250"
    END

This all seems like it should be working correctly, but in my OL app, 
the resolution is not changing. I would expect that with the higher 
resolution I would see smoother color transitions as I zoom in on the 
image, and that the pixel blocks would appear smaller. Unfortunately, I'm 
not seeing a difference.


So I assume I've done something wrong somewhere, but I'm not seeing what :(

Any Ideas?

Thanks,
  -Steve

Oh and the VRT file looks like this:


<VRTDataset rasterXSize="72000" rasterYSize="68016">
  <SRS>GEOGCS["WGS 84",DATUM["WGS_1984",SPHEROID["WGS 84",6378137,298.257223563,AUTHORITY["EPSG","7030"]],AUTHORITY["EPSG","6326"]],PRIMEM["Greenwich",0],UNIT["degree",0.0174532925199433],AUTHORITY["EPSG","4326"]]</SRS>
  <GeoTransform> -180.0, 0.005, 0, 90.0, 0, -0.0025 </GeoTransform>
  <VRTRasterBand dataType="Byte" band="1">
    <ColorInterp>Palette</ColorInterp>
    <ColorTable>
      [snip color table entries]
    </ColorTable>
    <ComplexSource>
      <SourceFilename relativeToVRT="1">HYCOM_today_mlt.tif</SourceFilename>
      <SourceBand>1</SourceBand>
      <SourceProperties RasterXSize="4500" RasterYSize="4251" DataType="Float32" BlockXSize="4500" BlockYSize="1" />
      <SrcRect xOff="0" yOff="0" xSize="4500" ySize="4251" />
      <DstRect xOff="0" yOff="0" xSize="72000" ySize="68016" />
    </ComplexSource>
  </VRTRasterBand>
</VRTDataset>
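As the first message in this archive notes, the thread was eventually resolved by the VRT `resampling` attribute (it must be spelled `resampling`, not `upsampling`). A standard-library-only sketch of patching that attribute onto every source element of a VRT; the embedded VRT here is a trimmed stand-in, not the full file from the thread:

```python
import xml.etree.ElementTree as ET

vrt = """<VRTDataset rasterXSize="72000" rasterYSize="68016">
  <VRTRasterBand dataType="Byte" band="1">
    <ComplexSource>
      <SourceFilename relativeToVRT="1">HYCOM_today_mlt.tif</SourceFilename>
      <SourceBand>1</SourceBand>
    </ComplexSource>
  </VRTRasterBand>
</VRTDataset>"""

root = ET.fromstring(vrt)
# Set resampling="bilinear" on each source so GDAL interpolates
# instead of using the default nearest-neighbour lookup.
for src in root.iter("ComplexSource"):
    src.set("resampling", "bilinear")

patched = ET.tostring(root, encoding="unicode")
print('resampling="bilinear"' in patched)  # → True
```

The same edit can of course be made by hand in the .vrt file; the point is only where the attribute goes (on the source element).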


Re: [gdal-dev] Dealing with "default" TOWGS84 values

2020-03-31 Thread Stephen Woodbridge

Even,

Would it make sense to add an option to GDAL like -ignore-bound-towgs84 
or -use-bound-towgs84 depending on what the default behavior is set to?


This would allow the user to choose which is best for their data.

-Steve W

On 3/31/2020 10:21 AM, Even Rouault wrote:


Hi,

I wanted to get opinions on a topic related to TOWGS84 7-parameter 
transforms. In the GDAL 1.x/2.x & PROJ 4.x/5.x era, having TOWGS84 
values attached to CRS definitions was critical to get correct datum 
transformation.


Now, in the GDAL 3 + PROJ 6.x/7.x era, they are more of an annoyance 
than anything else, since they can prevent more accurate 
transformations from being used.


Some data formats such as GeoTIFF files written by GDAL 1.x/2.x encode 
the TOWGS84 value of the CRS definition that was built when importing 
the EPSG dataset. E.g 
TOWGS84[446.448,-125.157,542.06,0.15,0.247,0.842,-20.489] for 
EPSG:27700 OSGB 1936 / British National Grid


In GDAL 3, when such a file is read, a BoundCRS object is returned, that 
is, a CRS that has a base CRS (potentially with an EPSG code attached, 
e.g. EPSG:27700) + the definition of its transformation to WGS84 (which 
corresponds to the transformation with the largest area of use). Software 
like QGIS will then use this BoundCRS (since that's what GDAL returns 
for the dataset CRS) when reprojecting to a target CRS, which will 
cause the TOWGS84 parameters from the file to be used (PROJ >= 6 will 
honour the TOWGS84 parameters of a BoundCRS when transforming 
from/into it), rather than potentially more accurate transformations 
(typically grid-based transformations) from the base CRS to the target 
CRS.


Whether this behaviour is a bug or a feature, and where this bug 
belongs to, has been a debate in a number of tickets:


https://github.com/OSGeo/gdal/issues/2219

https://github.com/qgis/QGIS/issues/34993

I wanted to have broader opinions regarding whether we should do something 
about that, and what.


One possibility would be for GDAL to ignore the TOWGS84 value 
associated with a base CRS if we detect that this TOWGS84 value is the 
default one that was written by GDAL 1.x/2.x for this base CRS, and 
thus to return just the base CRS, letting PROJ figure out all 
potential transformations when transforming to a target CRS. This 
could potentially be the default behaviour, with a configuration 
option that could be set to opt for the current behaviour (that is 
return the BoundCRS). This would have to be implemented per driver 
(logic would be similar). I guess GeoTIFF, GPKG, Spatialite could be 
candidates.


The potential downside of this is if the user relied on those exact 
default TOWGS84 values being used when reprojecting to WGS84 (or when 
using WGS84 as the pivot).


I'd note that GDAL 3, in the OGRCoordinateTransformation code, 
partially implements this logic, but only in the transformation code 
(not impacting the CRS actually returned by the dataset), and with 
limitations. That is, when transforming from/into a BoundCRS whose 
base CRS has a unique Helmert transformation to WGS84 that matches the 
transformation of the BoundCRS, it uses by default only the base CRS 
(this can be disabled with OSR_CT_USE_DEFAULT_EPSG_TOWGS84=YES). But this 
is for example not sufficient for EPSG:27700, because it has several 
Helmert transformations. So as suggested in 
https://github.com/OSGeo/gdal/issues/2219, a broader fix would be for 
the PROJ database to have a GDAL2 authority that would store the 
TOWGS84 parameters historically used by GDAL 2.
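For reference, the 7-parameter transform that a TOWGS84 tuple encodes is easy to state. A sketch of the position-vector (small-angle) Helmert formula using the EPSG:27700 parameters quoted above — shifts in metres, rotations in arc-seconds, scale in ppm; the helper name is mine, not a PROJ/GDAL API:

```python
import math

def helmert_pv(x, y, z, dx, dy, dz, rx, ry, rz, s_ppm):
    """Position-vector 7-parameter Helmert transform on geocentric XYZ (metres)."""
    to_rad = math.pi / (180 * 3600)          # arc-seconds -> radians
    rx, ry, rz = rx * to_rad, ry * to_rad, rz * to_rad
    m = 1 + s_ppm * 1e-6                     # scale factor from ppm
    return (dx + m * (x - rz * y + ry * z),
            dy + m * (rz * x + y - rx * z),
            dz + m * (-ry * x + rx * y + z))

# The TOWGS84 values for EPSG:27700 quoted in this thread:
out = helmert_pv(0, 0, 0, 446.448, -125.157, 542.06, 0.15, 0.247, 0.842, -20.489)
print(out)  # → (446.448, -125.157, 542.06)
```

At the geocentric origin only the translations survive, which makes the parameter roles easy to see; the accuracy debate in the thread is precisely that this single fixed formula can displace results by metres compared with grid-based transformations.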


Even

--

Spatialys - Geospatial professional services

http://www.spatialys.com





[gdal-dev] Solved: Re: Having a problem getting VRT ComplexSource to Scale

2020-01-23 Thread Stephen Woodbridge
Never mind! I just checked the histogram and the data is getting scaled. 
I forgot that I was also scaling it in my mapfile, so the results were 
getting masked.


-Steve

On 1/23/2020 8:20 PM, Stephen Woodbridge wrote:

Hi all,

I'm having trouble getting my VRT file to scale the source data into 
my ColorTable.

I've been looking at this https://gdal.org/drivers/raster/vrt.html

The source GeoTiff has:

Band 1 Block=4500x1 Type=Float32, ColorInterp=Gray
  Minimum=1.082, Maximum=46.322, Mean=34.181, StdDev=2.049
  NoData Value=-3
  Overviews: 2250x2126, 1125x1063, 563x532, 282x266, 141x133, 71x67
  Metadata:
    STATISTICS_MAXIMUM=46.321998596191
    STATISTICS_MEAN=34.181107349018
    STATISTICS_MINIMUM=1.0819988250732
    STATISTICS_STDDEV=2.0492649377546

My goal is to scale the source 0-50 into 10-240 in the ColorTable.


The VRT looks like:


<VRTDataset rasterXSize="4500" rasterYSize="4251">
  <GeoTransform> -180.0, 0.08, 0, 90.0, 0, -0.04 </GeoTransform>
  <VRTRasterBand dataType="Byte" band="1">
    <ColorInterp>Palette</ColorInterp>
    <ColorTable>
...
    </ColorTable>
    <ComplexSource>
      <SourceFilename relativeToVRT="1">HYCOM_tomorrow_salinity_0.tif</SourceFilename>
      <SourceBand>1</SourceBand>
      <ScaleOffset>0</ScaleOffset>
      <ScaleRatio>1</ScaleRatio>
      <NODATA>0</NODATA>
    </ComplexSource>
  </VRTRasterBand>
</VRTDataset>

50 - 0 = 50
240 - 10 = 230
230 / 50 = 4.6 = ScaleRatio
10 = ScaleOffset

I've tried a lot of other values trying to play with the scaling.

I also tried using:

<SrcMin>0</SrcMin>
<SrcMax>50</SrcMax>
<DstMin>10</DstMin>
<DstMax>240</DstMax>

but that doesn't appear to work either. Using: GDAL 2.2.3, released 
2017/11/20, on Ubuntu 18.04. My conclusion is that I'm not understanding 
the meaning of these fields, or they are meant for something other than 
my goal. Any help would be appreciated. 
Thanks, -Steve
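The mapping a ComplexSource applies is linear: dst = src * ScaleRatio + ScaleOffset. A quick sketch of deriving both values from the desired ranges, matching the 0–50 → 10–240 arithmetic above (the function name is mine, for illustration):

```python
def scale_params(src_min, src_max, dst_min, dst_max):
    """Compute (ScaleRatio, ScaleOffset) so that dst = src * ratio + offset
    maps [src_min, src_max] onto [dst_min, dst_max]."""
    ratio = (dst_max - dst_min) / (src_max - src_min)
    offset = dst_min - src_min * ratio
    return ratio, offset

ratio, offset = scale_params(0, 50, 10, 240)
print(ratio, offset)  # → 4.6 10.0
```

So the intended element values would be ScaleRatio=4.6 and ScaleOffset=10 — which is why, as corrected above, the last line of the hand calculation is the offset, not a second ratio.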





[gdal-dev] Having a problem getting VRT ComplexSource to Scale

2020-01-23 Thread Stephen Woodbridge

Hi all,

I'm having trouble getting my VRT file to scale the source data into my 
ColorTable.

I've been looking at this https://gdal.org/drivers/raster/vrt.html

The source GeoTiff has:

Band 1 Block=4500x1 Type=Float32, ColorInterp=Gray
  Minimum=1.082, Maximum=46.322, Mean=34.181, StdDev=2.049
  NoData Value=-3
  Overviews: 2250x2126, 1125x1063, 563x532, 282x266, 141x133, 71x67
  Metadata:
    STATISTICS_MAXIMUM=46.321998596191
    STATISTICS_MEAN=34.181107349018
    STATISTICS_MINIMUM=1.0819988250732
    STATISTICS_STDDEV=2.0492649377546

My goal is to scale the source 0-50 into 10-240 in the ColorTable.


The VRT looks like:


<VRTDataset rasterXSize="4500" rasterYSize="4251">
  <GeoTransform> -180.0, 0.08, 0, 90.0, 0, -0.04 </GeoTransform>
  <VRTRasterBand dataType="Byte" band="1">
    <ColorInterp>Palette</ColorInterp>
    <ColorTable>
...
    </ColorTable>
    <ComplexSource>
      <SourceFilename relativeToVRT="1">HYCOM_tomorrow_salinity_0.tif</SourceFilename>
      <SourceBand>1</SourceBand>
      <ScaleOffset>0</ScaleOffset>
      <ScaleRatio>1</ScaleRatio>
      <NODATA>0</NODATA>
    </ComplexSource>
  </VRTRasterBand>
</VRTDataset>

50 - 0 = 50
240 - 10 = 230
230 / 50 = 4.6 = ScaleRatio
10 = ScaleOffset

I've tried a lot of other values trying to play with the scaling.

I also tried using:

<SrcMin>0</SrcMin>
<SrcMax>50</SrcMax>
<DstMin>10</DstMin>
<DstMax>240</DstMax>

but that doesn't appear to work either. Using: GDAL 2.2.3, released 
2017/11/20, on Ubuntu 18.04. My conclusion is that I'm not understanding 
the meaning of these fields, or they are meant for something other than 
my goal. Any help would be appreciated. Thanks, -Steve



Re: [gdal-dev] ogr2ogr PDF

2019-11-14 Thread Stephen Woodbridge

Paul,

Your issue might be one of scale. You have specified units in px; try 
changing them to pt. (Units: g, px, pt, mm, cm, in.)


See: https://gdal.org/user/ogr_feature_style.html#ogr-feature-style

You can also try named pens like:

Here is the current list of OGR pen ids (this could grow over time):

 * ogr-pen-0: solid (the default when no id is provided)
 * ogr-pen-1: null pen (invisible)
 * ogr-pen-2: dash
 * ogr-pen-3: short-dash
 * ogr-pen-4: long-dash
 * ogr-pen-5: dot line
 * ogr-pen-6: dash-dot line
 * ogr-pen-7: dash-dot-dot line
 * ogr-pen-8: alternate-line (sets every other pixel)

System-specific ids are very likely to be meaningful only to the 
specific system that created them. The ids should start with the 
system’s name, followed by a dash (-), followed by whatever information 
is meaningful to that system (a number, a name, a filename, etc.).

e.g. “mapinfo-5”, or “mysoft-lines.sym-123”, or “othersystems-funnyline”

System-specific ids are allowed in order to prevent loss of information 
when dealing with data from systems that store line patterns in external 
files or that have their own pre-defined set of line styles (for 
instance, to do a MapInfo MIF to TAB translation without any loss.)


Examples:
PEN(c:#00FF00,id:"ogr-pen-0") - simple solid line
PEN(c:#00FF00,id:"mapinfo-5,ogr-pen-7") - corresponds to MapInfo’s Pen 
#5, and a system that can’t understand MapInfo pens falls back on the 
default "ogr-pen-7" pen (dash-dot-dot line).


-Steve W
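The fallback mechanism in that last example can also be composed programmatically when writing OGR_STYLE values into a database, as Paul is doing. A small sketch; the helper name is mine, not part of OGR:

```python
def pen_style(color, width_pt, ids):
    """Build an OGR_STYLE PEN string with fallback pen ids,
    most specific first (e.g. a MapInfo id, then a generic ogr-pen-N)."""
    id_list = ",".join(ids)
    return f'PEN(c:{color},w:{width_pt}pt,id:"{id_list}")'

style = pen_style("#00FF00", 1, ["mapinfo-5", "ogr-pen-7"])
print(style)  # → PEN(c:#00FF00,w:1pt,id:"mapinfo-5,ogr-pen-7")
```

A consumer that understands "mapinfo-5" uses it; anything else falls back to the generic "ogr-pen-7" dash-dot-dot pattern.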

On 11/14/2019 2:18 AM, paul.m...@lfv.se wrote:

Thanks for responding Jukka!
I'm getting really irritated at myself for not getting such a simple thing to 
work with such great tools.
I have now tried with:
PEN(c:#FF00FF,w:3px,p:"3px 3px")
PEN(c:#FF00FF,w:3px,p:3px 3px)
PEN(p:"3px 3px",c:#FF00FF,w:3px)
PEN(p:3px 3px,c:#FF00FF,w:3px)
PEN(c:#FF00FF, p:"3px 3px",w:3px)
PEN(c:#FF00FF, p:3px 3px,w:3px)
But I'm getting the same solid line. Width and color are ok.
/Paul

-Ursprungligt meddelande-
Från: gdal-dev [mailto:gdal-dev-boun...@lists.osgeo.org] För jratike80
Skickat: den 13 november 2019 16:35
Till: gdal-dev@lists.osgeo.org
Ämne: Re: [gdal-dev] ogr2ogr PDF

Hi,

What is supported is documented in a table in
https://gdal.org/drivers/raster/pdf.html. For PEN only these options are
supported: color (c); width (w); dash pattern (p).

So why? Well, nobody has implemented it yet. Meanwhile you can do
something with the dash pattern, or edit the PDF with some PDF editor.

-Jukka Rahkonen-


paul.malm wrote

Hi,
I’m now setting attribute OGR_STYLE in my postgis db for a test layer
(LineString):
… SET \"OGR_STYLE\"= 'PEN(c:#FF,id:ogr-pen-4,w:1px)'
Does anyone know why the pen symbol (id) is ignored when exporting the
PDF, it’s always a solid line?
Color and width is ok.
I’ve tried to draw a ogr-symbol together with a point layer and it worked
like a charm.
Kind regards,
Paul





--
Sent from: http://osgeo-org.1560.x6.nabble.com/GDAL-Dev-f3742093.html



Re: [gdal-dev] Question on gdal_contour and limiting the range of contours generated.

2019-10-31 Thread Stephen Woodbridge
Thanks I have not seen the raster common option file thing. I’ll look at it. 

Sent from my iPhone

> On Oct 31, 2019, at 2:11 PM, jratike80  
> wrote:
> 
> Hi,
> 
> One more suggestion, have you tried the -fl option "Name one or more “fixed
> levels” to extract"? Requires  some writing but only once if you save the
> list into an optfile https://gdal.org/programs/raster_common_options.html. I
> haven't tried the option with hundreds of fixed levels, though.
> 
> -Jukka-
> 
> 
> Jukka,
> 
> Thanks! After trying shapefile and exceeding 2GB, I switched to postgres 
> and deleted the elevation as you suggested.
> 
> I'm working with a global relief model that includes both elevation and 
> bathymetry depth over the whole world at about 15 arc-sec/pixel. It 
> generates a lot of contour lines between 8,000 and -10,000 meters. I was 
> thinking it would be handy if gdal_contour had a few more options to control 
> the range and direction of contour generation.
> 
> I also have played with gdal_calc to create elevation-only and 
> bathymetry-only files.
> 
> Thanks, I think I have workarounds for the moment.
> 
> -Steve W
> 
> 
> 
> 
> 

Re: [gdal-dev] Question on gdal_contour and limiting the range of contours generated.

2019-10-31 Thread Stephen Woodbridge

Jukka,

Thanks! After trying shapefile and exceeding 2GB, I switched to postgres 
and deleted the elevation as you suggested.


I'm working with a global relief model that includes both elevation and 
bathymetry depth over the whole world at about 15 arc-sec/pixel. It 
generates a lot of contour lines between 8,000 and -10,000 meters. I was 
thinking it would be handy if gdal_contour had a few more options to control 
the range and direction of contour generation.


I also have played with gdal_calc to create elevation-only and 
bathymetry-only files.


Thanks, I think I have workarounds for the moment.

-Steve W

On 10/31/2019 4:38 AM, jratike80 wrote:

Second thought, perhaps I would just delete unnecessary contour lines
afterwards instead of flattening the DEM.

gdal_contour -f gpkg -a elev -off -121.92 -i 1.524
bathy-value/crm_vol2.nc.tif bathy-contours/crm_vol2.nc-5.gpkg

ogrinfo -sql "delete from crm_vol2.nc-5 where elev>0"
bathy-contours/crm_vol2.nc-5.gpkg

ogrinfo -sql "VACUUM" bathy-contours/crm_vol2.nc-5.gpkg

-Jukka-



jratike80 wrote

Hi,

I believe I would use gdal_calc https://gdal.org/programs/gdal_calc.html
and
make the terrain above the sea level flat before creating the contours.
And
I would avoid shapefiles and use GeoPackage as output format.

There is a button "Edit on GitHub" in the top-right corner on page
https://gdal.org/programs/gdal_contour.html, feel free to test it.

-Jukka Rahkonen-










[gdal-dev] Question on gdal_contour and limiting the range of contours generated.

2019-10-30 Thread Stephen Woodbridge

Hi,

I want to use gdal_contour to generate shapefile contours, which is 
pretty straightforward, except that I'm working with coastal relief data that 
has both elevation and bathymetry. I only want contours for the bathymetry.


I tried setting -off -5 and -i -5 with the idea that this would 
start me at a depth of -5 and generate contours every -5 units deeper, 
but that generated empty files. It seems the -i parameter cannot be 
negative (this should be mentioned in the docs).


This sets the depth to 400ft and generates contours every 5 ft above that:

gdal_contour -off -121.92 -i 1.524 bathy-value/crm_vol2.nc.tif 
bathy-contours/crm_vol2.nc-5.shp


How can I limit the process so that it stops at 0 (i.e. sea level)? I don't 
want land elevation contours. Something like a -limit 0 argument?


If this can't be done, it would be a nice option to have so you can 
break up contours into multiple shapefiles to avoid overflowing them.


Thanks,
  -Steve
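One workaround suggested later in this thread is gdal_contour's -fl option with an optfile holding the fixed levels. A sketch that builds the bathymetry-only level list — every 5 m from −5 down to −10,000 — as a single option line; the file and dataset names in the comment are hypothetical:

```python
# Fixed contour levels for bathymetry only: -5, -10, ..., -10000 metres.
levels = [-d for d in range(5, 10001, 5)]
optfile_line = "-fl " + " ".join(str(v) for v in levels)

# One could save optfile_line to e.g. bathy_levels.opt and run:
#   gdal_contour --optfile bathy_levels.opt -a elev input.tif contours.shp
print(len(levels), levels[0], levels[-1])  # → 2000 -5 -10000
```

Because only negative levels are listed, no land-elevation contours are generated at all, sidestepping the need for a -limit argument.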



[gdal-dev] Problems with masking while create a color-relief from a DEM

2019-10-24 Thread Stephen Woodbridge

Hi,

I have a collection of regional netCDF DEM files that overlap and have 
nodata=-nan. I have been able to convert them to color-relief using 
gdaldem, but somewhere in the processing chain I seem to be losing the 
nodata/mask/alpha info, and the nodata in the color-relief is opaque, i.e. 
where the DEMs overlap and there is nodata, the later image blocks out 
the earlier image rather than being transparent.


Running: GDAL 2.2.2, released 2017/09/15 on Ubuntu xenial

Processing is like:

gdalwarp -s_srs EPSG:4269 -t_srs EPSG:4326 -srcnodata \-nan -dstnodata 
3 -ot Int16 -of GTiff -co TILED=YES crm_vol1.nc crm_vol1.nc.tif


and I have tried variants like:

-srcnodata -32768 or  -- I get a lot of -32768 in the conversion to Int16
-dstnodata 3  -- let gdal sense the nodata value
instead of -srcnodata \-nan -dstnodata 3

gdaldem color-relief crm_vol1.nc.tif ../dem-colors.txt 
crm_vol1.nc.colored.tif -of GTiff -co TILED=YES -alpha


also run without -alpha

Anyway, I have 10 region DEMs and then I combine them in a VRT like

gdalbuildvrt color-relief.vrt *.colored.tif

Then I serve that via mapserver. I get the same image if I just do:

gdal_translate -of PNG color-relief.vrt test.png

-Steve

woodbri@u19589217:~/work/bathymetry/coastal-relief$ gdalwarp -s_srs 
EPSG:4269 -t_srs EPSG:4326 -srcnodata -32768 -ot Int16 -of GTiff -co 
TILED=YES crm_vol1.nc crm_vol1.nc.tif

Creating output file that is 19201P x 9601L.
Processing input file crm_vol1.nc.
Copying nodata values from source crm_vol1.nc to destination 
crm_vol1.nc.tif.

0...10...20...30...40...50...60...70...80...90...100 - done.

woodbri@u19589217:~/work/bathymetry/coastal-relief$ gdalinfo crm_vol1.nc.tif
Driver: GTiff/GeoTIFF
Files: crm_vol1.nc.tif
Size is 19201, 9601
Coordinate System is:
GEOGCS["WGS 84",
    DATUM["WGS_1984",
    SPHEROID["WGS 84",6378137,298.257223563,
    AUTHORITY["EPSG","7030"]],
    AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0],
    UNIT["degree",0.0174532925199433],
    AUTHORITY["EPSG","4326"]]
Origin = (-80.0004166,48.0004166)
Pixel Size = (0.0008333,-0.0008333)
Metadata:
  AREA_OR_POINT=Area
  NC_GLOBAL#Conventions=CF-1.4
  NC_GLOBAL#geospatial_lat_max=48
  NC_GLOBAL#geospatial_lat_min=40
  NC_GLOBAL#geospatial_lat_resolution=0.000833
  NC_GLOBAL#geospatial_lat_units=degrees_north
  NC_GLOBAL#geospatial_lon_max=-64
  NC_GLOBAL#geospatial_lon_min=-80
  NC_GLOBAL#geospatial_lon_resolution=0.000833
  NC_GLOBAL#geospatial_lon_units=degrees_east
  NC_GLOBAL#GMT_version=4.5.1 [64-bit]
  NC_GLOBAL#history=xyz2grd -R-80/-64/40/48 -I3c -Gcrm_v1.grd
  NC_GLOBAL#title=crm_v1.grd
  x#actual_range={-80,-64}
  x#long_name=x
  x#units=degrees_east
  x#_CoordinateAxisType=Lon
  y#actual_range={40,48}
  y#long_name=y
  y#units=degrees_north
  y#_CoordinateAxisType=Lat
  z#actual_range={-2754.39990234375,1903}
  z#long_name=z
  z#positive=up
  z#units=meters
  z#_FillValue=-nan
Image Structure Metadata:
  INTERLEAVE=BAND
Corner Coordinates:
Upper Left  ( -80.0004167,  48.0004167) ( 80d 0' 1.50"W, 48d 0' 1.50"N)
Lower Left  ( -80.0004167,  39.9995833) ( 80d 0' 1.50"W, 39d59'58.50"N)
Upper Right ( -63.9995833,  48.0004167) ( 63d59'58.50"W, 48d 0' 1.50"N)
Lower Right ( -63.9995833,  39.9995833) ( 63d59'58.50"W, 39d59'58.50"N)
Center  ( -72.000,  44.000) ( 72d 0' 0.00"W, 44d 0' 0.00"N)
Band 1 Block=256x256 Type=Int16, ColorInterp=Gray
  NoData Value=-32768
  Unit Type: meters
  Metadata:
    actual_range={-2754.39990234375,1903}
    long_name=z
    NETCDF_VARNAME=z
    positive=up
    units=meters
    _FillValue=-nan
woodbri@u19589217:~/work/bathymetry/coastal-relief$ gdalinfo 
crm_vol1.nc.tif -hist

Driver: GTiff/GeoTIFF
Files: crm_vol1.nc.tif
Size is 19201, 9601
Coordinate System is:
GEOGCS["WGS 84",
    DATUM["WGS_1984",
    SPHEROID["WGS 84",6378137,298.257223563,
    AUTHORITY["EPSG","7030"]],
    AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0],
    UNIT["degree",0.0174532925199433],
    AUTHORITY["EPSG","4326"]]
Origin = (-80.0004166,48.0004166)
Pixel Size = (0.0008333,-0.0008333)
Metadata:
  AREA_OR_POINT=Area
  NC_GLOBAL#Conventions=CF-1.4
  NC_GLOBAL#geospatial_lat_max=48
  NC_GLOBAL#geospatial_lat_min=40
  NC_GLOBAL#geospatial_lat_resolution=0.000833
  NC_GLOBAL#geospatial_lat_units=degrees_north
  NC_GLOBAL#geospatial_lon_max=-64
  NC_GLOBAL#geospatial_lon_min=-80
  NC_GLOBAL#geospatial_lon_resolution=0.000833
  NC_GLOBAL#geospatial_lon_units=degrees_east
  NC_GLOBAL#GMT_version=4.5.1 [64-bit]
  NC_GLOBAL#history=xyz2grd -R-80/-64/40/48 -I3c -Gcrm_v1.grd
  NC_GLOBAL#title=crm_v1.grd
  x#actual_range={-80,-64}
  x#long_name=x
  x#units=degrees_east
  x#_CoordinateAxisType=Lon
  y#actual_range={40,48}
  y#long_name=y
  y#units=degrees_north
  y#_CoordinateAxisType=Lat
  z#actual_range={-2754.39990234375,1903}
  z#long_name=z
  z#positive=up
  z#units=meters
  z#_FillValue=-nan
Image Structure Metadata:
  

Re: [gdal-dev] Question on converting netCDF to GTiff

2019-10-23 Thread Stephen Woodbridge

On 10/23/2019 1:54 PM, Even Rouault wrote:

On mercredi 23 octobre 2019 13:41:45 CEST Stephen Woodbridge wrote:

Hi,

I have a netCDF grid of Float32 values that I want to convert to GTiff
with Int16 values. The min/max values will support this without scaling,
but I'm not sure what will happen to the NODATA value =
9.96920996838686905e+36.

Ideally, I would like to set it to something like 32000 but not sure how
to do that.

I've gotten as far as:

gdal_translate -of GTiff -co TILED=YES -ot Int16 -a_srs EPSG:4326
GEBCO_2019.nc  gebco_2019.tif

Which runs and looks ok except NODATA is set the same and that value
clearly will not fit in an Int16. Now it is possible that the grid
doesn't actually have any nodata cells set and only has the value
defined but I'm not sure how to check if that is the case.

To check if there are cells set to NODATA, with recent enough GDAL (3.0 I
think), with gdalinfo -stats, you'll see a STATISTICS_VALID_PERCENT item. If
it is not 100, then there are nodata cells.

To do nodata value remapping, use gdalwarp -srcnodata XX -dstnodata YY

Even


Thanks gdalwarp does the job nicely.
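Conceptually, what gdalwarp -srcnodata/-dstnodata does per pixel here is a sentinel swap plus the type conversion. A pure-Python sketch of that idea — the Int16 clamp and rounding are my illustration, not a claim about gdalwarp's exact semantics:

```python
NODATA_SRC = 9.96920996838686905e+36   # the netCDF default _FillValue above
NODATA_DST = 32000                     # a sentinel that fits in Int16

def remap(value):
    """Swap the float nodata sentinel for an Int16-safe one, else
    convert the sample to Int16 range."""
    if value == NODATA_SRC:
        return NODATA_DST
    return max(-32768, min(32767, int(round(value))))

print(remap(NODATA_SRC))     # → 32000
print(remap(-10880.588))     # → -10881
print(remap(8613.156))       # → 8613
```

The two test values are the GEBCO minimum/maximum from the gdalinfo output above; both fit comfortably in Int16, which is why the conversion works without scaling.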



[gdal-dev] Question on converting netCDF to GTiff

2019-10-23 Thread Stephen Woodbridge

Hi,

I have a netCDF grid of Float32 values that I want to convert to GTiff 
with Int16 values. The min/max values will support this without scaling, 
but I'm not sure what will happen to the NODATA value = 
9.96920996838686905e+36.


Ideally, I would like to set it to something like 32000 but not sure how 
to do that.


I've gotten as far as:

gdal_translate -of GTiff -co TILED=YES -ot Int16 -a_srs EPSG:4326
GEBCO_2019.nc  gebco_2019.tif

Which runs and looks ok except NODATA is set the same and that value 
clearly will not fit in an Int16. Now it is possible that the grid 
doesn't actually have any nodata cells set and only has the value 
defined but I'm not sure how to check if that is the case.


Any thoughts appreciated.

Thanks,
  -Steve W


$ gdalinfo GEBCO_2019.nc -stats
Driver: netCDF/Network Common Data Format
Files: GEBCO_2019.nc
Size is 86400, 43200
Coordinate System is `'
Origin = (-180.000,90.000)
Pixel Size = (0.0041667,-0.0041667)
Metadata:
  elevation#long_name=Elevation relative to sea level
  elevation#sdn_parameter_name=Sea floor height (above mean sea level) 
{bathymetric height}

  elevation#sdn_parameter_urn=SDN:P01::ALATZZ01
  elevation#sdn_uom_name=Metres
  elevation#sdn_uom_urn=SDN:P06::ULAA
  elevation#standard_name=height_above_reference_ellipsoid
  elevation#units=m
  lat#axis=Y
  lat#long_name=latitude
  lat#sdn_parameter_name=Latitude north
  lat#sdn_parameter_urn=SDN:P01::ALATZZ01
  lat#sdn_uom_name=Degrees north
  lat#sdn_uom_urn=SDN:P06::DEGN
  lat#standard_name=latitude
  lat#units=degrees_north
  lon#axis=X
  lon#long_name=longitude
  lon#sdn_parameter_name=Longitude east
  lon#sdn_parameter_urn=SDN:P01::ALONZZ01
  lon#sdn_uom_name=Degrees east
  lon#sdn_uom_urn=SDN:P06::DEGE
  lon#standard_name=longitude
  lon#units=degrees_east
  NC_GLOBAL#comment=The data in the GEBCO_2019 Grid should not be used 
for navigation or any purpose relating to safety at sea.

  NC_GLOBAL#Conventions=CF-1.6
  NC_GLOBAL#history=Information on the development of the data set and 
the source data sets included in the grid can be found in the data set 
documentation available from https://www.gebco.net
  NC_GLOBAL#institution=On behalf of the General Bathymetric Chart of 
the Oceans (GEBCO), the data are held at the British Oceanographic Data 
Centre (BODC).

  NC_GLOBAL#node_offset=1
  NC_GLOBAL#references=DOI: 10.5285/836f016a-33be-6ddc-e053-6c86abc0788e
  NC_GLOBAL#source=The GEBCO_2019 Grid is the latest global bathymetric 
product released by the General Bathymetric Chart of the Oceans (GEBCO) 
and has been developed through the Nippon Foundation-GEBCO Seabed 2030 
Project. This is a collaborative project between the Nippon Foundation 
of Japan and GEBCO. The Seabed 2030 Project aims to bring together all 
available bathymetric data to produce the definitive map of the world 
ocean floor and make it available to all.
  NC_GLOBAL#title=The GEBCO_2019 Grid - a continuous terrain model for 
oceans and land at 15 arc-second intervals

Corner Coordinates:
Upper Left  (-180.000,  90.000)
Lower Left  (-180.000, -90.000)
Upper Right ( 180.000,  90.000)
Lower Right ( 180.000, -90.000)
Center  (   0.000,   0.000)
Band 1 Block=86400x1 Type=Float32, ColorInterp=Undefined
  Minimum=-10880.588, Maximum=8613.156, Mean=-1895.032, StdDev=2656.923
  NoData Value=9.96920996838686905e+36
  Unit Type: m
  Metadata:
    long_name=Elevation relative to sea level
    NETCDF_VARNAME=elevation
    sdn_parameter_name=Sea floor height (above mean sea level) 
{bathymetric height}

    sdn_parameter_urn=SDN:P01::ALATZZ01
    sdn_uom_name=Metres
    sdn_uom_urn=SDN:P06::ULAA
    standard_name=height_above_reference_ellipsoid
    STATISTICS_MAXIMUM=8613.15625
    STATISTICS_MEAN=-1895.0320141561
    STATISTICS_MINIMUM=-10880.587890625
    STATISTICS_STDDEV=2656.9226774517
    units=m




Re: [gdal-dev] vrt guidance: 31 band classified raster

2019-10-03 Thread Stephen Woodbridge

Matt,

Have you tried creating a cascading VRT, where the first VRT composites 
all the GTiff files into a single virtual raster, and then referencing the 
composite in another VRT that assigns the color table? I think this should 
work nicely.


-Steve W
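On Matt's actual question — avoiding 31 hand-written copies of the Category/ColorTable elements — the repetitive part of such a VRT can be generated instead of hand-edited. A standard-library sketch that defines the palette once and stamps it onto every band; the palette values and raster size are placeholders, and a real VRT would also need the source elements:

```python
import xml.etree.ElementTree as ET

# Hypothetical class palette: land-cover value -> RGB; define it ONCE.
palette = {1: (27, 120, 55), 2: (255, 255, 191), 15: (69, 117, 180)}

root = ET.Element("VRTDataset", rasterXSize="100", rasterYSize="100")
for band in range(1, 32):                  # one band per year, 1984-2014
    b = ET.SubElement(root, "VRTRasterBand", dataType="Byte", band=str(band))
    ET.SubElement(b, "ColorInterp").text = "Palette"
    ct = ET.SubElement(b, "ColorTable")
    for value in range(max(palette) + 1):  # dense table; unknowns get black
        r, g, bl = palette.get(value, (0, 0, 0))
        ET.SubElement(ct, "Entry", c1=str(r), c2=str(g), c3=str(bl), c4="255")

xml = ET.tostring(root, encoding="unicode")
print(xml.count("<ColorTable>"))  # → 31
```

The VRT format itself has no variable/reference mechanism, so duplicating the table per band is unavoidable in the file — but a ten-line generator makes that painless.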

On 10/3/2019 7:01 PM, matt.wil...@gov.yk.ca wrote:


Hello gdal-dev, it's been a long time!  I'm happy to be digging into 
raster data building again for a change, but could use some nudges in 
the right direction(s).


A few weeks ago Nasa released Landsat-derived Annual Dominant Land 
Cover Across ABoVE Core Domain, 1984-2014 
.


It’s composed of 175 geotiff images, with each file containing 31 
bands which in turn correspond to a single year in the 1984-2014 
period. Each band is 8bit unsigned integer with values from 1 to 15 
and 255 as nodata. Each integer value corresponds to a class such as 
“Evergreen Forest”, “Herbaceous”, “Water” and so on.


I’ve been successful in manually building a VRT file using Category 
element for the classes and ColorTable entry for a palette – but only 
for a single band.


My question is: how to properly apply this to all bands? Do I need 
duplicate category and colortable elements to every single 
VRTRasterBand element or is there a smart way to define it once and 
then refer to it like a variable?  Am I even approaching this the 
right way?


It was quite a bit of work to get this far and I’m not looking forward 
to doing this 30 more times.


Sample vrt and simplified source is attached.

*Matt Wilkie*

Geomatics Analyst

Environment | Information Management & Technology

T 867-667-8133 | Yukon.ca 







Re: [gdal-dev] Is there an easy way to clip an image to the realdata?

2019-06-27 Thread Stephen Woodbridge

Hi Lars,

Thank you for sharing your ideas and scripts. These will come in handy 
for the future.
For this project I took a different approach: creating a master tiff 
and then copying the data from each patch into that master tiff. It 
turns out that this was pretty straightforward in Python, given that 
each patch is a masked array; by inverting the mask I end up with a 
mask of the good data that I can conditionally copy into the master. So 
now I have one image with all the real data from 100+ patches, which I 
was trying to get using a vrt.

Come to think of it, it might be possible to get the BBOX of the 
inverted mask, and then use that to slice the 2D numpy array and just 
output the sliced image. Something to play with later.
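That bbox-of-the-inverted-mask idea can be sketched in a few lines of numpy (untested against the actual patches; assumes the band was read as a 2D masked array):

```python
import numpy as np

def crop_to_valid(data):
    """Slice a 2D masked array down to the bounding box of its unmasked cells."""
    valid = ~np.ma.getmaskarray(data)      # True where real data exists
    rows = np.flatnonzero(valid.any(axis=1))
    cols = np.flatnonzero(valid.any(axis=0))
    if rows.size == 0:                     # fully masked: nothing to keep
        return data[:0, :0]
    return data[rows[0]:rows[-1] + 1, cols[0]:cols[-1] + 1]
```

The offsets rows[0] and cols[0] are the pixel/line shift to apply to the geotransform when writing the cropped image out.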


Thanks again,
  -Steve

On 6/27/2019 10:30 AM, lars.schylb...@blixtmail.se wrote:

Hi Steve,

Sorry for the late reply!

I have some scripts that I developed this winter that contain a function
to clip an image, or images, to just the parts that contain data.

My aim was to create tighter tile indexes for Mapserver.
I achieved good results for my use case.

This will be one of the topics that I will present in my talk at
FOSS4G in Bucharest in August.  I haven't yet prepared the details of how to
explain what I did.  But here we go:

1) First I retile the data to a suitable size,
2) then check for empty tiles, which I throw away.

The function "trim_tile" is actually what you are asking for, I think.

What I do in the function trim_tile is to use:

1) gdal_calc to produce a binary image where I have data
2) polygonize where data exists
3) compare extents
4) clip data with gdal_translate with the -projwin argument, if the polygon 
extent is smaller

All is done in a bash script, since it is my preferred language to do things 
fast.
I use gnu parallel to speed things up. In my case I had 16 cpus in the server.
Some versions of parallel require the argument "--will-cite", others don't.

I made my own bash version of retiling that keeps masks.

A further note is that the script works on paletted 8-bit TIFF images.
I have a version for RGB images also.  The main difference is that the
empty-image test looks like this in that case:

STDDEV=$(gdalinfo $tile -stats | \
grep STATISTICS_STDDEV | \
sed 's/STATISTICS_STDDEV=//g' | \
awk '{$1=$1;printf "%d", $1}')

if [ "$STDDEV" = "000" ]; then
/bin/rm $tile
fi

If you have tagged nodata values, those have to be removed with gdal_edit.py
before you run statistics, and then reapplied afterwards with gdal_edit again.

This might hopefully give you some inspiration for how to tackle your
use case.

The full script can be found here:

https://gist.github.com/LarsSchy/9ecb31eb964dd83820c139b2f2769a7c

Have fun

Lars Schylberg


20 juni 2019 kl. 22:06, "Stephen Woodbridge"  
skrev:


Hi,

I'm working with VIIRS L3U images of L2P (level 2 patches). The L3U data
is gridded to +-180 x +-90 but each patch only fills a small
percentage of that. I'm compositing the patches for a day using a VRT.

Performance is an issue because all the patches cover the whole globe
instead of the extents of the patch.

Is there a way to automatically clip an image to the extents of the
patch, where all the rest of the image is filled with nodata? I.e., remove
all rows and columns from the exterior edges that are all nodata and
stop when you hit real data.

-Steve



Re: [gdal-dev] adding features to a layer/dataset...

2019-06-26 Thread Stephen Woodbridge

Shayne,

To solve your problem, given the constraints that Even explained, you 
probably need to do something like:


* program that is writing to shapefile
   - creates a socket or pipe
   - open the file for exclusive access
   - after it updates the shapefile
   - close the file
   - writes to the socket a message like "updated"

* program that needs to read the shapefile
   - opens socket or pipe created by first program
   - waits on read from socket for a message
   - on getting "updated" message
   - opens the shapefile for read
   - reads the data, then closes the file
   - processes the shape data
   - loops back to wait on read from socket

You also need a shared network filesystem that supports opening the 
shapefile exclusively for both read and write operations so the file is 
not overwritten while it is being accessed.


All that said, this is not really a GDAL issue, and you probably need to 
get advice from a network programming expert, because there are issues 
like handling trying to open the file when it's locked, stale locks, 
network failures, etc.
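The notification half of that protocol is small; here is a stdlib sketch of just the message exchange (names are illustrative — across two machines you would use a TCP socket instead of socketpair, and the exclusive-access part still has to come from the filesystem):

```python
import socket

def notify_updated(sock):
    """Writer side: signal that the shapefile has been rewritten and closed."""
    sock.sendall(b"updated\n")

def wait_for_update(sock):
    """Reader side: block until the writer says the file is safe to reopen."""
    buf = b""
    while not buf.endswith(b"\n"):
        chunk = sock.recv(64)
        if not chunk:
            raise ConnectionError("writer went away")
        buf += chunk
    return buf.strip().decode()
```

On receiving "updated", the reader opens the shapefile, reads it, closes it, and loops back to waiting.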


-Steve

On 6/26/2019 4:00 PM, Even Rouault wrote:

Shayne,


I have a vector GDALDataset that is shared between two applications running
on different machines. Both applications call GDALOpenEx(...) to open the
shapefile.shp with flags GDAL_OF_VECTOR and GDAL_OF_SHARED flags set to
open the dataset. The dataset only has one layer associated with it and the
shapefile format is the ESRI shapefile.


GDAL_OF_SHARED is for sharing the same dataset handle in the same process, not
for sharing datasets from several machines.


After opening the dataset in both applications, the first application
creates a new feature and adds it to the layer. The resulting shapefile on
disk shows that this is working when I inspect the contents of the
shapefile with ogrinfo -al shapefile.shp. The new feature has been added as
expected with a feature count = 1.

Is there a way to update the second application to get the changes to the
layer made by the first application? I've tried doing layer->SyncToDisk()
in the second application but this has no effect. Do I have to close the
layer/dataset and reopen it in the second application to get the changes
made by the first application?

Yes, GDAL doesn't handle synchronization/concurrent editing of shapefiles (or
any other file based formats), so if a file has been changed outside of GDAL,
you have to close and reopen the dataset to be sure that GDAL sees the changes
(sometimes you might see them, or some of them, while keeping the dataset
open, but it would be too fragile to rely on that)

Even





Re: [gdal-dev] Help with band pixel functions

2019-06-24 Thread Stephen Woodbridge

Hi Adriana,

Here are a few ideas:

import numpy as np

ima = gdal.Open(filea, 0)
banda = ima.GetRasterBand(1)
dataa = banda.ReadAsArray()

print type(dataa), dataa.shape, isinstance(dataa, np.ma.MaskedArray)

imb = gdal.Open(fileb, 0)
bandb = imb.GetRasterBand(1)
datab = bandb.ReadAsArray()

print type(datab), datab.shape, isinstance(datab, np.ma.MaskedArray)

So if you have masked arrays where the fill_value=255, i.e. your nodata 
value.


The mask is True for nodata so

weights = dataa.mask | datab.mask
np.average([dataa, datab], axis=0, weights=weights)

You might need to invert the weights in which case try weights=~weights 
instead.


You might need to do this in three operations using appropriate masks:
1) copy dataa values that are not masked but are in datab
2) copy datab values that are not masked but are in dataa
3) copy the average of dataa and datab where they are both not masked.
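For what it's worth, those three operations collapse into one if numpy.ma does the bookkeeping: the mean of a masked stack ignores masked cells, which gives exactly the copy/copy/average behavior (a sketch, assuming both bands are masked arrays of the same shape):

```python
import numpy as np

def mosaic_pair(a, b):
    """Average where both arrays have data, keep the valid value where
    only one does, and leave cells masked where both are nodata."""
    return np.ma.stack([a, b]).mean(axis=0)
```

The result is itself a masked array, so the combined nodata mask comes along for free when writing the band back out.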

Another option might be to use numpy.vectorize() to apply a function to 
each cell in an array:

https://docs.scipy.org/doc/numpy/reference/generated/numpy.vectorize.html

-Steve

On 6/24/2019 10:38 PM, Adriana Parra wrote:

Hello,

I have been trying to mosaic two images that partially overlap using 
band pixel functions in Python. My images contain unsigned integers 
(uint8) and the no data value is 255. I considered using the nanmean 
function so that I would get average values in overlapping areas of 
the two images and maintain the original value of the non-overlapping 
areas, but for some reason the NAN values are being interpreted as 
zero, resulting in half of the original value in non-overlapping 
areas. I would really appreciate any feedback. I am using GDAL 3.1.0 
and Python 3.6. Below is the code I used, and the input files are 
attached.

gdalbuildvrt mosaic1.vrt -input_file_list input_files.txt -srcnodata "255"


import numpy as np
def average(in_ar, out_ar, xoff, yoff, xsize, ysize,
            raster_xsize, raster_ysize, buf_radius, gt, **kwargs):
    np.round_(np.clip(np.nanmean(in_ar, axis=0, dtype='uint8'), 0, 255),
              out=out_ar)

gdal_translate --config GDAL_VRT_ENABLE_PYTHON YES mosaic1.vrt mosaic1.tif

Attachments: Mosaic1.jpeg, raw_files.jpeg, test_files.7z






Re: [gdal-dev] How do I get lat long extents from this Ogrinfo output on a .shp?

2019-06-20 Thread Stephen Woodbridge

Hi Paul,

It's not just the extents; all the data in the file is in a UTM 
projection. You can reproject the file to WGS84, and then ogrinfo will 
report in long/lat.


ogr2ogr -t_srs EPSG:4326 OldStreamsPolyline-wgs84.shp OldStreamsPolyline.shp
ogrinfo -so OldStreamsPolyline-wgs84.shp OldStreamsPolyline-wgs84

-Steve W

On 6/20/2019 4:26 PM, Dante, Paul wrote:


I ran ogrinfo on a shapefile and got the output below. How do I convert 
the extent listed into lat/long coordinates? It looks like the units 
are meters, but I’m not sure how that will translate to lat/long for 
creating a bounding box:


INFO: Open of `OldStreamsPolyline.shp'

  using driver `ESRI Shapefile' successful.

Layer name: OldStreamsPolyline

Metadata:

DBF_DATE_LAST_UPDATE=2011-02-17

Geometry: Line String

Feature Count: 179

Extent: (482246.857278, 5449862.717375) - (498293.801839, 5462019.990072)

Layer SRS WKT:

PROJCS["NAD83 / UTM zone 10N",

GEOGCS["NAD83",

DATUM["North_American_Datum_1983",

SPHEROID["GRS 1980",6378137,298.257222101,

AUTHORITY["EPSG","7019"]],

TOWGS84[0,0,0,0,0,0,0],

AUTHORITY["EPSG","6269"]],

PRIMEM["Greenwich",0,

  AUTHORITY["EPSG","8901"]],

UNIT["degree",0.0174532925199433,

AUTHORITY["EPSG","9122"]],

AUTHORITY["EPSG","4269"]],

PROJECTION["Transverse_Mercator"],

PARAMETER["latitude_of_origin",0],

PARAMETER["central_meridian",-123],

PARAMETER["scale_factor",0.9996],

PARAMETER["false_easting",50],

PARAMETER["false_northing",0],

UNIT["metre",1,

AUTHORITY["EPSG","9001"]],

AXIS["Easting",EAST],

AXIS["Northing",NORTH],

AUTHORITY["EPSG","26910"]]

Id: Integer (6.0)

Name: String (50.0)

Region: String (2.0)

Type: String (50.0)

Paul Dante

Software Developer - Geodisy

Walter C. Koerner Library | UBC Library

The University of British Columbia | Vancouver Campus
219-1958 Main Mall | Vancouver BC | V6T 1Z2 Canada

Phone 604 827 5129

paul.da...@ubc.ca

UBC Vancouver is located on the traditional, ancestral, unceded 
territory of the xʷməθkʷəy̓əm (Musqueam) people





[gdal-dev] Is there an easy way to clip an image to the real data?

2019-06-20 Thread Stephen Woodbridge

Hi,

I'm working with VIIRS L3U images of L2P (level 2 patches). The L3U data 
is gridded to +-180 x +-90 but each patch only fills a small 
percentage of that. I'm compositing the patches for a day using a VRT.


Performance is an issue because all the patches cover the whole globe 
instead of the extents of the patch.


Is there a way to automatically clip an image to the extents of the 
patch, where all the rest of the image is filled with nodata? I.e., remove 
all rows and columns from the exterior edges that are all nodata and 
stop when you hit real data.


-Steve


[gdal-dev] Stuck with python gdal.Grid throwing an error

2019-06-14 Thread Stephen Woodbridge

Hi,

I can't figure out what I'm doing wrong, but gdal.Grid() keeps throwing 
an error and I've run out of things to try to work around the issue.


On Ubuntu using these packages:
gdal-bin:amd64/xenial 2.2.2+dfsg-1~xenial1 uptodate
gdal-data:all/xenial 2.2.2+dfsg-1~xenial1 uptodate
libgdal1i:amd64/xenial 1.11.3+dfsg-3build2 uptodate
libgdal20:amd64/xenial 2.2.2+dfsg-1~xenial1 uptodate
python-gdal:amd64/xenial 2.2.2+dfsg-1~xenial1 uptodate

Error message is:

Traceback (most recent call last):
  File "./process-jason3-ssha.py", line 322, in 
    Main()
  File "./process-jason3-ssha.py", line 316, in Main
    processFiles(mf, urlList)
  File "./process-jason3-ssha.py", line 242, in processFiles
    err = gdal.Grid(dataset, ds, options=opts)
  File "/usr/lib/python2.7/dist-packages/osgeo/gdal.py", line 968, in Grid
    return GridInternal(destName, srcDS, opts, callback, callback_data)
  File "/usr/lib/python2.7/dist-packages/osgeo/gdal.py", line 3229, in 
GridInternal

    return _gdal.GridInternal(*args)
RuntimeError: not a string

Code looks like this (non-essentials edited out):

    track = Dataset(fn, "r")
    track.set_auto_mask(False)

    lat = track.variables['lat']
    lon = track.variables['lon']
    ssh = track.variables['ssha']

    cnt = lat.shape[0]

    ds = ogr.GetDriverByName('Memory').CreateDataSource('wrk')
    lyr = ds.CreateLayer('track', srs=SRS)

    opts = gdal.GridOptions(algorithm='linear:radius=50:nodata=32767',
                            layers='track')

    # QUESTION: should I create point features for NoData track points??


    for n in range(cnt):
    wkt = "POINT({} {} {})".format(lon[n]-180., lat[n], ssh[n])
    feat = ogr.Feature(lyr.GetLayerDefn())
    feat.SetGeometryDirectly(ogr.Geometry(wkt=wkt))
    lyr.CreateFeature(feat)

    # dataset is a gtiff opened with GA_Update

    # also tried this
    #err = gdal.Grid(dataset, lyr, options=opts)

    err = gdal.Grid(dataset, ds, options=opts)
    if err != 0:
    print 'gdalGrid error: ', err
    return

The above code is basically the same as what I use for 
gdal.RasterizeLayer() and that worked fine.


-Steve


Re: [gdal-dev] Question on python gdal.Rasterize()

2019-06-14 Thread Stephen Woodbridge
OK, after some more research I found the rasterize.py autotest file, 
which was a huge help, and I have code working that burns the tracks into 
an image. But it's not exactly what I was hoping for, so I'll give 
gdal.Grid() a try.


Thanks,
  -Steve

On 6/14/2019 1:58 PM, Stephen Woodbridge wrote:

Hi all,

My goal is to take satellite track data and create a gtiff file using 
Python.
The satellite data is in a NetCDF file, which I can read in Python and 
which has variables lat, lon, ssha. There is a continuous stream of 
NetCDF data over time, so I plan to just keep loading the files as they 
become available.


My thought is to take adjacent track points as a LineStringZ and burn 
them into the image overwriting any existing data. This will keep the 
gtiff up to date with the most current data for any given area.


Maybe there is a better way to do this? suggestions welcome.

I was looking at gdal.RasterizeOptions() and gdal.Rasterize(), and my 
thought was that I could create features with geometry like (lon, lat, 
ssha) and then rasterize them into the image using the option 
"useZ=True", but this seems to imply that I need to have an OGR source 
for the vector data. It would be much more convenient to be able to 
just pass features from Python to the Rasterize() function.


I see callback and callback_data in the options; can these be used for 
that? How?


Rasterize(destNameOrDestDS, srcDS, **kwargs)

I can open the gtiff file and then pass the handle of that to 
destNameOrDestDS.
Can I create an in-memory srcDS that I've loaded with the track 
segments? The NetCDF files have about 5-6000 track points in them, so 
an equivalent number of LineStringZ features.


Is there an example of something similar you can link me to?

Thanks,
  -Steve




[gdal-dev] Question on python gdal.Rasterize()

2019-06-14 Thread Stephen Woodbridge

Hi all,

My goal is to take satellite track data and create a gtiff file using 
Python.
The satellite data is in a NetCDF file, which I can read in Python and 
which has variables lat, lon, ssha. There is a continuous stream of NetCDF 
data over time, so I plan to just keep loading the files as they become 
available.


My thought is to take adjacent track points as a LineStringZ and burn 
them into the image overwriting any existing data. This will keep the 
gtiff up to date with the most current data for any given area.
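The segment-building step might look like this — pairing adjacent (lon, lat, ssha) points into LINESTRING Z WKT that can then be turned into OGR features (a sketch; the function name and the detour through WKT are mine):

```python
def track_segments(lons, lats, vals):
    """Pair adjacent (lon, lat, z) track points into LINESTRING Z WKT,
    one segment per consecutive pair of points."""
    return [
        "LINESTRING Z ({} {} {}, {} {} {})".format(
            lons[i], lats[i], vals[i],
            lons[i + 1], lats[i + 1], vals[i + 1])
        for i in range(len(lons) - 1)
    ]
```

Each WKT string can be fed to ogr.CreateGeometryFromWkt() and attached to a feature in an in-memory layer before rasterizing.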


Maybe there is a better way to do this? suggestions welcome.

I was looking at gdal.RasterizeOptions() and gdal.Rasterize(), and my 
thought was that I could create features with geometry like (lon, lat, ssha) 
and then rasterize them into the image using the option "useZ=True", but 
this seems to imply that I need to have an OGR source for the vector 
data. It would be much more convenient to be able to just pass features 
from Python to the Rasterize() function.


I see callback and callback_data in the options; can these be used for 
that? How?


Rasterize(destNameOrDestDS, srcDS, **kwargs)

I can open the gtiff file and then pass the handle of that to 
destNameOrDestDS.
Can I create an in-memory srcDS that I've loaded with the track 
segments? The NetCDF files have about 5-6000 track points in them, so an 
equivalent number of LineStringZ features.


Is there an example of something similar you can link me to?

Thanks,
  -Steve


Re: [gdal-dev] Question of file ordering in VRT

2019-03-07 Thread Stephen Woodbridge

Thanks Even and Ivan,

Good idea to build the list and use that for gdalbuildvrt.
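Since the wildcard expansion isn't guaranteed to be sorted, building the list explicitly might look like this (a sketch; assumes timestamp-prefixed names, which sort chronologically as plain strings):

```python
import glob

def build_vrt_file_list(pattern, list_path):
    """Write a lexically sorted file list for gdalbuildvrt -input_file_list,
    so the newest file ends up last and its pixels take precedence."""
    tifs = sorted(glob.glob(pattern))
    with open(list_path, "w") as f:
        f.writelines(name + "\n" for name in tifs)
    return tifs
```

Then: gdalbuildvrt -input_file_list input_files.txt day.vrt.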

-Steve

On 3/7/2019 11:42 AM, Even Rouault wrote:

On jeudi 7 mars 2019 11:30:04 CET Stephen Woodbridge wrote:

Hi all,

When I have multiple overlapping files in a VRT, which pixel takes
precedence? I'm assuming that the later files in the VRT would be the ones
presented. For example, I have multiple satellite images with names like
MMDDHHMM-.tif, so when I use

gdalbuildvrt MMDD.vrt MMDD*.tif

and then view that, in say mapserver, am I seeing the newest images or
the oldest image when 2+ images overlap in any given area?

Steve,

The last source mentioned in the VRT will take precedence over the previous
overlapping ones.

I'm not sure however that the wildcard expansion will guarantee that the files
will be sorted.

Even





[gdal-dev] Question of file ordering in VRT

2019-03-07 Thread Stephen Woodbridge

Hi all,

When I have multiple overlapping files in a VRT, which pixel takes 
precedence? I'm assuming that the later files in the VRT would be the ones 
presented. For example, I have multiple satellite images with names like 
MMDDHHMM-.tif, so when I use

gdalbuildvrt MMDD.vrt MMDD*.tif

and then view that, in say mapserver, am I seeing the newest images or 
the oldest image when 2+ images overlap in any given area?


Thanks,
  -Steve


[gdal-dev] NOAA satellite swath NetCDF file to georeferenced Gtiff?

2019-03-05 Thread Stephen Woodbridge

Hi All,

Can someone please give me a hint on how to convert a NOAA satellite 
swath NetCDF file to a georeferenced GeoTIFF?

I found these:
  https://trac.osgeo.org/gdal/ticket/4513
  https://trac.osgeo.org/gdal/wiki/rfc4_geolocate
which implies it cannot be done directly with gdal yet.
Is there another way to do this, maybe using Python, if GDAL can't handle it?

-Steve W

$ gdalinfo --version
GDAL 1.11.3, released 2015/09/16

$ gdalinfo 
NETCDF:2019030420-OSPO-L2P_GHRSST-SSTsubskin-AVHRR18_G-ACSPO_V2.41-v02.0-fv01.0.nc:sea_surface_temperature

Warning 1: dimension #2 (ni) is not a Longitude/X dimension.
Warning 1: dimension #1 (nj) is not a Latitude/Y dimension.
Driver: netCDF/Network Common Data Format
Files: 
2019030420-OSPO-L2P_GHRSST-SSTsubskin-AVHRR18_G-ACSPO_V2.41-v02.0-fv01.0.nc

Size is 409, 7200
Coordinate System is `'
Metadata:
  NC_GLOBAL#acknowledgment=Please acknowledge the use of these data 
with the following statement: These data were provided by Group for High 
Resolution Sea Surface Temperature (GHRSST) and the National Oceanic and 
Atmospheric Administration (NOAA).

  NC_GLOBAL#cdm_data_type=swath
  NC_GLOBAL#comment=none
  NC_GLOBAL#Conventions=CF-1.6
  NC_GLOBAL#creator_email=alex.igna...@noaa.gov
  NC_GLOBAL#creator_name=Alex Ignatov
  NC_GLOBAL#creator_url=http://www.star.nesdis.noaa.gov
  NC_GLOBAL#date_created=20190304T214051Z
  NC_GLOBAL#easternmost_longitude=180
  NC_GLOBAL#file_quality_level=3
  NC_GLOBAL#gds_version_id=02.0
  NC_GLOBAL#geospatial_bounds=POLYGON((   3.520  14.209,  176.005 
20.284, -156.649  15.860,  -23.493   9.963,    3.520  14.209))

  NC_GLOBAL#geospatial_first_scanline_first_fov_lat=20.283962
  NC_GLOBAL#geospatial_first_scanline_first_fov_lon=176.00494
  NC_GLOBAL#geospatial_first_scanline_last_fov_lat=15.859998
  NC_GLOBAL#geospatial_first_scanline_last_fov_lon=-156.64908
  NC_GLOBAL#geospatial_last_scanline_first_fov_lat=14.20918
  NC_GLOBAL#geospatial_last_scanline_first_fov_lon=3.5202866
  NC_GLOBAL#geospatial_last_scanline_last_fov_lat=9.9625263
  NC_GLOBAL#geospatial_last_scanline_last_fov_lon=-23.492666
  NC_GLOBAL#geospatial_lat_resolution=0.009899
  NC_GLOBAL#geospatial_lat_units=degrees_north
  NC_GLOBAL#geospatial_lon_resolution=0.009899
  NC_GLOBAL#geospatial_lon_units=degrees_east
  NC_GLOBAL#history=Created by Advanced Clear-Sky Processor for Oceans 
(ACSPO)-AVHRR at NOAA/NESDIS/OSPO.

  NC_GLOBAL#id=AVHRR18_G-OSPO-L2P-v2.41
  NC_GLOBAL#institution=NOAA/NESDIS/OSPO
  NC_GLOBAL#keywords=Oceans > Ocean Temperature > Sea Surface Temperature
  NC_GLOBAL#keywords_vocabulary=NASA Global Change Master Directory 
(GCMD) Science Keywords

  NC_GLOBAL#license=GHRSST protocol describes data use as free and open
  NC_GLOBAL#Metadata_Conventions=Unidata Dataset Discovery v1.0
NC_GLOBAL#metadata_link=http://podaac.jpl.nasa.gov/ws/metadata/dataset/?format=iso=AVHRR18_G-OSPO-L2P-v2.41
  NC_GLOBAL#naming_authority=org.ghrsst
  NC_GLOBAL#netcdf_version_id=4.3.2 of Sep 24 2015 08:51:38 $
  NC_GLOBAL#northernmost_latitude=20.283962
  NC_GLOBAL#platform=NOAA-18
  NC_GLOBAL#processing_level=L2P
  NC_GLOBAL#product_version=2.41
  NC_GLOBAL#project=Group for High Resolution Sea Surface Temperature
  NC_GLOBAL#publisher_email=ghrsst...@nceo.ac.uk
  NC_GLOBAL#publisher_name=The GHRSST Project Office
  NC_GLOBAL#publisher_url=http://www.ghrsst.org
  NC_GLOBAL#references=Data convention: GHRSST Data Specification (GDS) 
v2.0. Algorithms: ACSPO-AVHRR ATBD (NOAA/NESDIS/STAR)

  NC_GLOBAL#sensor=AVHRR_GAC
NC_GLOBAL#source=AVHRR_L1b,CMC0.2deg-CMC-L4-GLOB-v2.0,NOAA-NCEP-GFS
  NC_GLOBAL#southernmost_latitude=-90
  NC_GLOBAL#spatial_resolution=1.1 km at nadir
  NC_GLOBAL#standard_name_vocabulary=CF Standard Name Table (v26, 08 
November 2013)

  NC_GLOBAL#start_time=20190304T20Z
  NC_GLOBAL#stop_time=20190304T205959Z
  NC_GLOBAL#summary=Sea surface temperature retrievals produced by 
NOAA/NESDIS/OSPO office from AVHRR sensor

  NC_GLOBAL#time_coverage_end=20190304T205959Z
  NC_GLOBAL#time_coverage_start=20190304T20Z
  NC_GLOBAL#title=AVHRR L2P SST
  NC_GLOBAL#uuid=2eaa373c-3ec6-11e9-ac15-f9642917379f
  NC_GLOBAL#westernmost_longitude=-180
  NETCDF_DIM_EXTRA={time}
  NETCDF_DIM_time_DEF={1,4}
  NETCDF_DIM_time_VALUES=1204574400
  sea_surface_temperature#_FillValue=-32768
  sea_surface_temperature#add_offset=273.14999
  sea_surface_temperature#comment=SST obtained by regression with buoy 
measurements

  sea_surface_temperature#coordinates=lon lat
  sea_surface_temperature#long_name=sea surface skin temperature
  sea_surface_temperature#scale_factor=0.009998
  sea_surface_temperature#source=NOAA
sea_surface_temperature#standard_name=sea_surface_skin_temperature
  sea_surface_temperature#units=kelvin
  sea_surface_temperature#valid_max=32767
  sea_surface_temperature#valid_min=-32767
  time#axis=T
  time#calendar=Gregorian
  time#comment=seconds since 1981-01-01 00:00:00
  time#long_name=reference time of sst file
  time#standard_name=time
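Short of geolocation-array support in this GDAL version, one workaround is to bin the swath samples onto a regular lat/lon grid in Python and attach a plain geotransform afterwards. A crude nearest-cell sketch (my own approach, not a GDAL feature; later samples overwrite earlier ones and gaps stay nodata):

```python
import numpy as np

def swath_to_grid(lats, lons, vals, res=0.01):
    """Drop each swath sample into the nearest cell of a regular grid.
    Returns the grid (row 0 = north) and a GDAL-style geotransform."""
    lats, lons, vals = map(np.asarray, (lats, lons, vals))
    rows = ((lats.max() - lats) / res).astype(int)   # north-up row index
    cols = ((lons - lons.min()) / res).astype(int)
    grid = np.full((rows.max() + 1, cols.max() + 1), np.nan)
    grid[rows, cols] = vals
    gt = (lons.min(), res, 0.0, lats.max(), 0.0, -res)
    return grid, gt
```

The grid and geotransform can then be written to a GeoTIFF with the usual driver.Create() / SetGeoTransform() / WriteArray() sequence; for a less gappy result, gdal.Grid-style interpolation over the swath points would be the next step up.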
  

Re: [gdal-dev] gdalwarp ERROR 6: PHOTOMETRIC=YCBCR requires a source raster with only 3 bands (RGB)

2018-07-01 Thread Stephen Woodbridge
OK, digging deep into my old emails I think I found a solution (still 
to be proven):


gdalwarp -t_srs EPSG:4326 -r bilinear -co TILED=YES -co BIGTIFF=YES 
-dstalpha -multi  '00010.jp2' '/data/satmap/tmp.tif'


This took 39 min and expanded the 2.1G jp2 into an 87G tif; this then gets 
followed by:


gdal_translate -co TILED=YES -co JPEG_QUALITY=90 -co COMPRESS=JPEG -co 
PHOTOMETRIC=YCBCR -b 1 -b 2 -b 3 -mask 4 --config 
GDAL_TIFF_INTERNAL_MASK YES /data/satmap/tmp.tif /data/satmap/tmp/00010.tif


Which takes 53 min and compresses it into a 1.8G tif. Then add overviews:

gdaladdo 00010.tif 2 4 8 16 32 64 128 256

Which takes 51 min and the file size goes to 2.6G.

This two-step process takes a lot of disk space and processing time. It 
would seem that this could be optimized if the warp and translate 
functionality were merged into a single command.


-Steve

On 7/1/2018 11:18 AM, Stephen Woodbridge wrote:

Right, but point 1 is that the message says:
ERROR 6: PHOTOMETRIC=YCBCR requires a source raster with only 3 bands 
(RGB)
i.e. it requires a source raster with only 3 bands (RGB), and the source 
raster has only 3 bands!
-dstalpha, if I understand correctly, creates an alpha band in the 
output file. So the error message is misleading at best.


Point 2 is: how does one translate a UTM image to WGS84 using YCBCR and 
generate an appropriate mask or alpha channel to hide the pixels that 
are outside the transformed image? I suppose it needs to be done in 
multiple steps of warp and translate in some specific order with 
specific options, but it's not obvious how.


-Steve

On 7/1/2018 10:14 AM, Michael Smith wrote:

Steve,

Setting that alpha channel makes it 4 bands. RGB + alpha channel. 
Either remove the ycbcr or the alpha channel.


Michael Smith
Remote Sensing/GIS Center
US Army Corps of Engineers

On Jul 1, 2018, at 10:07 AM, Stephen Woodbridge 
 wrote:


Hi all,

I'm trying to convert a UTM jp2 file to a WGS84 tif file and getting 
"ERROR 6: PHOTOMETRIC=YCBCR requires a source raster with only 3 
bands (RGB)" but the src file has only 3 RGB bands.


gdalwarp -t_srs EPSG:4326 -dstalpha -r bilinear -of GTiff -co 
BIGTIFF=YES -co TILED=YES -co COMPRESS=JPEG -co JPEG_QUALITY=90 -co 
PHOTOMETRIC=YCBCR '00010.jp2' 'tif/00010.tif'


The command works if I remove -dstalpha, so it seems that the error 
message is erroneous and might need to be fixed. Regardless, I don't want 
the transformed image to have black around it, so how do I do that?


-Steve

# gdalinfo 00010.jp2
Driver: JP2OpenJPEG/JPEG-2000 driver based on OpenJPEG library
Files: 00010.jp2
    00010.jp2.aux.xml
Size is 340668, 65335
Coordinate System is:
PROJCS["UTM_Zone_38_Northern_Hemisphere",
 GEOGCS["GCS_WGS_1984",
 DATUM["WGS84",
 SPHEROID["WGS84",6378137,298.257223563]],
 PRIMEM["Greenwich",0],
 UNIT["Degree",0.017453292519943295]],
 PROJECTION["Transverse_Mercator"],
 PARAMETER["latitude_of_origin",0],
 PARAMETER["central_meridian",45],
 PARAMETER["scale_factor",0.9996],
 PARAMETER["false_easting",50],
 PARAMETER["false_northing",0],
 UNIT["Meter",1]]
Origin = (265998.000,1848002.000)
Pixel Size = (3.000,-3.000)
Image Structure Metadata:
   INTERLEAVE=PIXEL
Corner Coordinates:
Upper Left  (  265998.000, 1848002.000) ( 42d48'19.81"E, 16d42'11.25"N)
Lower Left  (  265998.000, 1651997.000) ( 42d49'28.30"E, 14d55'56.73"N)
Upper Right ( 1288002.000, 1848002.000) ( 52d22'11.02"E, 16d35' 0.58"N)
Lower Right ( 1288002.000, 1651997.000) ( 52d18'23.31"E, 14d49'33.87"N)
Center  (  777000.000, 174.500) ( 47d35' 9.06"E, 15d48'48.25"N)
Band 1 Block=1024x1024 Type=Byte, ColorInterp=Red
   Overviews: 170334x32667, 85167x16333, 42583x8166, 21291x4083, 
10645x2041

   Overviews: arbitrary
   Image Structure Metadata:
 COMPRESSION=JPEG2000
Band 2 Block=1024x1024 Type=Byte, ColorInterp=Green
   Overviews: 170334x32667, 85167x16333, 42583x8166, 21291x4083, 
10645x2041

   Overviews: arbitrary
   Image Structure Metadata:
 COMPRESSION=JPEG2000
Band 3 Block=1024x1024 Type=Byte, ColorInterp=Blue
   Overviews: 170334x32667, 85167x16333, 42583x8166, 21291x4083, 
10645x2041

   Overviews: arbitrary
   Image Structure Metadata:
 COMPRESSION=JPEG2000








Re: [gdal-dev] gdalwarp ERROR 6: PHOTOMETRIC=YCBCR requires a source raster with only 3 bands (RGB)

2018-07-01 Thread Stephen Woodbridge

Right, but point 1 is that the message says:
ERROR 6: PHOTOMETRIC=YCBCR requires a source raster with only 3 bands (RGB)
i.e. it requires a source raster with only 3 bands (RGB), and the source 
raster has only 3 bands!
-dstalpha, if I understand correctly, creates an alpha band in the 
output file. So the error message is misleading at best.


Point 2 is: how does one translate a UTM image to WGS84 using YCBCR and 
generate an appropriate mask or alpha channel to hide the pixels that 
are outside the transformed image? I suppose it needs to be done in 
multiple steps of warp and translate in some specific order with 
specific options, but it's not obvious how.


-Steve

On 7/1/2018 10:14 AM, Michael Smith wrote:

Steve,

Setting that alpha channel makes it 4 bands. RGB + alpha channel. Either remove 
the ycbcr or the alpha channel.

Michael Smith
Remote Sensing/GIS Center
US Army Corps of Engineers


On Jul 1, 2018, at 10:07 AM, Stephen Woodbridge  
wrote:

Hi all,

I'm trying to convert a UTM jp2 file to a WGS84 tif file and getting "ERROR 6: 
PHOTOMETRIC=YCBCR requires a source raster with only 3 bands (RGB)" but the src file 
has only 3 RGB bands.

gdalwarp -t_srs EPSG:4326 -dstalpha -r bilinear -of GTiff -co BIGTIFF=YES -co 
TILED=YES -co COMPRESS=JPEG -co JPEG_QUALITY=90 -co PHOTOMETRIC=YCBCR 
'00010.jp2' 'tif/00010.tif'

The command works if I remove -dstalpha, so it seems that the error message is 
erroneous and might need to be fixed. Regardless, I don't want the transformed image 
to have black around it, so how do I do that?

-Steve

# gdalinfo 00010.jp2
Driver: JP2OpenJPEG/JPEG-2000 driver based on OpenJPEG library
Files: 00010.jp2
00010.jp2.aux.xml
Size is 340668, 65335
Coordinate System is:
PROJCS["UTM_Zone_38_Northern_Hemisphere",
 GEOGCS["GCS_WGS_1984",
 DATUM["WGS84",
 SPHEROID["WGS84",6378137,298.257223563]],
 PRIMEM["Greenwich",0],
 UNIT["Degree",0.017453292519943295]],
 PROJECTION["Transverse_Mercator"],
 PARAMETER["latitude_of_origin",0],
 PARAMETER["central_meridian",45],
 PARAMETER["scale_factor",0.9996],
 PARAMETER["false_easting",50],
 PARAMETER["false_northing",0],
 UNIT["Meter",1]]
Origin = (265998.000,1848002.000)
Pixel Size = (3.000,-3.000)
Image Structure Metadata:
   INTERLEAVE=PIXEL
Corner Coordinates:
Upper Left  (  265998.000, 1848002.000) ( 42d48'19.81"E, 16d42'11.25"N)
Lower Left  (  265998.000, 1651997.000) ( 42d49'28.30"E, 14d55'56.73"N)
Upper Right ( 1288002.000, 1848002.000) ( 52d22'11.02"E, 16d35' 0.58"N)
Lower Right ( 1288002.000, 1651997.000) ( 52d18'23.31"E, 14d49'33.87"N)
Center  (  777000.000, 174.500) ( 47d35' 9.06"E, 15d48'48.25"N)
Band 1 Block=1024x1024 Type=Byte, ColorInterp=Red
   Overviews: 170334x32667, 85167x16333, 42583x8166, 21291x4083, 10645x2041
   Overviews: arbitrary
   Image Structure Metadata:
 COMPRESSION=JPEG2000
Band 2 Block=1024x1024 Type=Byte, ColorInterp=Green
   Overviews: 170334x32667, 85167x16333, 42583x8166, 21291x4083, 10645x2041
   Overviews: arbitrary
   Image Structure Metadata:
 COMPRESSION=JPEG2000
Band 3 Block=1024x1024 Type=Byte, ColorInterp=Blue
   Overviews: 170334x32667, 85167x16333, 42583x8166, 21291x4083, 10645x2041
   Overviews: arbitrary
   Image Structure Metadata:
 COMPRESSION=JPEG2000


---
This email has been checked for viruses by Avast antivirus software.
https://www.avast.com/antivirus


[gdal-dev] gdalwarp ERROR 6: PHOTOMETRIC=YCBCR requires a source raster with only 3 bands (RGB)

2018-07-01 Thread Stephen Woodbridge

Hi all,

I'm trying to convert a UTM jp2 file to a WGS84 tif file and getting 
"ERROR 6: PHOTOMETRIC=YCBCR requires a source raster with only 3 bands 
(RGB)", but the src file has only 3 RGB bands.


gdalwarp -t_srs EPSG:4326 -dstalpha -r bilinear -of GTiff -co 
BIGTIFF=YES -co TILED=YES -co COMPRESS=JPEG -co JPEG_QUALITY=90 -co 
PHOTOMETRIC=YCBCR '00010.jp2' 'tif/00010.tif'


The command works if I remove -dstalpha, so it seems that the error 
message is erroneous and might need to be fixed. Regardless, I don't want 
the transformed image to have black around it, so how do I do that?


-Steve

# gdalinfo 00010.jp2
Driver: JP2OpenJPEG/JPEG-2000 driver based on OpenJPEG library
Files: 00010.jp2
   00010.jp2.aux.xml
Size is 340668, 65335
Coordinate System is:
PROJCS["UTM_Zone_38_Northern_Hemisphere",
    GEOGCS["GCS_WGS_1984",
    DATUM["WGS84",
    SPHEROID["WGS84",6378137,298.257223563]],
    PRIMEM["Greenwich",0],
    UNIT["Degree",0.017453292519943295]],
    PROJECTION["Transverse_Mercator"],
    PARAMETER["latitude_of_origin",0],
    PARAMETER["central_meridian",45],
    PARAMETER["scale_factor",0.9996],
    PARAMETER["false_easting",50],
    PARAMETER["false_northing",0],
    UNIT["Meter",1]]
Origin = (265998.000,1848002.000)
Pixel Size = (3.000,-3.000)
Image Structure Metadata:
  INTERLEAVE=PIXEL
Corner Coordinates:
Upper Left  (  265998.000, 1848002.000) ( 42d48'19.81"E, 16d42'11.25"N)
Lower Left  (  265998.000, 1651997.000) ( 42d49'28.30"E, 14d55'56.73"N)
Upper Right ( 1288002.000, 1848002.000) ( 52d22'11.02"E, 16d35' 0.58"N)
Lower Right ( 1288002.000, 1651997.000) ( 52d18'23.31"E, 14d49'33.87"N)
Center  (  777000.000, 174.500) ( 47d35' 9.06"E, 15d48'48.25"N)
Band 1 Block=1024x1024 Type=Byte, ColorInterp=Red
  Overviews: 170334x32667, 85167x16333, 42583x8166, 21291x4083, 10645x2041
  Overviews: arbitrary
  Image Structure Metadata:
    COMPRESSION=JPEG2000
Band 2 Block=1024x1024 Type=Byte, ColorInterp=Green
  Overviews: 170334x32667, 85167x16333, 42583x8166, 21291x4083, 10645x2041
  Overviews: arbitrary
  Image Structure Metadata:
    COMPRESSION=JPEG2000
Band 3 Block=1024x1024 Type=Byte, ColorInterp=Blue
  Overviews: 170334x32667, 85167x16333, 42583x8166, 21291x4083, 10645x2041
  Overviews: arbitrary
  Image Structure Metadata:
    COMPRESSION=JPEG2000



Re: [gdal-dev] Having trouble working with jp2 files

2018-06-20 Thread Stephen Woodbridge

Even,

My bad, I ran the wrong container. I just reran the test and it works with 
GDAL 2.2.4 and openjpeg 2.1.2. And this also worked:


root@ab86ce382ef0:/usr/local# gdal_translate -of GTiff -co TILED=YES -co 
PHOTOMETRIC=YCBCR -co COMPRESS=JPEG /data/850011.jp2 /data/850011.tif

Input file size is 29364, 25856
0...10...20...30...40...50...60...70...80...90...100 - done.

I also built openjpeg-master.zip and GDAL 2.3.0 in the docker image, and 
that works as well and is much, much faster.


Thanks so much for your help!

Best regards,
  -Steve

On 6/20/2018 12:42 PM, Even Rouault wrote:

On mercredi 20 juin 2018 12:33:40 CEST Stephen Woodbridge wrote:

Even,

Thanks!

I did build GDAL 2.2.4 with openjpeg 2.1.2 in a docker container and
that solves the simple case of:

gdalinfo -stats /data/850011.jp2

but it still fails with:

root@436a09289084:/usr/local# gdal_translate -of GTiff -co TILED=YES
/data/850011.jp2 /data/850011.tif

A bit strange, but I didn't actually test with 2.1.2

You can download a zip of openjpeg master sources at:
https://codeload.github.com/uclouvain/openjpeg/zip/master

And you'd better use GDAL 2.3.0 with it.

Even





Re: [gdal-dev] Having trouble working with jp2 files

2018-06-20 Thread Stephen Woodbridge

Even,

Thanks!

I did build GDAL 2.2.4 with openjpeg 2.1.2 in a docker container and 
that solves the simple case of:


gdalinfo -stats /data/850011.jp2

but it still fails with:

root@436a09289084:/usr/local# gdal_translate -of GTiff -co TILED=YES 
/data/850011.jp2 /data/850011.tif

Input file size is 29364, 25856
0ERROR 1: psImage->comps[0].data == NULL
ERROR 1: /data/850011.jp2, band 1: IReadBlock failed at X offset 0, Y 
offset 0

ERROR 1: GetBlockRef failed at X block offset 0, Y block offset 0

In my Dockerfile, I have:

# Compile and install OpenJPEG
RUN cd src && tar -xvf openjpeg-${OPENJPEG_VERSION}.tar.gz \
    && cd openjpeg-${OPENJPEG_VERSION}/ \
    && mkdir build && cd build \
    && cmake .. -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=$ROOTDIR \
    && make && make install && make clean \
    && cd $ROOTDIR && rm -Rf src/openjpeg*

Any idea how that should change to apply your pull request so I can test 
that?
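One way to pick up the fix in the Dockerfile (a sketch; it assumes the pull request is merged into openjpeg master, that $ROOTDIR is set as in the existing Dockerfile, and that curl and unzip are available in the image) is to replace the release-tarball step with a build from the master zip:

```shell
# Compile and install OpenJPEG from master instead of a release tarball
RUN cd src \
    && curl -L -o openjpeg-master.zip \
       https://codeload.github.com/uclouvain/openjpeg/zip/master \
    && unzip openjpeg-master.zip \
    && cd openjpeg-master/ \
    && mkdir build && cd build \
    && cmake .. -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=$ROOTDIR \
    && make && make install && make clean \
    && cd $ROOTDIR && rm -Rf src/openjpeg*
```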


-Steve

$ docker run -v $(pwd):/data -t -i gdal2:local  /bin/bash
root@436a09289084:/usr/local# gdalinfo -stats /data/850011.jp2
Driver: JP2OpenJPEG/JPEG-2000 driver based on OpenJPEG library
Files: /data/850011.jp2
   /data/850011.jp2.aux.xml
Size is 29364, 25856
Coordinate System is:
GEOGCS["WGS 84",
    DATUM["WGS_1984",
    SPHEROID["WGS 84",6378137,298.257223563,
    AUTHORITY["EPSG","7030"]],
    AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0],
    UNIT["degree",0.0174532925199433],
    AUTHORITY["EPSG","4326"]]
Origin = (50.811767578125000,18.406654712112974)
Pixel Size = (0.20390753690,-0.20390753690)
Image Structure Metadata:
  INTERLEAVE=PIXEL
Corner Coordinates:
Upper Left  (  50.8117676,  18.4066547) ( 50d48'42.36"E, 18d24'23.96"N)
Lower Left  (  50.8117676,  17.8794314) ( 50d48'42.36"E, 17d52'45.95"N)
Upper Right (  51.4105217,  18.4066547) ( 51d24'37.88"E, 18d24'23.96"N)
Lower Right (  51.4105217,  17.8794314) ( 51d24'37.88"E, 17d52'45.95"N)
Center  (  51.446,  18.1430430) ( 51d 6'40.12"E, 18d 8'34.95"N)
Band 1 Block=1024x1024 Type=Byte, ColorInterp=Red
  Min=122.000 Max=254.000
  Minimum=122.000, Maximum=254.000, Mean=204.487, StdDev=9.596
  Overviews: 14682x12928, 7341x6464, 3670x3232, 1835x1616, 917x808
  Overviews: arbitrary
  Metadata:
    STATISTICS_MAXIMUM=254
    STATISTICS_MEAN=204.48710711705
    STATISTICS_MINIMUM=122
    STATISTICS_STDDEV=9.596017845156
  Image Structure Metadata:
    COMPRESSION=JPEG2000
Band 2 Block=1024x1024 Type=Byte, ColorInterp=Green
  Min=98.000 Max=239.000
  Minimum=98.000, Maximum=239.000, Mean=178.492, StdDev=10.197
  Overviews: 14682x12928, 7341x6464, 3670x3232, 1835x1616, 917x808
  Overviews: arbitrary
  Metadata:
    STATISTICS_MAXIMUM=239
    STATISTICS_MEAN=178.49220443461
    STATISTICS_MINIMUM=98
    STATISTICS_STDDEV=10.197171404633
  Image Structure Metadata:
    COMPRESSION=JPEG2000
Band 3 Block=1024x1024 Type=Byte, ColorInterp=Blue
  Min=63.000 Max=215.000
  Minimum=63.000, Maximum=215.000, Mean=145.743, StdDev=14.060
  Overviews: 14682x12928, 7341x6464, 3670x3232, 1835x1616, 917x808
  Overviews: arbitrary
  Metadata:
    STATISTICS_MAXIMUM=215
    STATISTICS_MEAN=145.74315841763
    STATISTICS_MINIMUM=63
    STATISTICS_STDDEV=14.060045889747
  Image Structure Metadata:
    COMPRESSION=JPEG2000
root@436a09289084:/usr/local# gdal_translate -of GTiff /data/850011.jp2 
/data/850011.tif

Input file size is 29364, 25856
0ERROR 1: psImage->comps[0].data == NULL
ERROR 1: /data/850011.jp2, band 1: IReadBlock failed at X offset 0, Y 
offset 0

ERROR 1: GetBlockRef failed at X block offset 0, Y block offset 0

root@436a09289084:/usr/local# gdal_translate -of GTiff -co TILED=YES 
/data/850011.jp2 /data/850011.tif

Input file size is 29364, 25856
0ERROR 1: psImage->comps[0].data == NULL
ERROR 1: /data/850011.jp2, band 1: IReadBlock failed at X offset 0, Y 
offset 0

ERROR 1: GetBlockRef failed at X block offset 0, Y block offset 0

On 6/20/2018 10:27 AM, Even Rouault wrote:

On mardi 19 juin 2018 23:07:24 CEST Stephen Woodbridge wrote:

I tried the following:

git clone https://github.com/GeographicaGS/Docker-GDAL2.git
cd Docker-GDAL2/2.2.4
#Edit Dockerfile and set 'ENV OPENJPEG_VERSION 2.3.0'
docker build -t gdal2:local ./
cd /path/to/jp2
docker run -t -i gdal2:local -v $(pwd):/data /bin/bash
gdalinfo -stats /data/850011.jp2

And got the same errors.

Even, I sent you a link to download the file from my dropbox.

Steve,

I've investigated and those JPEG2000 files have a particular characteristic
that triggered a regression dating back to openjpeg 2.2.0

I've just fixed it in openjpeg master per
https://github.com/uclouvain/openjpeg/pull/1121

So I'd suggest you to track openjpeg master (or revert back to 2.1.2, but
you'll lo

Re: [gdal-dev] Having trouble working with jp2 files

2018-06-19 Thread Stephen Woodbridge

I tried the following:

git clone https://github.com/GeographicaGS/Docker-GDAL2.git
cd Docker-GDAL2/2.2.4
#Edit Dockerfile and set 'ENV OPENJPEG_VERSION 2.3.0'
docker build -t gdal2:local ./
cd /path/to/jp2
docker run -t -i gdal2:local -v $(pwd):/data /bin/bash
gdalinfo -stats /data/850011.jp2

And got the same errors.

Even, I sent you a link to download the file from my dropbox.

-Steve

On 6/19/2018 5:11 PM, Even Rouault wrote:

On mardi 19 juin 2018 16:57:30 CEST Stephen Woodbridge wrote:

Hi all,

I'm having a problem reading jp2 files. My goal is to be able to
display them through mapserver, but I'm having problems with gdalinfo and
gdal_translate. I set up a docker image with the latest gdal2 in it like
the following. I have 1074 jp2 files and all that I have tried are
throwing the same errors as below. Any ideas on what is going on?

Stephen,

I see that
https://github.com/GeographicaGS/Docker-GDAL2/blob/master/2.3.0/Dockerfile
uses openjpeg 2.2.0, and not the latest openjpeg 2.3.0
As your images are large, in case they are single-tiled, it might be a failed
allocation in openjpeg 2.2. Upgrading to openjpeg 2.3.0 might help

Any link to one of those images ?

Even





[gdal-dev] Having trouble working with jp2 files

2018-06-19 Thread Stephen Woodbridge

Hi all,

I'm having a problem reading jp2 files. My goal is to be able to 
display them through mapserver, but I'm having problems with gdalinfo and 
gdal_translate. I set up a docker image with the latest gdal2 in it like 
the following. I have 1074 jp2 files and all that I have tried are 
throwing the same errors as below. Any ideas on what is going on?


-Steve

$ docker pull geographica/gdal2:latest
# cd to where 850011.* exists
# this runs the container and mounts pwd on /data in the container
$ docker run -t -i -v $(pwd):/data geographica/gdal2 /bin/bash

root@2730a320bc09:/usr/local# gdalinfo /data/850011.jp2
Driver: JP2OpenJPEG/JPEG-2000 driver based on OpenJPEG library
Files: /data/850011.jp2
Size is 29364, 25856
Coordinate System is:
GEOGCS["WGS 84",
    DATUM["WGS_1984",
    SPHEROID["WGS 84",6378137,298.257223563,
    AUTHORITY["EPSG","7030"]],
    AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0],
    UNIT["degree",0.0174532925199433],
    AUTHORITY["EPSG","4326"]]
Origin = (50.811767578125000,18.406654712112974)
Pixel Size = (0.20390753690,-0.20390753690)
Image Structure Metadata:
  INTERLEAVE=PIXEL
Corner Coordinates:
Upper Left  (  50.8117676,  18.4066547) ( 50d48'42.36"E, 18d24'23.96"N)
Lower Left  (  50.8117676,  17.8794314) ( 50d48'42.36"E, 17d52'45.95"N)
Upper Right (  51.4105217,  18.4066547) ( 51d24'37.88"E, 18d24'23.96"N)
Lower Right (  51.4105217,  17.8794314) ( 51d24'37.88"E, 17d52'45.95"N)
Center  (  51.446,  18.1430430) ( 51d 6'40.12"E, 18d 8'34.95"N)
Band 1 Block=1024x1024 Type=Byte, ColorInterp=Red
  Overviews: 14682x12928, 7341x6464, 3670x3232, 1835x1616, 917x808
  Overviews: arbitrary
  Image Structure Metadata:
    COMPRESSION=JPEG2000
Band 2 Block=1024x1024 Type=Byte, ColorInterp=Green
  Overviews: 14682x12928, 7341x6464, 3670x3232, 1835x1616, 917x808
  Overviews: arbitrary
  Image Structure Metadata:
    COMPRESSION=JPEG2000
Band 3 Block=1024x1024 Type=Byte, ColorInterp=Blue
  Overviews: 14682x12928, 7341x6464, 3670x3232, 1835x1616, 917x808
  Overviews: arbitrary
  Image Structure Metadata:
    COMPRESSION=JPEG2000

root@2730a320bc09:/usr/local# gdalinfo -stats /data/850011.jp2
Driver: JP2OpenJPEG/JPEG-2000 driver based on OpenJPEG library
Files: /data/850011.jp2
Size is 29364, 25856
Coordinate System is:
GEOGCS["WGS 84",
    DATUM["WGS_1984",
    SPHEROID["WGS 84",6378137,298.257223563,
    AUTHORITY["EPSG","7030"]],
    AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0],
    UNIT["degree",0.0174532925199433],
    AUTHORITY["EPSG","4326"]]
Origin = (50.811767578125000,18.406654712112974)
Pixel Size = (0.20390753690,-0.20390753690)
Image Structure Metadata:
  INTERLEAVE=PIXEL
Corner Coordinates:
Upper Left  (  50.8117676,  18.4066547) ( 50d48'42.36"E, 18d24'23.96"N)
Lower Left  (  50.8117676,  17.8794314) ( 50d48'42.36"E, 17d52'45.95"N)
Upper Right (  51.4105217,  18.4066547) ( 51d24'37.88"E, 18d24'23.96"N)
Lower Right (  51.4105217,  17.8794314) ( 51d24'37.88"E, 17d52'45.95"N)
Center  (  51.446,  18.1430430) ( 51d 6'40.12"E, 18d 8'34.95"N)
Band 1 Block=1024x1024 Type=Byte, ColorInterp=Red
ERROR 1: psImage->comps[0].data == nullptr
ERROR 1: /data/850011.jp2, band 1: IReadBlock failed at X offset 0, Y 
offset 0: psImage->comps[0].data == nullptr

  Overviews: 14682x12928, 7341x6464, 3670x3232, 1835x1616, 917x808
  Overviews: arbitrary
  Image Structure Metadata:
    COMPRESSION=JPEG2000
Band 2 Block=1024x1024 Type=Byte, ColorInterp=Green
ERROR 1: psImage->comps[0].data == nullptr
ERROR 1: /data/850011.jp2, band 2: IReadBlock failed at X offset 0, Y 
offset 0: psImage->comps[0].data == nullptr

  Overviews: 14682x12928, 7341x6464, 3670x3232, 1835x1616, 917x808
  Overviews: arbitrary
  Image Structure Metadata:
    COMPRESSION=JPEG2000
Band 3 Block=1024x1024 Type=Byte, ColorInterp=Blue
ERROR 1: psImage->comps[0].data == nullptr
ERROR 1: /data/850011.jp2, band 3: IReadBlock failed at X offset 0, Y 
offset 0: psImage->comps[0].data == nullptr

  Overviews: 14682x12928, 7341x6464, 3670x3232, 1835x1616, 917x808
  Overviews: arbitrary
  Image Structure Metadata:
    COMPRESSION=JPEG2000

root@2730a320bc09:/usr/local# gdal_translate -of GTiff /data/850011.jp2 
/data/850011.tif

Input file size is 29364, 25856
0ERROR 1: psImage->comps[0].data == nullptr
ERROR 1: /data/850011.jp2, band 1: IReadBlock failed at X offset 0, Y 
offset 0: psImage->comps[0].data == nullptr


root@2730a320bc09:/usr/local#



[gdal-dev] First attempt to use raster - need help

2018-01-13 Thread Stephen Woodbridge

Hi all,

I'm trying to use the raster functions for the first time. I have a 
simple black and white image with some polygons drawn on it, and I want 
to extract the polygons as geometry.

$ gdalinfo -hist 12322240_lines.png
Driver: PNG/Portable Network Graphics
Files: 12322240_lines.png
   12322240_lines.png.aux.xml
Size is 600, 800
Coordinate System is `'
Corner Coordinates:
Upper Left  (    0.0,    0.0)
Lower Left  (    0.0,  800.0)
Upper Right (  600.0,    0.0)
Lower Right (  600.0,  800.0)
Center  (  300.0,  400.0)
Band 1 Block=600x1 Type=Byte, ColorInterp=Gray
  256 buckets from -0.5 to 255.5:
  474137 5863 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 
0 0 0 0 0 0 0 0 0

  Image Structure Metadata:
    NBITS=1

The background is the 474137 bucket and the line work is the 5863 bucket.

$ raster2pgsql -t auto -P -c -f img -F -n file -I -M *.png | 
/usr/lib/postgresql/9.5/bin/psql -U postgres -h localhost -p 5435 
test_sketch


I'm loading 2 PNG images in this example, but ultimately I will have 
100s to 1000s of images to process and extract the line work from.


SELECT (md).*, (bmd).*
 FROM (SELECT ST_Metadata(img) AS md,
  ST_BandMetadata(img) AS bmd
   FROM "12322240_lines" LIMIT 1
  ) foo;

-- 0;0;30;32;1;-1;0;0;0;1;"8BUI";;f;""

select rid, val, st_astext(geom) as wkt
from (
select rid, dp.*
from "12322240_lines", lateral st_dumpaspolygons(img, 1, true) as dp
) as foo;

This generates 1349 rows, so it appears to be processing these on a 
tile-by-tile basis, not on the whole image. How do I get the polygons that 
are represented by the sketch? Assuming that the sketch lines are 1 or 2 
pixels wide, I presume that I would get the polygon around those pixels. 
Do I need to then skeletonize that to get the original lines?
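If the per-tile dump is the issue, one approach (a sketch, assuming the table and column names from the commands above, and that the tiles are grid-aligned so the raster ST_Union aggregate can merge them) is to union the tiles back into one raster before dumping:

```shell
psql -U postgres -h localhost -p 5435 test_sketch <<'SQL'
-- Merge all tiles into a single raster, then dump polygons once,
-- so shapes that cross tile boundaries come out as single geometries.
SELECT (dp).val, ST_AsText((dp).geom) AS wkt
FROM (
  SELECT ST_DumpAsPolygons(ST_Union(img), 1, true) AS dp
  FROM "12322240_lines"
) AS foo;
SQL
```

Depending on image size this can be memory-hungry, since the whole mosaic is assembled in one raster before polygonizing.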


select * from "12322240_lines";

generates 100 rows.


Any thoughts on how to best approach this would be appreciated.
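One option outside the database entirely (a sketch; the output file and layer names are made up for illustration) is GDAL's own polygonizer, which traces connected pixel regions of the source image directly:

```shell
# Trace connected regions of band 1 into polygons; using the image as its
# own mask skips the zero-valued background bucket.
gdal_polygonize.py -mask 12322240_lines.png 12322240_lines.png \
    -f GPKG lines.gpkg lines_layer value
```

Either way the result is outlines around the drawn pixels, so a skeletonization step would still be needed to recover centerlines.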

Thanks,

  -Steve



Re: [gdal-dev] ogr2ogr PROMOTE_TO_MULTI with linestrings?

2017-05-16 Thread Stephen Woodbridge
I'm just trying to load the census roads data. I think what happened is 
that when the table was created using -nlt PROMOTE_TO_MULTI, it was 
created as polygon geometry, and then the subsequent insert failed. I 
changed the script to -nlt MULTILINESTRING and it loaded all the data. I 
found a post from Even circa 2012, IIRC, that indicated PROMOTE_TO_MULTI 
should work for both polygons and linestrings.


I just checked the log file and the table was created with:

ogr2ogr -t_srs EPSG:4326 -nln census.roads -nlt PROMOTE_TO_MULTI -f 
PostgreSQL -overwrite -lco OVERWRITE=YES -lco PRECISION=NO -lco 
GEOMETRY_NAME=geom -lco FID=gid "PG:dbname=buildings host=localhost 
port=5432 user=postgres active_schema=census" 
/u/ror/buildings/data/census/tmp-11/tl_2016_us_county.shp


and then failed on the first insert:

ogr2ogr -t_srs EPSG:4326 -nln census.roads -nlt PROMOTE_TO_MULTI -f 
PostgreSQL -append "PG:dbname=buildings host=localhost port=5432 
user=postgres active_schema=census" 
/u/ror/buildings/data/census/tmp-11/tl_2016_06001_roads.shp
Warning 1: Geometry to be inserted is of type Multi Line String, whereas 
the layer geometry type is Multi Polygon.

Insertion is likely to fail
ERROR 1: ERROR:  Geometry type (MultiLineString) does not match column 
type (MultiPolygon)


ERROR 1: INSERT command for new feature failed.

When I changed, PROMOTE_TO_MULTI to MULTILINESTRING, everything worked 
fine. I only loaded one state (CA), but I do know for a fact that some 
of the Census roads data has a mix of Linestring and MultiLinestring 
features, which is why I used PROMOTE_TO_MULTI in the first place.


-Steve

On 5/16/2017 2:58 AM, jratike80 wrote:

Stephen Woodbridge wrote

On 5/15/2017 6:15 PM, Stephen Woodbridge wrote:

Hi,

I'm trying to load both polygons and linestrings and would like
PROMOTE_TO_MULTI to work for both, but it appears to define Multipolygon
type and does not work for linestrings/multilinestrings.

Maybe it would make more sense to have:

PROMOTE_TO_MULTI - work for either polygon or linestring depending on
what the first object is

or have two options like:

PROMOTE_POLY_TO_MULTI
PROMOTE_LINE_TO_MULTI

Or maybe, I'm totally missing something in the docs.


Sorry, forgot to add:

ogrinfo --version
GDAL 2.1.0, released 2016/04/25

ogr2ogr -t_srs EPSG:4326 -nln census.roads -nlt PROMOTE_TO_MULTI -f
PostgreSQL -append PG:dbname=buildings host=localhost port=5432
user=postgres active_schema=census
/u/ror/buildings/data/census/tmp-11/tl_2016_06061_roads.shp
Warning 1: Geometry to be inserted is of type Multi Line String, whereas
the layer geometry type is Multi Polygon.
Insertion is likely to fail
ERROR 1: ERROR:  Geometry type (MultiLineString) does not match column
type (MultiPolygon)


Hi,

Please give more details about what you plan to do. A shapefile can't contain
a mixture of lines and polygons, so the error you get must mean that the PostGIS
table "census.roads" already exists and has been created as MultiPolygon.
Did you convert some polygons from another shapefile into the same table
before? Anyway, the restriction is at this stage set by PostGIS, which
requires multipolygons, and you just can't convert your lines into
multipolygons.

Drop your table and start again using -nlt GEOMETRY and you should be
fine. That makes the PostGIS table accept any kind of geometry. If you
definitely need a mixture of MULTIpolygons and MULTIlinestrings we must add
casts somewhere, but perhaps that is not what you really want.
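Concretely (a sketch based on the table-creation command quoted earlier in this thread), that would be:

```shell
# Create the table with a generic geometry column so that both
# MULTIPOLYGON and MULTILINESTRING features can be appended later
ogr2ogr -t_srs EPSG:4326 -nln census.roads -nlt GEOMETRY \
    -f PostgreSQL -overwrite -lco OVERWRITE=YES -lco PRECISION=NO \
    -lco GEOMETRY_NAME=geom -lco FID=gid \
    "PG:dbname=buildings host=localhost port=5432 user=postgres active_schema=census" \
    tl_2016_us_county.shp
```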

-Jukka Rahkonen-











Re: [gdal-dev] ogr2ogr PROMOTE_TO_MULTI with linestrings?

2017-05-15 Thread Stephen Woodbridge

On 5/15/2017 6:15 PM, Stephen Woodbridge wrote:

Hi,

I'm trying to load both polygons and linestrings and would like 
PROMOTE_TO_MULTI to work for both, but it appears to define Multipolygon 
type and does not work for linestrings/multilinestrings.


Maybe it would make more sense to have:

PROMOTE_TO_MULTI - work for either polygon or linestring depending on 
what the first object is


or have two options like:

PROMOTE_POLY_TO_MULTI
PROMOTE_LINE_TO_MULTI

Or maybe, I'm totally missing something in the docs.


Sorry, forgot to add:

ogrinfo --version
GDAL 2.1.0, released 2016/04/25

ogr2ogr -t_srs EPSG:4326 -nln census.roads -nlt PROMOTE_TO_MULTI -f 
PostgreSQL -append PG:dbname=buildings host=localhost port=5432 
user=postgres active_schema=census 
/u/ror/buildings/data/census/tmp-11/tl_2016_06061_roads.shp
Warning 1: Geometry to be inserted is of type Multi Line String, whereas 
the layer geometry type is Multi Polygon.

Insertion is likely to fail
ERROR 1: ERROR:  Geometry type (MultiLineString) does not match column 
type (MultiPolygon)




[gdal-dev] ogr2ogr PROMOTE_TO_MULTI with linestrings?

2017-05-15 Thread Stephen Woodbridge

Hi,

I'm trying to load both polygons and linestrings and would like 
PROMOTE_TO_MULTI to work for both, but it appears to define Multipolygon 
type and does not work for linestrings/multilinestrings.


Maybe it would make more sense to have:

PROMOTE_TO_MULTI - work for either polygon or linestring depending on 
what the first object is


or have two options like:

PROMOTE_POLY_TO_MULTI
PROMOTE_LINE_TO_MULTI

Or maybe, I'm totally missing something in the docs.

-Steve


Re: [gdal-dev] A couple of new issues with gdal

2017-04-14 Thread Stephen Woodbridge

Even,

http://imaptools.com:8080/dl/sew-tmp.tbz

See the README.txt file for an explanation. There is a run.log that 
shows what gdal commands were run and various intermediate tif/vrt 
files. I chopped out a 900x900 image from a full DOQQ, which is also 
included.


For some context, I'm building a workflow that will be able to fetch 
DOQQs, prep them, segmentize them, and then either train or search for 
objects using scikit-learn machine learning. This particular part of the 
workflow is in the prep stage. A lot of the workflow is being built 
around the ability to manage large datasets of imagery. For example, 
California has something like 10-11,000 DOQQs.

Anyway, let me know if you need more info.

Thanks,
  -Steve

On 4/14/2017 2:06 PM, Even Rouault wrote:

On vendredi 14 avril 2017 13:39:12 CEST Stephen Woodbridge wrote:

> Hi Even,
>
> GDAL 2.1.0, released 2016/04/25
>
> I'm pretty happy with the new workflow so far, but have the following
> minor issues.
>
> The color interp is still problematic with gdalinfo, even though the
> *.aux.xml file looks correct, except it is missing the Alpha band. I
> generated the final tif using a VRT that defined the color interp.
>
> The second issue is that I'm getting a lot of:
>   Warning 1: JPEGLib:Premature end of JPEG file
> messages when I run gdaladdo

Can you provide a fully reproducible (ie all input files and scripts)
procedure so that others can try easily? Ideally with smallish files, etc.

> When I try to display it with imagemagick it complains that it is
> PhotometricInterpretation=4, (ie: a Transparency Mask) which seems to
> make sense since I asked for a mask.
>
> -Steve
>
> [the rest of the quoted message, repeating the processing steps and
> gdalinfo output, is trimmed; the full text appears in the original
> post below]

[gdal-dev] A couple of new issues with gdal

2017-04-14 Thread Stephen Woodbridge

Hi Even,

GDAL 2.1.0, released 2016/04/25

I'm pretty happy with the new workflow so far, but have the following 
minor issues.


The color interp is still problematic with gdalinfo, even though the 
*.aux.xml file looks correct, except it is missing the Alpha band. I 
generated the final tif using a VRT that defined the color interp.


The second issue is that I'm getting a lot of:
   Warning 1: JPEGLib:Premature end of JPEG file
messages when I run gdaladdo
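As a side note on the JPEG warnings: when overviews are built for a JPEG-compressed TIFF, they should normally be created with matching compression settings, e.g. (a sketch; these are GDAL's standard overview config options, and the overview levels are illustrative):

```shell
gdaladdo -r average \
    --config COMPRESS_OVERVIEW JPEG \
    --config JPEG_QUALITY_OVERVIEW 90 \
    --config PHOTOMETRIC_OVERVIEW YCBCR \
    --config INTERLEAVE_OVERVIEW PIXEL \
    dest/m_3311814_nw_11_1_20140513.tif 2 4 8 16
```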

When I try to display it with imagemagick it complains that it is 
PhotometricInterpretation=4, (ie: a Transparency Mask) which seems to 
make sense since I asked for a mask.


-Steve

I've reduced my processing to the following steps:

1. create a sobel image from the source tif with the same georeferencing
2. gdalwarp it to EPSG:4326 in tempfile2

3. gdalwarp source tif to EPSG:4326 in tempfile1
gdalwarp -t_srs EPSG:4326 -dstalpha -co TILED=YES \
src/33118/m_3311814_nw_11_1_20140513.tif \
tmp/116-1-m_3311814_nw_11_1_20140513.tif

4. generate a vrt with
# a band = [filename, src_band_no, color_interp]
bands = [ [tempfile1, 1, 'Red'],
  [tempfile1, 2, 'Green'],
  [tempfile1, 3, 'Blue'],
  [tempfile1, 4, 'Gray'],   # IR band
  [tempfile2, 1, 'Gray'],   # Sobel band to add
  [tempfile1, 5, 'Alpha'] ] # Alpha band

and each VRTRasterBand looks like:
'''
  
{2}

{3}
{4:d}



  '''

5. translate vrt to tiff and compress it
gdal_translate -co TILED=YES -co JPEG_QUALITY=90 -co COMPRESS=JPEG -co 
INTERLEAVE=BAND -mask 6 --config GDAL_TIFF_INTERNAL_MASK YES \

tmp/116-2-m_3311814_nw_11_1_20140513.vrt \
dest/m_3311814_nw_11_1_20140513.tif

$ gdalinfo dest/m_3311814_nw_11_1_20140513.tif
Driver: GTiff/GeoTIFF
Files: dest/m_3311805_ne_11_1_20140513.tif
   dest/m_3311805_ne_11_1_20140513.tif.aux.xml
Size is 7232, 7056
Coordinate System is:
GEOGCS["WGS 84",
DATUM["WGS_1984",
SPHEROID["WGS 84",6378137,298.257223563,
AUTHORITY["EPSG","7030"]],
AUTHORITY["EPSG","6326"]],
PRIMEM["Greenwich",0],
UNIT["degree",0.0174532925199433],
AUTHORITY["EPSG","4326"]]
Origin = (-118.441851318576212,34.003461706049677)
Pixel Size = (0.09839810447,-0.09839810447)
Metadata:
  AREA_OR_POINT=Area
Image Structure Metadata:
  COMPRESSION=JPEG
  INTERLEAVE=BAND
Corner Coordinates:
Upper Left  (-118.4418513,  34.0034617) (118d26'30.66"W, 34d 0'12.46"N)
Lower Left  (-118.4418513,  33.9340320) (118d26'30.66"W, 33d56' 2.52"N)
Upper Right (-118.3706898,  34.0034617) (118d22'14.48"W, 34d 0'12.46"N)
Lower Right (-118.3706898,  33.9340320) (118d22'14.48"W, 33d56' 2.52"N)
Center  (-118.4062706,  33.9687469) (118d24'22.57"W, 33d58' 7.49"N)
Band 1 Block=256x256 Type=Byte, ColorInterp=Gray
  Overviews: 3616x3528, 1808x1764, 904x882, 452x441, 226x221, 113x111, 
57x56

  Mask Flags: PER_DATASET
  Overviews of mask band: 3616x3528, 1808x1764, 904x882, 452x441, 
226x221, 113x111, 57x56

Band 2 Block=256x256 Type=Byte, ColorInterp=Undefined
  Overviews: 3616x3528, 1808x1764, 904x882, 452x441, 226x221, 113x111, 
57x56

  Mask Flags: PER_DATASET
  Overviews of mask band: 3616x3528, 1808x1764, 904x882, 452x441, 
226x221, 113x111, 57x56

Band 3 Block=256x256 Type=Byte, ColorInterp=Undefined
  Overviews: 3616x3528, 1808x1764, 904x882, 452x441, 226x221, 113x111, 
57x56

  Mask Flags: PER_DATASET
  Overviews of mask band: 3616x3528, 1808x1764, 904x882, 452x441, 
226x221, 113x111, 57x56

Band 4 Block=256x256 Type=Byte, ColorInterp=Undefined
  Overviews: 3616x3528, 1808x1764, 904x882, 452x441, 226x221, 113x111, 
57x56

  Mask Flags: PER_DATASET
  Overviews of mask band: 3616x3528, 1808x1764, 904x882, 452x441, 
226x221, 113x111, 57x56

Band 5 Block=256x256 Type=Byte, ColorInterp=Undefined
  Overviews: 3616x3528, 1808x1764, 904x882, 452x441, 226x221, 113x111, 
57x56

  Mask Flags: PER_DATASET
  Overviews of mask band: 3616x3528, 1808x1764, 904x882, 452x441, 
226x221, 113x111, 57x56

Band 6 Block=256x256 Type=Byte, ColorInterp=Alpha
  Overviews: 3616x3528, 1808x1764, 904x882, 452x441, 226x221, 113x111, 
57x56

  Mask Flags: PER_DATASET
  Overviews of mask band: 3616x3528, 1808x1764, 904x882, 452x441, 
226x221, 113x111, 57x56



dest/m_3311805_ne_11_1_20140513.tif.aux.xml

<PAMDataset>
  <PAMRasterBand band="1">
    <ColorInterp>Red</ColorInterp>
  </PAMRasterBand>
  <PAMRasterBand band="2">
    <ColorInterp>Green</ColorInterp>
  </PAMRasterBand>
  <PAMRasterBand band="3">
    <ColorInterp>Blue</ColorInterp>
  </PAMRasterBand>
  <PAMRasterBand band="4">
    <ColorInterp>Gray</ColorInterp>
  </PAMRasterBand>
  <PAMRasterBand band="5">
    <ColorInterp>Gray</ColorInterp>
  </PAMRasterBand>
</PAMDataset>


6. add overviews to tiff
gdaladdo -clean -r average dest/33118/m_3311814_nw_11_1_20140513.tif
  2 4 8 16 32 64 128

I get lots of Warnings:
Warning 1: JPEGLib:Premature end of JPEG file

---
This email has been checked for viruses by Avast antivirus software.
https://www.avast.com/antivirus

___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
https://lists.osgeo.org/mailman/listinfo/gdal-dev

Re: [gdal-dev] What is the best strategy for handling multiple banded tiffs

2017-04-13 Thread Stephen Woodbridge

Hi Even,

Thanks that helped a lot.

Here are the results I'm getting now in case they help you with color 
interp fixes. I'm not sure about the *.aux.xml files that get created, 
but they do not seem to always represent all the bands that gdalinfo 
reports. In particular, the alpha band does not show up in them, but that 
may be by design, similar to the fact that Undefined bands also do not 
show up.


Anyway, running through all these variations in options is helping me 
get a better handle on how things work.


Your help is much appreciated and has gotten me over the immediate 
hurdles that were blocking me.


Thank you again,
  -Steve


gdalwarp -t_srs EPSG:4326 -dstalpha  -co TILED=YES 
m_3311805_ne_11_1_20140513.tif epsg4326.tif


> Band 1 Block=256x256 Type=Byte, ColorInterp=Gray
> Band 2 Block=256x256 Type=Byte, ColorInterp=Undefined
> Band 3 Block=256x256 Type=Byte, ColorInterp=Undefined
> Band 4 Block=256x256 Type=Byte, ColorInterp=Undefined
> Band 5 Block=256x256 Type=Byte, ColorInterp=Alpha

I created epsg4326.tif.aux.xml like:

<PAMDataset>
  <PAMRasterBand band="1">
    <ColorInterp>Red</ColorInterp>
  </PAMRasterBand>
  <PAMRasterBand band="2">
    <ColorInterp>Green</ColorInterp>
  </PAMRasterBand>
  <PAMRasterBand band="3">
    <ColorInterp>Blue</ColorInterp>
  </PAMRasterBand>
  <PAMRasterBand band="4">
    <ColorInterp>Gray</ColorInterp>
  </PAMRasterBand>
  <PAMRasterBand band="5">
    <ColorInterp>Alpha</ColorInterp>
  </PAMRasterBand>
</PAMDataset>

and now the bands are defined as expected (the gray band is the IR):

Band 1 Block=256x256 Type=Byte, ColorInterp=Red
Band 2 Block=256x256 Type=Byte, ColorInterp=Green
Band 3 Block=256x256 Type=Byte, ColorInterp=Blue
Band 4 Block=256x256 Type=Byte, ColorInterp=Gray
Band 5 Block=256x256 Type=Byte, ColorInterp=Alpha

gdal_translate -co TILED=YES -co JPEG_QUALITY=90 -co COMPRESS=JPEG -b 1 
-b 2 -b 3 -b 4 -mask 5 epsg4326.tif out.tif


Files: out.tif
   out.tif.msk
   out.tif.aux.xml

Band 1 Block=256x256 Type=Byte, ColorInterp=Red
  Mask Flags: PER_DATASET
Band 2 Block=256x256 Type=Byte, ColorInterp=Green
  Mask Flags: PER_DATASET
Band 3 Block=256x256 Type=Byte, ColorInterp=Blue
  Mask Flags: PER_DATASET
Band 4 Block=256x256 Type=Byte, ColorInterp=Gray
  Mask Flags: PER_DATASET

gdal_translate -co TILED=YES -co JPEG_QUALITY=90 -co COMPRESS=JPEG -co 
INTERLEAVE=BAND epsg4326.tif out2.tif


Files: out2.tif
   out2.tif.aux.xml

Band 1 Block=256x256 Type=Byte, ColorInterp=Red
Band 2 Block=256x256 Type=Byte, ColorInterp=Green
Band 3 Block=256x256 Type=Byte, ColorInterp=Blue
Band 4 Block=256x256 Type=Byte, ColorInterp=Gray
Band 5 Block=256x256 Type=Byte, ColorInterp=Alpha

with .aux.xml like:

<PAMDataset>
  <PAMRasterBand band="1">
    <ColorInterp>Red</ColorInterp>
  </PAMRasterBand>
  <PAMRasterBand band="2">
    <ColorInterp>Green</ColorInterp>
  </PAMRasterBand>
  <PAMRasterBand band="3">
    <ColorInterp>Blue</ColorInterp>
  </PAMRasterBand>
  <PAMRasterBand band="4">
    <ColorInterp>Gray</ColorInterp>
  </PAMRasterBand>
</PAMDataset>

gdal_translate -co TILED=YES -co JPEG_QUALITY=90 -co COMPRESS=JPEG -co 
INTERLEAVE=BAND --config GDAL_TIFF_INTERNAL_MASK YES epsg4326.tif out3.tif


Files: out3.tif
   out3.tif.aux.xml

Band 1 Block=256x256 Type=Byte, ColorInterp=Red
Band 2 Block=256x256 Type=Byte, ColorInterp=Green
Band 3 Block=256x256 Type=Byte, ColorInterp=Blue
Band 4 Block=256x256 Type=Byte, ColorInterp=Gray
Band 5 Block=256x256 Type=Byte, ColorInterp=Alpha

And the .aux.xml is the same as above.

gdal_translate -co TILED=YES -co JPEG_QUALITY=90 -co COMPRESS=JPEG -co 
INTERLEAVE=BAND -mask 5 --config GDAL_TIFF_INTERNAL_MASK YES 
epsg4326.tif out4.tif


Files: out4.tif
   out4.tif.aux.xml

Band 1 Block=256x256 Type=Byte, ColorInterp=Red
  Mask Flags: PER_DATASET
Band 2 Block=256x256 Type=Byte, ColorInterp=Green
  Mask Flags: PER_DATASET
Band 3 Block=256x256 Type=Byte, ColorInterp=Blue
  Mask Flags: PER_DATASET
Band 4 Block=256x256 Type=Byte, ColorInterp=Gray
  Mask Flags: PER_DATASET
Band 5 Block=256x256 Type=Byte, ColorInterp=Alpha
  Mask Flags: PER_DATASET

And the .aux.xml is the same as above.



On 4/12/2017 5:40 PM, Stephen Woodbridge wrote:

I'm working with:

GDAL 1.10.1, released 2013/08/26 (native)
GDAL 2.1.0, released 2016/04/25  (in a docker container)

All these results are from 2.1.

I don't think I need the nearblack call as the NAIP imagery does not 
appear to have collars. So I'm removing it from the workflow for now.


The source DOQQ has these bands where band 4 is the IR:
Band 1 Block=6472x1 Type=Byte, ColorInterp=Red
Band 2 Block=6472x1 Type=Byte, ColorInterp=Green
Band 3 Block=6472x1 Type=Byte, ColorInterp=Blue
Band 4 Block=6472x1 Type=Byte, ColorInterp=Undefined

gdalwarp -t_srs EPSG:4326 -dstalpha  -co TILED=YES 
m_3311805_ne_11_1_20140513.tif epsg4326.tif


Band 1 Block=256x256 Type=Byte, ColorInterp=Gray
Band 2 Block=256x256 Type=Byte, ColorInterp=Undefined
Band 3 Block=256x256 Type=Byte, ColorInterp=Undefined
Band 4 Block=256x256 Type=Byte, ColorInterp=Undefined
Band 5 Block=256x256 Type=Byte, ColorInterp=Alpha

So this is annoying, but probably fixable: band 4 is the IR, and adding 
an *.aux.xml file setting the color interp fixes it, at least on 2.1; 
adding it on 1.10.1 does not help. It does not appear to be possible to 
create a mask band, only an alpha band, at this point.


Then compressing this:

gdal_translate -co TILED=YES -co JPEG_QUALITY=90 -co COMPRESS=JPEG -b 1 
-b 2 -b 3 -b 4 -mask 5 epsg4326.tif out.tif


Band 1 Block=256x256 Type=Byte, ColorInterp=Gray
   Mask Flags: PER_DATASET
Band 2 Block=256x256 Type=By

Re: [gdal-dev] What is the best strategy for handling multiple banded tiffs

2017-04-12 Thread Stephen Woodbridge

I'm working with:

GDAL 1.10.1, released 2013/08/26 (native)
GDAL 2.1.0, released 2016/04/25  (in a docker container)

All these results are from 2.1.

I don't think I need the nearblack call as the NAIP imagery does not 
appear to have collars. So I'm removing it from the workflow for now.


The source DOQQ has these bands where band 4 is the IR:
Band 1 Block=6472x1 Type=Byte, ColorInterp=Red
Band 2 Block=6472x1 Type=Byte, ColorInterp=Green
Band 3 Block=6472x1 Type=Byte, ColorInterp=Blue
Band 4 Block=6472x1 Type=Byte, ColorInterp=Undefined

gdalwarp -t_srs EPSG:4326 -dstalpha  -co TILED=YES 
m_3311805_ne_11_1_20140513.tif epsg4326.tif


Band 1 Block=256x256 Type=Byte, ColorInterp=Gray
Band 2 Block=256x256 Type=Byte, ColorInterp=Undefined
Band 3 Block=256x256 Type=Byte, ColorInterp=Undefined
Band 4 Block=256x256 Type=Byte, ColorInterp=Undefined
Band 5 Block=256x256 Type=Byte, ColorInterp=Alpha

So this is annoying, but probably fixable: band 4 is the IR, and adding 
an *.aux.xml file setting the color interp fixes it, at least on 2.1; 
adding it on 1.10.1 does not help. It does not appear to be possible to 
create a mask band, only an alpha band, at this point.


Then compressing this:

gdal_translate -co TILED=YES -co JPEG_QUALITY=90 -co COMPRESS=JPEG -b 1 
-b 2 -b 3 -b 4 -mask 5 epsg4326.tif out.tif


Band 1 Block=256x256 Type=Byte, ColorInterp=Gray
  Mask Flags: PER_DATASET
Band 2 Block=256x256 Type=Byte, ColorInterp=Green
  Mask Flags: PER_DATASET
Band 3 Block=256x256 Type=Byte, ColorInterp=Blue
  Mask Flags: PER_DATASET
Band 4 Block=256x256 Type=Byte, ColorInterp=Alpha
  Mask Flags: PER_DATASET

It dropped the source band 4 (IR)

gdal_translate -co TILED=YES -co JPEG_QUALITY=90 -co COMPRESS=JPEG 
epsg4326.tif out2.tif


Input file size is 7232, 7056
0...10...20...30...40...50...60...70...80...90...100 - done.
ERROR 1: JPEGLib:Too many color components: 5, max 4
ERROR 1: WriteEncodedTile/Strip() failed.
ERROR 1: JPEGLib:Too many color components: 5, max 4
ERROR 1: WriteEncodedTile/Strip() failed.
ERROR 1: JPEGLib:Too many color components: 5, max 4
ERROR 1: WriteEncodedTile/Strip() failed.

So is it possible to compress multiple bands using jpeg compression 
(other than RGB, RGBA, or RGB plus mask)?


So, if I want to end up with a single tif file with bands (R, G, B, 
Alpha|mask, IR, sobel) that is jpeg compressed, is this possible?


What is the workflow to do that?

Or is the best I can do to create a separate RGBA file, an IR file, and 
a sobel file and bring them together via a VRT file?


Thanks,
  -Steve


On 4/12/2017 2:01 PM, Even Rouault wrote:

On mercredi 12 avril 2017 13:23:06 CEST Stephen Woodbridge wrote:

 > Hi all,

 >

 > I'm reworking my code dealing with NAIP imagery that has R, G, B, IR
 > bands, and I generate a mask band when reprojecting it. I also have the
 > option to generate another computed band based on a sobel operator.
 >
 > In the past, I separated them R, G, B, mask into jpeg ycbcr compressed
 > tiff and had separate files for the IR, and the sobel data and used a
 > VRT file to pull all these together into a single image for additional
 > processing.
 >
 > Ideally, I would like to have one file, rather than 4 files (RGB, IR,
 > sobel, and the VRT) because over large areas the management of all these
 > files is a pain. Also when displaying and working with large areas I
 > need to mosaic the images into a seamless area using a vrt file or
 > tileindex.
 >
 > I need a way to display the various bands, and mapserver is my tool of
 > choice, but I probably need to be able to display them via qgis also,
 > which I have no experience with, but reading older posts, I might need
 > to create a vrt file. In mapserver I can select bands using PROCESSING
 > options on layers.
 >
 > Questions:
 >
 > 1. Does it make sense to try and do this with one tiff with 6 bands?
 > 2. What would be the best workflow for doing this? I'm thinking
 > something like:
 >
 > # remove any collar
 > nearblack -co TILED=YES -of GTiff -nb 0 -near 0 -setmask -q -o temp1.tif
 > doqq.tif
 >
 > # warp to EPSG:4326 (does -dstalpha convert mask band to an alpha band?)

Yes

 > gdalwarp -t_srs EPSG:4326 -dstalpha -co TILED=YES temp1.tif rgba.tif
 >
 > # make sobel gray scale single band image
 > makesobel.py rgba.tif sobel.tif
 >
 > # create a temp.vrt file with bands R, G, B, Alpha|mask, IR, sobel
 >
 > # and finally build a tif using the temp.vrt
 > # I'm not sure what is the best way to build the final tif given the mask
 > # and whether I should use gdalwarp or gdal_translate for this step
 > gdal_translate -co TILED=YES -co JPEG_QUALITY=90 -co COMPRESS=JPEG
 > --CONFIG GDAL_TIFF_INTERNAL_MASK YES temp.vrt final.tif

--> This will only work as intended if there's a mask band identified as 
a mask and not an alpha band in the VRT.


 >
 > # and build overviews
 > 

[gdal-dev] What is the best strategy for handling multiple banded tiffs

2017-04-12 Thread Stephen Woodbridge

Hi all,

I'm reworking my code dealing with NAIP imagery that has R, G, B, IR 
bands, and I generate a mask band when reprojecting it. I also have the 
option to generate another computed band based on a sobel operator.


In the past, I separated them R, G, B, mask into jpeg ycbcr compressed 
tiff and had separate files for the IR, and the sobel data and used a 
VRT file to pull all these together into a single image for additional 
processing.


Ideally, I would like to have one file, rather than 4 files (RGB, IR, 
sobel, and the VRT) because over large areas the management of all these 
files is a pain. Also when displaying and working with large areas I 
need to mosaic the images into a seamless area using a vrt file or 
tileindex.


I need a way to display the various bands, and mapserver is my tool of 
choice, but I probably need to be able to display them via qgis also, 
which I have no experience with, but reading older posts, I might need 
to create a vrt file. In mapserver I can select bands using PROCESSING 
options on layers.


Questions:

1. Does it make sense to try and do this with one tiff with 6 bands?
2. What would be the best workflow for doing this? I'm thinking 
something like:


# remove any collar
nearblack -co TILED=YES -of GTiff -nb 0 -near 0 -setmask -q -o temp1.tif 
doqq.tif


# warp to EPSG:4326 (does -dstalpha convert mask band to an alpha band?)
gdalwarp -t_srs EPSG:4326 -dstalpha -co TILED=YES temp1.tif rgba.tif

# make sobel gray scale single band image
makesobel.py rgba.tif sobel.tif

# create a temp.vrt file with bands R, G, B, Alpha|mask, IR, sobel

# and finally build a tif using the temp.vrt
# I'm not sure what is the best way to build the final tif given the mask
# and whether I should use gdalwarp or gdal_translate for this step
gdal_translate -co TILED=YES -co JPEG_QUALITY=90 -co COMPRESS=JPEG 
--CONFIG GDAL_TIFF_INTERNAL_MASK YES temp.vrt final.tif


# and build overviews
gdaladdo -clean -r average final.tif 2 4 8 16 32 64 128

I'm not clear on how the mask band or alpha channel (band?) interacts 
with the various commands at each step in the workflow.


Thanks,
  -Steve


Re: [gdal-dev] ogr2ogr intersect very slow at command line with large sqlite input

2017-04-10 Thread Stephen Woodbridge

Chris,

I'm not suggesting that you change your workflow yet, only that you try 
some experiments. I've worked with both Spatialite and postgis, but I 
have much more experience with postgis.


While they both have many of the same functions, the interactions with 
the indexes are much more automatic and intuitive with postgis. I was 
recently surprised to learn that st_intersects() is faster than 
st_dwithin() in postgis. I can't say if the same is true for spatialite 
because of the indexing differences that you have already run into.


Regarding your roads data, OSM stores everything as two-noded straight 
line segments, but many import tools chain them together into 
multi-point line segments, which makes the bbox for the linestring 
larger, which causes it to intersect with more other linestring bboxes, 
so more computational tests.


If you are only looking for intersections at line ends, and only 
intersections at line ends where the roads have different names, then 
you could try a different approach:

Create a table of line end points; you could snap the floating point xy 
values to a grid.


create table nodes (nid serial, geom geometry);
insert into nodes (geom)
select distinct geom from (
  select st_snaptogrid(st_startpoint(geom), 0.01) as geom from table
  union all
  select st_snaptogrid(st_endpoint(geom), 0.01) as geom from table
) as pts;

Then create a table assigning nid values to the lines

create table node2line (gid integer, nid integer, wend integer);

-- insert the start nodes into table
insert into node2line
select a.gid, b.nid, 0
  from table a, nodes b
 where st_snaptogrid(st_startpoint(a.geom), 0.01)=b.geom;

-- insert the end nodes into table
insert into node2line
select a.gid, b.nid, 1
  from table a, nodes b
 where st_snaptogrid(st_endpoint(a.geom), 0.01)=b.geom;

-- get the intersections from the node ids
select a.nid, a.gid, a.wend, b.gid, b.wend
  from node2line a, node2line b
 where a.nid=b.nid;

You will probably want to add some more filtering to eliminate dups, and 
maybe another join to sort out street names. But this can be done mostly 
with simple btree indexes.

Also, because you snap to a grid, you can make the geom column text and 
populate it with st_astext().


I do something similar in postgis in my geocoder to identify intersections.
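The endpoint-snapping idea can be sketched in plain Python (a toy version of the SQL above, not the PostGIS implementation):

```python
from collections import defaultdict

def snap(pt, grid=0.01):
    """Snap an (x, y) point to a regular grid, like st_snaptogrid()."""
    return (round(pt[0] / grid) * grid, round(pt[1] / grid) * grid)

def intersections(lines, grid=0.01):
    """Map each snapped endpoint node to the (line id, which-end) pairs
    touching it; nodes shared by 2+ lines are candidate intersections."""
    nodes = defaultdict(list)
    for gid, coords in lines.items():
        nodes[snap(coords[0], grid)].append((gid, 0))   # start point
        nodes[snap(coords[-1], grid)].append((gid, 1))  # end point
    return {n: ids for n, ids in nodes.items() if len(ids) > 1}

roads = {
    1: [(0.0, 0.0), (1.0, 1.0)],
    2: [(1.001, 0.999), (2.0, 0.0)],   # starts (roughly) where road 1 ends
    3: [(5.0, 5.0), (6.0, 6.0)],       # touches nothing
}
print(intersections(roads))
```

The grid snapping is what lets near-miss endpoints land on the same node, just as in the SQL version.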

-Steve

On 4/10/2017 6:38 PM, CTL101 wrote:

Hi again Steve, The input is osm data, filtered so that only highways of some
significance remain. I am trying to get road intersections as points based
on certain criteria. I have an ogr2ogr sqlite based workflow that works on
test data. But scaling up is proving to be a problem for this particular
intersect operation, which occurs early on. Based on my source and sample
data i would say that there are possibly several million intersections
between the tables to compute. The intersection is between lines (roads),
but I am only interested in the point output where lines meet. If there is a
more efficient algorithm that i could use in ogr2ogr then i would be
grateful of advice.

I had not considered postgresql for this part of the process but would
prefer to streamline the workflow rather than create a new one if avoidable.
Might conversion of the sqlite tables to another format in ogr2ogr yield
significant performance dividends? Or do I need to resign myself to trying a
fresh approach?
Regards, Chris



--
View this message in context: 
http://osgeo-org.1560.x6.nabble.com/ogr2ogr-intersect-very-slow-at-command-line-with-large-sqlite-input-tp5316545p5316749.html
Sent from the GDAL - Dev mailing list archive at Nabble.com.





Re: [gdal-dev] ogr2ogr intersect very slow at command line with large sqlite input

2017-04-10 Thread Stephen Woodbridge

On 4/10/2017 5:51 PM, CTL101 wrote:

I've removed the intersection part after realising that I didn't need it as
the intersection geometry is already calculated in the input  (slow hand
clap me!)

So after investigating a bit more I have:
ogr2ogr -gt unlimited -nlt PROMOTE_TO_MULTI25D --config OGR_SQLITE_CACHE
10240 -f SQLite -dsco SPATIALITE=YES output.sqlite input.sqlite -nln
outputtable -dialect sqlite -sql "SELECT A.* FROM inputtable1 A, inputtable2
B WHERE ST_Intersects(A.geometry, B.geometry) AND A.ogc_fid IN (SELECT
A.ogc_fid FROM SpatialIndex WHERE f_table_name = 'inputtable1' AND
search_frame = B.geometry)"

This is now running on the large tables, and still seems really slow. But it
may actually finish this time so who knows. I have looked at the link that
Jukka suggests, and perused the forum but these seem like very complex
programming based solutions to elaborate problems. Surely there is a simple
way to get more speed out of gdal - especially now that the heavy lifting
has been removed almost altogether from my relatively simple query?


There are a lot of other factors that affect the speed, like:

how many intersections are there between tables A and B?
how many points are in each feature?

If you have postgresql installed, you could load your two tables there 
and try similar queries to get a comparison of performance.


-Steve


Re: [gdal-dev] I can't get ogr2ogr to overwrite an existing table

2017-04-08 Thread Stephen Woodbridge

On 4/8/2017 12:29 PM, Even Rouault wrote:

On vendredi 7 avril 2017 15:59:49 CEST Stephen Woodbridge wrote:

 > Hi all,

 >

 > I'm having trouble figuring out how to overwrite an existing table with

 > ogr2ogr. I have tried various combos of arguments without success.

 >

 > What am I doing wrong?

I'm not sure. Similar requests work for me with GDAL 2.1. Maybe try to 
manually drop the table (if it is a table...)


Seeing the 'CREATE TABLE "census"."roads" ( "gid" SERIAL, PRIMARY KEY 
("gid") )' statement, it makes me wonder if you are not using an older 
GDAL version (Newer versions will issue a CREATE TABLE with all columns) 
that might have trouble with overwriting existing tables and schemas.


Note for everyone: please mention the version you are using.


Sorry, I know that:

$ ogrinfo --version
GDAL 1.10.1, released 2013/08/26

on Ubuntu 14.04.5 LTS

OK, it is not as convenient, but I can drop the table rather than rely 
on -overwrite.


Thanks,
  -Steve


[gdal-dev] I can't get ogr2ogr to overwrite an existing table

2017-04-07 Thread Stephen Woodbridge

Hi all,

I'm having trouble figuring out how to overwrite an existing table with 
ogr2ogr. I have tried various combos of arguments without success.


What am I doing wrong?

ogr2ogr -t_srs EPSG:4326 -nln roads -nlt PROMOTE_TO_MULTI -f PostgreSQL 
-overwrite -lco OVERWRITE=YES -lco PRECISION=NO -lco GEOMETRY_NAME=geom 
-lco FID=gid "PG:dbname=buildings host=localhost port=5435 user=postgres 
active_schema=census" /data/census/tmp-10195/tl_2016_06013_roads.shp


ERROR 1: CREATE TABLE "census"."roads" ( "gid" SERIAL, PRIMARY KEY ("gid") )
ERROR:  relation "roads" already exists

ERROR 1: Terminating translation prematurely after failed
translation of layer tl_2016_06013_roads (use -skipfailures to skip errors)


ogr2ogr -t_srs EPSG:4326 -nln census.roads -nlt PROMOTE_TO_MULTI -f 
PostgreSQL -overwrite -lco OVERWRITE=YES -lco PRECISION=NO -lco 
GEOMETRY_NAME=geom -lco FID=gid "PG:dbname=buildings host=localhost 
port=5435 user=postgres" /data/census/tmp-10747/tl_2016_06013_roads.shp


ERROR 1: CREATE TABLE "census"."roads" ( "gid" SERIAL, PRIMARY KEY ("gid") )
ERROR:  relation "roads" already exists

ERROR 1: Terminating translation prematurely after failed
translation of layer tl_2016_06013_roads (use -skipfailures to skip errors)


Re: [gdal-dev] Is it possible to add new fields to a shapefile and populate then?

2017-04-03 Thread Stephen Woodbridge

On 4/3/2017 11:37 PM, jratike80 wrote:

Stephen Woodbridge wrote

Hi All,

I have a shapefile(s) and I want to read the features, generate some
metrics about each feature and then add them to that feature. I'm
using python, and one obvious way is to in effect create a new shapefile
with the columns I need to add, then copy the existing shapefile to the
new one, adding the additional metrics.

I'm wondering if I can add the new columns to the shapefile, then do a
read and update.

What are the pros and cons of these approaches, and how do I do the
latter one, if possible?

I looked at the python gdal/ogr cookbook and didn't find this specific
example. But I'm thinking of something along the lines of:

1. open the dataset for read/write
2. layer.CreateField() to add the new fields
3. loop through the features in the layer
4.and use feature.SetField()
5.and feature = None to commit the changes

will this work?

Thanks,
   -Steve


Hi,

You can demonstrate that it is possible with ogrinfo:

ogrinfo  -sql "alter table test add column foo integer" test.shp
ogrinfo  -dialect sqlite -sql "update test set foo=2" test.shp


Hi Jukka,

Ahh, good point, I always seem to forget that fact.
I have gotten the python code to add the new fields, but the data is not 
getting written to the file.


# open the shapefile
from osgeo import ogr
driver = ogr.GetDriverByName('ESRI Shapefile')
dataSource = driver.Open(infile, 1) # open for rw
if dataSource is None:
print "ERROR: could not open '%s' as shapefile!" % (infile)
sys.exit(1)

layer = dataSource.GetLayer()
layer.CreateField(ogr.FieldDefn("area", ogr.OFTReal))


for feature in layer:
geom = feature.GetGeometryRef()
feature.SetField("area", geom.GetArea())
feature = None

dataSource = None

In the cookbook example for "Create a New Shapefile and Add Data" there 
is a call to:


layer.CreateFeature(feature)

but I do not see an equivalent call for:

layer.UpdateFeature(feature)

So how do I update the feature and force the data to get written to the 
shapefile? I'm presuming CreateFeature will append a new record, which 
is not what I want.


-Steve


[gdal-dev] Is it possible to add new fields to a shapefile and populate then?

2017-04-03 Thread Stephen Woodbridge

Hi All,

I have a shapefile(s) and I want to read the features, generate some 
metrics about each feature and then add them to that feature. I'm 
using python, and one obvious way is to in effect create a new shapefile 
with the columns I need to add, then copy the existing shapefile to the 
new one, adding the additional metrics.


I'm wondering if I can add the new columns to the shapefile, then do a 
read and update.


What are the pros and cons of these approaches, and how do I do the latter one, if possible?

I looked at the python gdal/ogr cookbook and didn't find this specific 
example. But I'm thinking of something along the lines of:


1. open the dataset for read/write
2. layer.CreateField() to add the new fields
3. loop through the features in the layer
4.and use feature.SetField()
5.and feature = None to commit the changes

will this work?

Thanks,
  -Steve


Re: [gdal-dev] Getting black lines between doqqs

2017-03-06 Thread Stephen Woodbridge

Thank you Even! That was the trick and it works great now.

On 3/6/2017 1:35 PM, Even Rouault wrote:

On lundi 6 mars 2017 13:20:50 CET Stephen Woodbridge wrote:

Hi,

I have a collection of doqqs that look like the two below. I'm trying to
extract a small image that crosses between the two doqqs, but I'm getting
a black line between them. When I serve them via mapserver, I'm not
getting this.

gdalbuildvrt -addalpha tmp-tc-13110.vrt m_4012408_sw_10_h_20160528.tif
m_4012416_nw_10_h_20160528.tif

gdal_translate -of GTiff -outsize 1360 1360 -projwin -124.083968798
40.8779687978 -124.076031202 40.8700312022 -co TILED=YES -co
COMPRESS=JPEG -co JPEG_QUALITY=90 --config GDAL_TIFF_INTERNAL_MASK YES
tmp-tc-13110.vrt oaf2-tc.tif

I get the same result regardless of the -addalpha in the gdalbuildvrt
command.

What am I missing?

If your tiles have a collar (which I suppose they do since they have a mask),
gdalbuildvrt (actually the VRT driver) doesn't take it into account as
you might expect. It doesn't blend alpha/mask, but takes the alpha/mask
value of the last source in the VRT that intersects the window of
interest. You might have more success with

gdalwarp in1.tif in2.tif out.tif -dstalpha -te xmin ymin xmax ymax

-Steve

$ gdalinfo m_4012408_sw_10_h_20160528.tif
Driver: GTiff/GeoTIFF
Files: m_4012408_sw_10_h_20160528.tif
Size is 11140, 10847
Coordinate System is:
GEOGCS["WGS 84",
DATUM["WGS_1984",
SPHEROID["WGS 84",6378137,298.257223563,
AUTHORITY["EPSG","7030"]],
AUTHORITY["EPSG","6326"]],
PRIMEM["Greenwich",0],
UNIT["degree",0.0174532925199433],
AUTHORITY["EPSG","4326"]]
Origin = (-124.127892543854088,40.939476459321497)
Pixel Size = (0.06127013352,-0.06127013352)
Metadata:
  AREA_OR_POINT=Area
Image Structure Metadata:
  COMPRESSION=YCbCr JPEG
  INTERLEAVE=PIXEL
  SOURCE_COLOR_SPACE=YCbCr
Corner Coordinates:
Upper Left  (-124.1278925,  40.9394765) (124d 7'40.41"W, 40d56'22.12"N)
Lower Left  (-124.1278925,  40.8730167) (124d 7'40.41"W, 40d52'22.86"N)
Upper Right (-124.0596376,  40.9394765) (124d 3'34.70"W, 40d56'22.12"N)
Lower Right (-124.0596376,  40.8730167) (124d 3'34.70"W, 40d52'22.86"N)
Center      (-124.0937651,  40.9062466) (124d 5'37.55"W, 40d54'22.49"N)
Band 1 Block=256x256 Type=Byte, ColorInterp=Red
  Overviews: 5570x5424, 2785x2712, 1393x1356, 697x678, 349x339, 175x170, 88x85
  Mask Flags: PER_DATASET
  Overviews of mask band: 5570x5424, 2785x2712, 1393x1356, 697x678, 349x339, 175x170, 88x85
Band 2 Block=256x256 Type=Byte, ColorInterp=Green
  Overviews: 5570x5424, 2785x2712, 1393x1356, 697x678, 349x339, 175x170, 88x85
  Mask Flags: PER_DATASET
  Overviews of mask band: 5570x5424, 2785x2712, 1393x1356, 697x678, 349x339, 175x170, 88x85
Band 3 Block=256x256 Type=Byte, ColorInterp=Blue
  Overviews: 5570x5424, 2785x2712, 1393x1356, 697x678, 349x339, 175x170, 88x85
  Mask Flags: PER_DATASET
  Overviews of mask band: 5570x5424, 2785x2712, 1393x1356, 697x678, 349x339, 175x170, 88x85

$ gdalinfo m_4012416_nw_10_h_20160528.tif
Driver: GTiff/GeoTIFF
Files: m_4012416_nw_10_h_20160528.tif
Size is 11145, 10851
Coordinate System is:
GEOGCS["WGS 84",
DATUM["WGS_1984",
SPHEROID["WGS 84",6378137,298.257223563,
AUTHORITY["EPSG","7030"]],
AUTHORITY["EPSG","6326"]],
PRIMEM["Greenwich",0],
UNIT["degree",0.0174532925199433],
AUTHORITY["EPSG","4326"]]
Origin = (-124.127899397118711,40.876995871942221)
Pixel Size = (0.06124881549,-0.06124881549)
Metadata:
  AREA_OR_POINT=Area
Image Structure Metadata:
  COMPRESSION=YCbCr JPEG
  INTERLEAVE=PIXEL
  SOURCE_COLOR_SPACE=YCbCr
Corner Coordinates:
Upper Left  (-124.1278994,  40.8769959) (124d 7'40.44"W, 40d52'37.19"N)
Lower Left  (-124.1278994,  40.8105348) (124d 7'40.44"W, 40d48'37.93"N)
Upper Right (-124.0596376,  40.8769959) (124d 3'34.70"W, 40d52'37.19"N)
Lower Right (-124.0596376,  40.8105348) (124d 3'34.70"W, 40d48'37.93"N)
Center      (-124.0937685,  40.8437653) (124d 5'37.57"W, 40d50'37.56"N)
Band 1 Block=256x256 Type=Byte, ColorInterp=Red
  Overviews: 5573x5426, 2787x2713, 1394x1357, 697x679, 349x340, 175x170, 88x85
  Mask Flags: PER_DATASET
  Overviews of mask band: 5573x5426, 2787x2713, 1394x1357, 697x679, 349x340, 175x170, 88x85
Band 2 Block=256x256 Type=Byte, ColorInterp=Green
  Overviews: 5573x5426, 2787x2713, 1394x1357, 697x679, 349x340, 175x170, 88x85
  Mask Flags: PER_DATASET
  Overviews of mask band: 5573x5426, 2787x2713, 1394x1357, 697x679, 349x340, 175x170, 88x85
Band 3 

[gdal-dev] Getting black lines between doqqs

2017-03-06 Thread Stephen Woodbridge

Hi,

I have a collection of doqqs that look like the two below. I'm trying to 
extract a small image that crosses the boundary between the two doqqs, but 
I'm getting a black line between them. When I serve them via mapserver, 
I do not get this line.


gdalbuildvrt -addalpha tmp-tc-13110.vrt m_4012408_sw_10_h_20160528.tif 
m_4012416_nw_10_h_20160528.tif


gdal_translate -of GTiff -outsize 1360 1360 -projwin -124.083968798 
40.8779687978 -124.076031202 40.8700312022 -co TILED=YES -co 
COMPRESS=JPEG -co JPEG_QUALITY=90 --config GDAL_TIFF_INTERNAL_MASK YES 
tmp-tc-13110.vrt oaf2-tc.tif


I get the same result regardless of the -addalpha in the gdalbuildvrt 
command.


What am I missing?

-Steve
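The -projwin box in the gdal_translate command above gets converted to a pixel window using the dataset's geotransform. A stdlib-only sketch of that arithmetic (the geotransform values below are illustrative approximations of the DOQQ above, not the exact values from the files):

```python
def projwin_to_pixel_window(geotransform, ulx, uly, lrx, lry):
    """Convert a -projwin (ulx uly lrx lry) box into pixel offsets/sizes.

    geotransform is GDAL's 6-element affine transform
    (origin_x, pixel_w, 0, origin_y, 0, pixel_h), with pixel_h negative
    for a north-up image.
    """
    ox, pw, _, oy, _, ph = geotransform
    xoff = int(round((ulx - ox) / pw))
    yoff = int(round((uly - oy) / ph))   # ph < 0, so yoff comes out positive
    xsize = int(round((lrx - ulx) / pw))
    ysize = int(round((lry - uly) / ph))
    return xoff, yoff, xsize, ysize

# Assumed north-up geotransform: origin near the first DOQQ's upper-left
# corner, ~6.125e-06 degree pixels (approximate, not from the files above).
gt = (-124.128, 6.125e-06, 0.0, 40.940, 0.0, -6.125e-06)
win = projwin_to_pixel_window(gt, -124.083968798, 40.8779687978,
                              -124.076031202, 40.8700312022)
```

With these assumed numbers the -projwin box maps to a window of roughly 1296x1296 pixels, which matches the -outsize 1360 1360 request being close to native resolution.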

$ gdalinfo m_4012408_sw_10_h_20160528.tif
Driver: GTiff/GeoTIFF
Files: m_4012408_sw_10_h_20160528.tif
Size is 11140, 10847
Coordinate System is:
GEOGCS["WGS 84",
DATUM["WGS_1984",
SPHEROID["WGS 84",6378137,298.257223563,
AUTHORITY["EPSG","7030"]],
AUTHORITY["EPSG","6326"]],
PRIMEM["Greenwich",0],
UNIT["degree",0.0174532925199433],
AUTHORITY["EPSG","4326"]]
Origin = (-124.127892543854088,40.939476459321497)
Pixel Size = (0.06127013352,-0.06127013352)
Metadata:
  AREA_OR_POINT=Area
Image Structure Metadata:
  COMPRESSION=YCbCr JPEG
  INTERLEAVE=PIXEL
  SOURCE_COLOR_SPACE=YCbCr
Corner Coordinates:
Upper Left  (-124.1278925,  40.9394765) (124d 7'40.41"W, 40d56'22.12"N)
Lower Left  (-124.1278925,  40.8730167) (124d 7'40.41"W, 40d52'22.86"N)
Upper Right (-124.0596376,  40.9394765) (124d 3'34.70"W, 40d56'22.12"N)
Lower Right (-124.0596376,  40.8730167) (124d 3'34.70"W, 40d52'22.86"N)
Center  (-124.0937651,  40.9062466) (124d 5'37.55"W, 40d54'22.49"N)
Band 1 Block=256x256 Type=Byte, ColorInterp=Red
  Overviews: 5570x5424, 2785x2712, 1393x1356, 697x678, 349x339, 
175x170, 88x85

  Mask Flags: PER_DATASET
  Overviews of mask band: 5570x5424, 2785x2712, 1393x1356, 697x678, 
349x339, 175x170, 88x85

Band 2 Block=256x256 Type=Byte, ColorInterp=Green
  Overviews: 5570x5424, 2785x2712, 1393x1356, 697x678, 349x339, 
175x170, 88x85

  Mask Flags: PER_DATASET
  Overviews of mask band: 5570x5424, 2785x2712, 1393x1356, 697x678, 
349x339, 175x170, 88x85

Band 3 Block=256x256 Type=Byte, ColorInterp=Blue
  Overviews: 5570x5424, 2785x2712, 1393x1356, 697x678, 349x339, 
175x170, 88x85

  Mask Flags: PER_DATASET
  Overviews of mask band: 5570x5424, 2785x2712, 1393x1356, 697x678, 
349x339, 175x170, 88x85



$ gdalinfo m_4012416_nw_10_h_20160528.tif
Driver: GTiff/GeoTIFF
Files: m_4012416_nw_10_h_20160528.tif
Size is 11145, 10851
Coordinate System is:
GEOGCS["WGS 84",
DATUM["WGS_1984",
SPHEROID["WGS 84",6378137,298.257223563,
AUTHORITY["EPSG","7030"]],
AUTHORITY["EPSG","6326"]],
PRIMEM["Greenwich",0],
UNIT["degree",0.0174532925199433],
AUTHORITY["EPSG","4326"]]
Origin = (-124.127899397118711,40.876995871942221)
Pixel Size = (0.06124881549,-0.06124881549)
Metadata:
  AREA_OR_POINT=Area
Image Structure Metadata:
  COMPRESSION=YCbCr JPEG
  INTERLEAVE=PIXEL
  SOURCE_COLOR_SPACE=YCbCr
Corner Coordinates:
Upper Left  (-124.1278994,  40.8769959) (124d 7'40.44"W, 40d52'37.19"N)
Lower Left  (-124.1278994,  40.8105348) (124d 7'40.44"W, 40d48'37.93"N)
Upper Right (-124.0596376,  40.8769959) (124d 3'34.70"W, 40d52'37.19"N)
Lower Right (-124.0596376,  40.8105348) (124d 3'34.70"W, 40d48'37.93"N)
Center  (-124.0937685,  40.8437653) (124d 5'37.57"W, 40d50'37.56"N)
Band 1 Block=256x256 Type=Byte, ColorInterp=Red
  Overviews: 5573x5426, 2787x2713, 1394x1357, 697x679, 349x340, 
175x170, 88x85

  Mask Flags: PER_DATASET
  Overviews of mask band: 5573x5426, 2787x2713, 1394x1357, 697x679, 
349x340, 175x170, 88x85

Band 2 Block=256x256 Type=Byte, ColorInterp=Green
  Overviews: 5573x5426, 2787x2713, 1394x1357, 697x679, 349x340, 
175x170, 88x85

  Mask Flags: PER_DATASET
  Overviews of mask band: 5573x5426, 2787x2713, 1394x1357, 697x679, 
349x340, 175x170, 88x85

Band 3 Block=256x256 Type=Byte, ColorInterp=Blue
  Overviews: 5573x5426, 2787x2713, 1394x1357, 697x679, 349x340, 
175x170, 88x85

  Mask Flags: PER_DATASET
  Overviews of mask band: 5573x5426, 2787x2713, 1394x1357, 697x679, 
349x340, 175x170, 88x85


---
This email has been checked for viruses by Avast antivirus software.
https://www.avast.com/antivirus

___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
https://lists.osgeo.org/mailman/listinfo/gdal-dev

[gdal-dev] Help working with hdf5 images

2017-02-19 Thread Stephen Woodbridge

Hi all,

I'm trying to create GTiff files from IMARS Modis hdf5 images. I did 
something like this about 3-4 years ago when they were using hdf4 
datasets but things have changed since then. So I have two questions:


1. How do I extract the sst data (subdataset 3) into a GTiff and create 
a mask band from the sst_count (subdataset 4) which has value 0 or 1, 
where 1 is valid sst data?


2. The data is in Equidistant_Cylindrical / World Geodetic System 1984, 
which does not seem to be representable in proj4 format, and I need to 
reproject it into EPSG:4326


In the past I used both the gcoos and seacoos images for a given 
time slot like:


gdal_translate -of GTiff -ot Int16 -a_srs EPSG:4326  -a_ullr $ullr 
-scale 0 50 0 1 $filename $tmpfile


where $ullr is correct for the image and then used gdal_merge to combine 
the two images using:


gdal_merge -of GTiff -co TILED=YES -ot Int16 -o $target <list of files>


But this does not work because when I merge the images they are offset 
and I need to apply the mask so the images are transparent where there 
is no data.


I guess this is my week to ask lots of gdal-related questions.

Thanks,
  -Steve
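Question (1) above (use the 0/1 sst_count subdataset to mask the sst subdataset) reduces to per-pixel masking. A minimal pure-Python sketch, assuming the two subdatasets are same-shape grids and using a made-up nodata value; in practice this would be done on GDAL/numpy arrays:

```python
NODATA = -9999  # assumed nodata marker, not from the file

def apply_count_mask(sst, sst_count, nodata=NODATA):
    """Set sst cells to nodata wherever sst_count is 0.

    sst and sst_count are equally-sized 2-D lists; count 1 marks valid
    data, mirroring the 0/1 sst_count subdataset described above.
    """
    return [
        [val if cnt == 1 else nodata for val, cnt in zip(row, crow)]
        for row, crow in zip(sst, sst_count)
    ]

# Toy 2x2 grids standing in for subdatasets 3 and 4.
sst = [[21.5, 22.0], [20.1, 19.8]]
count = [[1, 0], [0, 1]]
masked = apply_count_mask(sst, count)
```

The masked array can then be written out with the nodata value declared, so downstream tools render those cells as transparent.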

ftp://imars.marine.usf.edu/modis/imars/final/pass/1km/sst/gcoos/2017.02/aqua.20170217.0645.gcoos.sst.h5

--- hdf5 file --
$ gdalinfo aqua.20170217.0645.gcoos.sst.h5
Driver: HDF5/Hierarchical Data Format Release 5
Files: aqua.20170217.0645.gcoos.sst.h5
Size is 512, 512
Coordinate System is `'
Metadata:
  bands_l2_flags_CLASS=IMAGE
  bands_l2_flags_count_CLASS=IMAGE
  bands_l2_flags_count_description=Count of l2_flags
  bands_l2_flags_count_IMAGE_VERSION=1.2
  bands_l2_flags_count_log10_scaled=false
  bands_l2_flags_count_raster_height=1447
  bands_l2_flags_count_raster_width=2115
  bands_l2_flags_count_scaling_factor=1
  bands_l2_flags_count_scaling_offset=0
  bands_l2_flags_description=l2_flags
  bands_l2_flags_IMAGE_VERSION=1.2
  bands_l2_flags_log10_scaled=false
  bands_l2_flags_raster_height=1447
  bands_l2_flags_raster_width=2115
  bands_l2_flags_scaling_factor=1
  bands_l2_flags_scaling_offset=0
  bands_sst_CLASS=IMAGE
  bands_sst_count_CLASS=IMAGE
  bands_sst_count_description=Count of sst
  bands_sst_count_IMAGE_VERSION=1.2
  bands_sst_count_log10_scaled=false
  bands_sst_count_raster_height=1447
  bands_sst_count_raster_width=2115
  bands_sst_count_scaling_factor=1
  bands_sst_count_scaling_offset=0
  bands_sst_description=sst
  bands_sst_IMAGE_VERSION=1.2
  bands_sst_log10_scaled=false
  bands_sst_raster_height=1447
  bands_sst_raster_width=2115
  bands_sst_scaling_factor=1
  bands_sst_scaling_offset=0
  metadata_Processing_Graph_node.0_authors=Marco Peters, Ralf Quast, 
Marco Zühlk

  metadata_Processing_Graph_node.0_copyright=(c) 2009 by Brockmann Consult
  metadata_Processing_Graph_node.0_id=Mosaic$15A4BDE1324
  metadata_Processing_Graph_node.0_moduleName=beam-gpf
  metadata_Processing_Graph_node.0_moduleVersion=5.0.5
  metadata_Processing_Graph_node.0_operator=Mosaic
  metadata_Processing_Graph_node.0_parameters_combine=OR

metadata_Processing_Graph_node.0_parameters_crs=PROJCS["Equidistant_Cylindrical 
/ World Geodetic System 1984",

  GEOGCS["World Geodetic System 1984",
DATUM["World Geodetic System 1984",
  SPHEROID["WGS 84", 6378137.0, 298.257223563, 
AUTHORITY["EPSG","7030"]],

  AUTHORITY["EPSG","6326"]],
PRIMEM["Greenwich", 0.0, AUTHORITY["EPSG","8901"]],
UNIT["degree", 0.017453292519943295],
AXIS["Geodetic longitude", EAST],
AXIS["Geodetic latitude", NORTH]],
  PROJECTION["Equidistant_Cylindrical"],
  PARAMETER["central_meridian", 0.0],
  PARAMETER["latitude_of_origin", 0.0],
  PARAMETER["standard_parallel_1", 0.0],
  PARAMETER["false_easting", 0.0],
  PARAMETER["false_northing", 0.0],
  UNIT["m", 1.0],
  AXIS["Easting", EAST],
  AXIS["Northing", NORTH]]
  metadata_Processing_Graph_node.0_parameters_eastBound=-79.0
  metadata_Processing_Graph_node.0_parameters_northBound=31.0
  metadata_Processing_Graph_node.0_parameters_orthorectify=false
  metadata_Processing_Graph_node.0_parameters_pixelSizeX=1000.0
  metadata_Processing_Graph_node.0_parameters_pixelSizeY=1000.0
  metadata_Processing_Graph_node.0_parameters_resampling=Nearest
  metadata_Processing_Graph_node.0_parameters_southBound=18.0

metadata_Processing_Graph_node.0_parameters_variables_variable.0_expression=sst
  metadata_Processing_Graph_node.0_parameters_variables_variable.0_name=sst

metadata_Processing_Graph_node.0_parameters_variables_variable.1_expression=l2_flags

metadata_Processing_Graph_node.0_parameters_variables_variable.1_name=l2_flags
  metadata_Processing_Graph_node.0_parameters_westBound=-98.0
  metadata_Processing_Graph_node.0_processingTime=2017-02-17T11:38:49.524Z
  metadata_Processing_Graph_node.0_purpose=Creates a mosaic out of a 
set of source products.



Re: [gdal-dev] Ubuntu GDAL 1.10.1 support for ECW formats

2017-02-16 Thread Stephen Woodbridge

On 2/16/2017 3:04 PM, Sebastiaan Couwenberg wrote:

On 02/16/2017 08:46 PM, Stephen Woodbridge wrote:

Any ideas on this?


ECW support is not enabled in the Debian package, that's why your client
doesn't have it.

You probably have a custom build installed in /usr/local or elsewhere.


Yes, I found that eventually. I finally decided to convert the ecw files 
to jpeg-compressed tiff files, since it was a reasonably small collection 
of files, and that is working very nicely.


Thank you for your reply,
  -Steve




[gdal-dev] Ubuntu GDAL 1.10.1 support for ECW formats

2017-02-16 Thread Stephen Woodbridge

Hi all,

On my system: Ubuntu 14.04.5 LTS

I have installed:
gdal-bin:amd64/trusty 1.10.1+dfsg-5ubuntu1 upgradeable to 
2.1.0+dfsg-1~trusty2
libgdal-dev:amd64/trusty 1.10.1+dfsg-5ubuntu1 upgradeable to 
2.1.0+dfsg-1~trusty2

libgdal1h:amd64/trusty 1.10.1+dfsg-5ubuntu1 uptodate

$ gdalinfo --formats|grep ECW
  ECW (rw): ERDAS Compressed Wavelets (SDK 3.x)
  JP2ECW (rw+v): ERDAS JPEG2000 (SDK 3.x)


On a clients system also Ubuntu 14.04.5 LTS, they have the same packages 
installed.


The problem is my system has ECW support, but the client system does not!

gdalinfo file_wgs84.ecw
ERROR 4: `file_wgs84.ecw' not recognised as a supported file format.

I can not figure out why this is the case, whether there is some other 
package that needs to be installed, or what the issue is. On the client 
system, the gdalinfo --formats|grep ECW command fails to return anything.


Any ideas on this?

-Steve



Re: [gdal-dev] Problem with black edges to DOQQs using JPEG in Tiff compression

2017-02-13 Thread Stephen Woodbridge
Jukka, thanks for your suggestion. We have a couple of requirements; 
mapserver is only one, and the other is some additional processing, but I 
like the simplicity of your approach.


Brian C. spent a lot of time on IRC with me, and I think we finally got 
the processing chain straightened out. It looks like:


# first preclip any collar (probably not needed)
nearblack -co TILED=YES -of GTiff -nb 0 -near 0 -setmask -q -o tmpfile1 
filepath


gdalwarp -t_srs EPSG:4326 -dstalpha -co TILED=YES tmpfile1 tmpfile2

gdal_translate -co TILED=YES -co JPEG_QUALITY=90 -co COMPRESS=JPEG -co 
PHOTOMETRIC=YCBCR -b 1 -b 2 -b 3 -mask 4 --config 
GDAL_TIFF_INTERNAL_MASK YES tmpfile2 target


gdaladdo -clean -r bilinear target 2 4 8 16 32 64 128

This is pretty fast taking about 9-10 sec per doqq to process.

Thank you for all the suggestions and help.

-Steve
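The final four-step chain above (nearblack, gdalwarp, gdal_translate, gdaladdo) can be captured as argv lists suitable for subprocess.run, which avoids shell quoting issues when scripting a batch of doqqs. This just restates the commands from the message; the file names are placeholders:

```python
def doqq_chain(src, tmp1, tmp2, target):
    """Build the four-step DOQQ processing chain above as argv lists
    (one list per command, ready for subprocess.run); paths are
    placeholders, not real files."""
    return [
        # 1. Preclip any collar and record it in an internal mask.
        ["nearblack", "-co", "TILED=YES", "-of", "GTiff", "-nb", "0",
         "-near", "0", "-setmask", "-q", "-o", tmp1, src],
        # 2. Warp to EPSG:4326 with an alpha band.
        ["gdalwarp", "-t_srs", "EPSG:4326", "-dstalpha",
         "-co", "TILED=YES", tmp1, tmp2],
        # 3. JPEG-in-TIFF compress, turning band 4 into an internal mask.
        ["gdal_translate", "-co", "TILED=YES", "-co", "JPEG_QUALITY=90",
         "-co", "COMPRESS=JPEG", "-co", "PHOTOMETRIC=YCBCR",
         "-b", "1", "-b", "2", "-b", "3", "-mask", "4",
         "--config", "GDAL_TIFF_INTERNAL_MASK", "YES", tmp2, target],
        # 4. Build overviews.
        ["gdaladdo", "-clean", "-r", "bilinear", target,
         "2", "4", "8", "16", "32", "64", "128"],
    ]

chain = doqq_chain("input.tif", "tmp1.tif", "tmp2.tif", "out.tif")
```

Each entry can be run with subprocess.run(cmd, check=True) in sequence, stopping the batch on the first failure.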

On 2/13/2017 5:02 AM, jratike80 wrote:

Stephen Woodbridge wrote

Even,
Thanks for the quick feedback. I'll give these suggestions a try
today and let you know how it goes.

Thanks,
   -Steve


Hi Steve,

They are probably good suggestions, but folks did not stop to think about
what your ultimate target is. It is not to improve your commands and hide
the black pixels, but simply to make a jpeg-in-tiff compressed mosaic
without seams for Mapserver, perhaps following some other route if that is
easier.

So, my suggestion is:

1) Skip gdalwarp
2) Compress your images in native projection

3) Create jpeg compressed overviews

4) Create tileindex and utilize the super powerful "Tileindexes with tiles
in different projections" feature
http://www.mapserver.org/optimization/tileindex.html

You did not mention if your DOQQs come in several different projections,
but I suppose that they do. Otherwise there is even less point in using
gdalwarp; Mapserver is very fast at on-the-fly re-projection.

-Jukka Rahkonen-





--
View this message in context: 
http://osgeo-org.1560.x6.nabble.com/gdal-dev-Problem-with-black-edges-to-DOQQs-using-JPEG-in-Tiff-compression-tp5307551p5307662.html
Sent from the GDAL - Dev mailing list archive at Nabble.com.






Re: [gdal-dev] Problem with black edges to DOQQs using JPEG in Tiff compression

2017-02-12 Thread Stephen Woodbridge

On 2/12/2017 9:54 AM, Even Rouault wrote:

On samedi 11 février 2017 18:18:35 CET Stephen Woodbridge wrote:


Hi All,

I need your wisdom. I'm downloading NAIP DOQQs in GTiff format and I
have a processing chain something like the following:

gdalwarp -t_srs EPSG:4326 -dstalpha -r bilinear -multi -co TILED=YES
-dstnodata '0 0 0' srctiff tmpfile

nearblack -nb 15 -q tmpfile

gdal_translate -co TILED=YES -co JPEG_QUALITY=90 -co COMPRESS=JPEG -co
PHOTOMETRIC=YCBCR -b 1 -b 2 -b 3 -mask auto --config
GDAL_TIFF_INTERNAL_MASK YES tmpfile target

nearblack -nb 5 -q target

gdaladdo -clean -r bilinear target 2 4 8 16 32 64 128 512

And create a tileindex for mapserver of all the tiffs.

If I skip the gdal_translate (ie: JPEG compression) and the 2nd
nearblack, the doqq tiles are perfect with no black edges between
the doqq tiles. But when I JPEG compress them, I get edges between the
doqqs like this:

http://imaptools.com:8080/dl/doqq-issue.jpg

I've never used JPEG-in-tiff compression and I'm very impressed by
the amount of size reduction and how good the image remains, but I
have not been able to figure out the magic trick to clearing the edge
artifacts.




Steve,



I managed to replicate something similar to the above with an image I
have at hand.

The cause of the issue is the nearblack invocation after the
gdal_translate. The effect of this nearblack is to "eat" some pixels at
the border of the validity / invalidity transition, but that doesn't
update the existing mask.

If you add -setmask to this nearblack, that should fix it.

Even better, I think you can just remove this second nearblack.



And you probably need to add -setalpha for the first nearblack
invocation as well.


Even,

Thanks for the quick feedback. I'll give these suggestions a try
today and let you know how it goes.


Thanks,
  -Steve




[gdal-dev] Problem with black edges to DOQQs using JPEG in Tiff compression

2017-02-11 Thread Stephen Woodbridge

Hi All,

I need your wisdom. I'm downloading NAIP DOQQs in GTiff format and I 
have a processing chain something like the following:


gdalwarp -t_srs EPSG:4326 -dstalpha -r bilinear -multi -co TILED=YES 
-dstnodata '0 0 0' srctiff tmpfile


nearblack -nb 15 -q tmpfile

gdal_translate -co TILED=YES -co JPEG_QUALITY=90 -co COMPRESS=JPEG -co 
PHOTOMETRIC=YCBCR -b 1 -b 2 -b 3 -mask auto --config 
GDAL_TIFF_INTERNAL_MASK YES tmpfile target


nearblack -nb 5 -q target

gdaladdo -clean -r bilinear  target 2 4 8 16 32 64 128 512

And create a tileindex for mapserver of all the tiffs

If I skip the gdal_translate (ie: JPEG compression) and the 2nd 
nearblack, the doqq tiles are perfect with no black edges between 
the doqq tiles. But when I JPEG compress them, I get edges between the 
doqqs like this:


http://imaptools.com:8080/dl/doqq-issue.jpg

I've never used JPEG-in-tiff compression and I'm very impressed by 
the amount of size reduction and how good the image remains, but I 
have not been able to figure out the magic trick to clearing the edge 
artifacts.


Any help would be appreciated.

Thanks,
  -Steve



[gdal-dev] Question on writing a GTiff using python

2016-06-18 Thread Stephen Woodbridge

Hi all,

I'm writing a GTiff using GDAL/Python and it is mostly working. The 
source data is coming from a NETCDF file that is georeferenced in 
Longitude from 74.1600037 to 434.160 degrees


If I use this I get:

Size is 4500, 1782
Coordinate System is:
GEOGCS["WGS 84",
DATUM["WGS_1984",
SPHEROID["WGS 84",6378137,298.257223563,
AUTHORITY["EPSG","7030"]],
AUTHORITY["EPSG","6326"]],
PRIMEM["Greenwich",0],
UNIT["degree",0.0174532925199433],
AUTHORITY["EPSG","4326"]]
Origin = (74.160003662109375,46.987300872802734)
Pixel Size = (0.07959309896,0.07959309896)

Corner Coordinates:
Upper Left  (  74.1600037,  46.9873009) ( 74d 9'36.01"E, 46d59'14.28"N)
Lower Left  (  74.160, 189.547) ( 74d 9'36.01"E,189d32'50.02"N)
Upper Right ( 434.160,  46.987) (Invalid angle, 46d59'14.28"N)
Lower Right ( 434.160, 189.547) (Invalid angle,189d32'50.02"N)
Center  ( 254.160, 118.267) (254d 9'35.68"E,118d16' 2.15"N)

Looks like I also need to flip the Y axis, but ignore that for the moment.

Do I need to reorganize the pixels to be in the range of -180 to 180?
What is the best way to do this?
I can probably slice and splice the numpy array for each row as I copy 
it. Other options?
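The slice-and-splice idea works row by row: find the column whose longitude first reaches 180, then move that tail to the front and shift the origin west by 360. A small pure-Python sketch with a toy 4-column grid (the 90-degree pixel width is a made-up illustrative number; the real file is 4500x1782):

```python
def roll_longitudes(rows, west, pixel_width):
    """Shift a 0-360-style grid so the output columns start west of 180.

    rows: 2-D list of pixel values whose first column sits at longitude
    `west`, with uniform `pixel_width` spacing covering 360 degrees.
    Returns (new_rows, new_west): each row is split at the first column
    that maps to longitude >= 180 and that tail is spliced to the front.
    """
    ncols = len(rows[0])
    split = next(i for i in range(ncols) if west + i * pixel_width >= 180.0)
    new_rows = [row[split:] + row[:split] for row in rows]
    new_west = west + split * pixel_width - 360.0  # new origin longitude
    return new_rows, new_west

# Toy 1x4 grid starting at 74.16 E with (assumed) 90-degree pixels.
rows = [[0, 1, 2, 3]]
new_rows, new_west = roll_longitudes(rows, 74.16, 90.0)
```

With numpy arrays the same splice is a numpy.roll along axis 1 by `ncols - split`, followed by the same origin adjustment in the geotransform.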


Thanks,
  -Steve



Re: [gdal-dev] Questions on working with HYCOM data in netcdf format

2016-06-06 Thread Stephen Woodbridge

Kurt, Even,

Progress! Check this out:

http://imaptools.com:8080/demo/tiger-hycom.html?zoom=8=31.42299=-79.78765=BTT

I tweaked your script to copy both water_u and water_v as separate 
bands (so back to 80 bands). I actually only need a few depths, so 
eventually I'll cut that down.

If you zoom out there are lots of blank areas with no currents. I'm not 
sure if this is normal or what.


https://gist.github.com/woodbri/ad0984675b17c45739dbb592bde6639a#file-hycom_uv_netcdf_to_uv_gtiff-py

Here are the mapfile bits:

https://gist.github.com/woodbri/ad0984675b17c45739dbb592bde6639a#file-mapfile-for-hycom-ocean-currents-map

Thanks for the help!

-Steve

On 6/6/2016 2:29 PM, Stephen Woodbridge wrote:

On 6/6/2016 1:25 PM, Kurt Schwehr wrote:

+gdal-dev

Some hackish test code I had laying around from last year...

https://gist.github.com/schwehr/01f6604afc7757ea0a676f0eb28be582



Thank you! I have it running, so next I'll play with the code.


You might also be able to just write 80 layers of u-v, pick out pairs
for each depth layer and have the rendering system do the
conversion(s).  You could use whatever you want once you have a
geotiff.  In this screenshot, I made an 80-layer geotiff of u and v
alternating layers and uploaded that to Earth Engine.  It's been a long
time since I heavily used mapserver, but you could probably do the same
there.

https://www.flickr.com/photos/schwehr/26895009754/


Very cool picture!

This really helps to get me started.

Thanks,
  -Steve
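The alternating u/v band layout Kurt describes for the 80-band geotiff can be sketched as a simple interleave over the depth slices; here each "layer" is just a label standing in for a 2-D array (an illustration, not the actual script):

```python
def interleave_uv(u_layers, v_layers):
    """Order u and v depth slices as alternating bands (u1, v1, u2, v2, ...),
    the layout described above for an 80-band u/v geotiff."""
    assert len(u_layers) == len(v_layers)
    bands = []
    for u, v in zip(u_layers, v_layers):
        bands.append(u)   # u slice for this depth
        bands.append(v)   # matching v slice
    return bands

# Toy example: 3 depth levels (0 m, 10 m, 20 m) as labels.
bands = interleave_uv(["u0", "u10", "u20"], ["v0", "v10", "v20"])
```

Keeping u and v adjacent per depth makes it easy for a renderer to pick out the pair for one depth as bands 2k+1 and 2k+2.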


On Mon, Jun 6, 2016 at 7:51 AM, Stephen Woodbridge
<wood...@swoodbridge.com> wrote:

On 6/5/2016 2:04 PM, Kurt Schwehr wrote:

Stephen,

Take a look at these two discussions for starters on working
with hycom:


http://gis.stackexchange.com/questions/167155/how-can-i-make-geotiffs-of-individual-depth-layers-in-the-hycom-ocean-circulatio



Hi Kurt,

In the link above you mention:

I've got some code written that does things like calculate the
speed in m/s from the UV and creates a geotiff, etc


Can you share this code, it sounds like exactly what I need for the
current vectors? And I'm interested in figuring out how to
read/write and work with the netcdf files.

I think for now I'll work with the "u" datasets as they seem that
they will be much easier place to start.

Thanks,
  -Steve



http://gis.stackexchange.com/questions/170882/how-can-a-make-a-netcdf-with-subgroups-smaller


You should start with the regular grids.  e.g.


ftp://ftp.hycom.org/datasets/GLBu0.08/expt_91.1/data/hindcasts/2015/hycom_glb_911_2015093000_t000_uv3z.nc


Make an 80 layer geotiff from the u/v and then use mapserver
as Even
suggested.

-kurt


On Sat, Jun 4, 2016 at 6:29 AM, Even Rouault
<even.roua...@spatialys.com> wrote:

Hi,
> I want to convert some HYCOM data into GTiff format and I'm a little
> lost on how to get started, whether I can do what I need just using a
> vrt or if I'm going to have to write code to process the data.
>
> To start with I have placed gdalinfo on one file here:
> http://imaptools.com:8080/dl/hycom-gdalinfo.txt
>
> The data comes from http://hycom.org/dataserver/glb-analysis
> and I will need to be using GLBa0.08 dataset.
>
> I have two use cases:
>
> 1) The simple one is to extract say Band 1 into a georeferenced GTiff
> and apply color to using color ranges without loosing the NODATA. I
> think I can do later using .vrt, correct?

gdaldem color-relief can output to a VRT, but given that this netCDF
file is georeferenced through a geolocation array, you'll likely have
first to run gdalwarp -geoloc to have something useful at the end.

> I'm not sure how to deal with the georeferencing, because the download
> site says: "Native hycom .[ab] data converted to NetCDF on native
> Mercator-curvilinear HYCOM horizontal grid" but that does not seem in
> sync with the Metadata in the file.
>
> 2) The more complicated case is that there are two files:
> a)  eastward_sea_water_velocity (aka: u, u-velocity)
> b)  northward_sea_water_velocity (aka: v, v-velocity)
> that I would like to use to create a grid of vectors with arrowheads
> using u

Re: [gdal-dev] Questions on working with HYCOM data in netcdf format

2016-06-06 Thread Stephen Woodbridge

On 6/6/2016 1:25 PM, Kurt Schwehr wrote:

+gdal-dev

Some hackish test code I had laying around from last year...

https://gist.github.com/schwehr/01f6604afc7757ea0a676f0eb28be582



Thank you! I have it running, so next I'll play with the code.


You might also be able to just write 80 layers of u-v, pick out pairs
for each depth layer and have the rendering system do the
conversion(s).  You could use whatever you want once you have a
geotiff.  In this screenshot, I made an 80-layer geotiff of u and v
alternating layers and uploaded that to Earth Engine.  It's been a long
time since I heavily used mapserver, but you could probably do the same
there.

https://www.flickr.com/photos/schwehr/26895009754/


Very cool picture!

This really helps to get me started.

Thanks,
  -Steve


On Mon, Jun 6, 2016 at 7:51 AM, Stephen Woodbridge
<wood...@swoodbridge.com> wrote:

On 6/5/2016 2:04 PM, Kurt Schwehr wrote:

Stephen,

Take a look at these two discussions for starters on working
with hycom:


http://gis.stackexchange.com/questions/167155/how-can-i-make-geotiffs-of-individual-depth-layers-in-the-hycom-ocean-circulatio


Hi Kurt,

In the link above you mention:

I've got some code written that does things like calculate the speed
in m/s from the UV and creates a geotiff, etc


Can you share this code, it sounds like exactly what I need for the
current vectors? And I'm interested in figuring out how to
read/write and work with the netcdf files.

I think for now I'll work with the "u" datasets as they seem that
they will be much easier place to start.

Thanks,
  -Steve



http://gis.stackexchange.com/questions/170882/how-can-a-make-a-netcdf-with-subgroups-smaller

You should start with the regular grids.  e.g.


ftp://ftp.hycom.org/datasets/GLBu0.08/expt_91.1/data/hindcasts/2015/hycom_glb_911_2015093000_t000_uv3z.nc

Make an 80 layer geotiff from the u/v and then use mapserver as Even
suggested.

-kurt


On Sat, Jun 4, 2016 at 6:29 AM, Even Rouault
<even.roua...@spatialys.com> wrote:

Hi,
> I want to convert some HYCOM data into GTiff format and I'm a little
> lost on how to get started, whether I can do what I need just using a
> vrt or if I'm going to have to write code to process the data.
>
> To start with I have placed gdalinfo on one file here:
> http://imaptools.com:8080/dl/hycom-gdalinfo.txt
>
> The data comes from http://hycom.org/dataserver/glb-analysis
> and I will need to be using GLBa0.08 dataset.
>
> I have two use cases:
>
> 1) The simple one is to extract say Band 1 into a georeferenced GTiff
> and apply color to using color ranges without loosing the NODATA. I
> think I can do later using .vrt, correct?

gdaldem color-relief can output to a VRT, but given that this netCDF
file is georeferenced through a geolocation array, you'll likely have
first to run gdalwarp -geoloc to have something useful at the end.

> I'm not sure how to deal with the georeferencing, because the download
> site says: "Native hycom .[ab] data converted to NetCDF on native
> Mercator-curvilinear HYCOM horizontal grid" but that does not seem in
> sync with the Metadata in the file.
>
> 2) The more complicated case is that there are two files:
> a)  eastward_sea_water_velocity (aka: u, u-velocity)
> b)  northward_sea_water_velocity (aka: v, v-velocity)
> that I would like to use to create a grid of vectors with arrowheads
> using u and v to define the vector and where the color of the vector is
> related to the magnitude of its length. Ultimately this data will get
> rendered via mapserver so I'm wondering if I can do this via a GTiff or
> if I will have to resort to creating a massive point shapefile with
> attributes of angle and magnitude and use a symbol. It seems like it
> would be best if I can keep the data as a GTiff and then sample the
> points and render the vectors on the fly for when you zoom in/out.

See
http://mapserver.org/in

Re: [gdal-dev] Questions on working with HYCOM data in netcdf format

2016-06-05 Thread Stephen Woodbridge

Hi Even, Kurt,

Thank you for the suggestions.

I'm reading through stuff and searching for solutions. I need access to 
the 2d layer, which is only available via the GLBa0.08 dataset and not in 
the GLBu0.08 dataset, which is easier to deal with. I'll start with the 
GLBu0.08 dataset for the other data, following Kurt's suggestions.


This page describes the problem of georeferencing the GLBa0.08 dataset, 
but the solution is based on a Windows Python script using ArcGIS. The 
idea might be adapted into some C/C++ code or a script to handle 
creating those artifacts. I want to script the download and processing 
of the data into geotiffs on Linux.


http://code.nicholas.duke.edu/projects/mget/wiki/HYCOM

One problem is that the HYCOM site does not have *.grid.[ab] files that 
I can find. They are supposed to be in the topo directory:


Index of ftp://ftp.hycom.org/datasets/GLBa0.08/expt_91.2/topo/

NameSizeLast Modified
depth_GLBa0.08_09.a   57984 KB  8/24/2013   12:00:00 AM
depth_GLBa0.08_09.b   1 KB  8/24/2013   12:00:00 AM
depth_GLBa0.08_09.nc 173950 KB  11/12/2013  12:00:00 AM

and it is not obvious how to read and incorporate these files, as I think 
they only work with the HYCOM-related executables.


I did notice that the lat/lon info is already in the .nc for the "u" 
dataset, which can be seen with ncdump. In fact it is also there for the 
"a" dataset, but the values are 1-3298 and 1-4500, which is the size of 
the file in pixels, so not helpful.


I've looked on the HYCOM forum but that has not been very helpful yet.

Thanks,
  -Steve


On 6/5/2016 2:04 PM, Kurt Schwehr wrote:

Stephen,

Take a look at these two discussions for starters on working with
hycom:

http://gis.stackexchange.com/questions/167155/how-can-i-make-geotiffs-of-individual-depth-layers-in-the-hycom-ocean-circulatio


http://gis.stackexchange.com/questions/170882/how-can-a-make-a-netcdf-with-subgroups-smaller

 You should start with the regular grids.  e.g.

ftp://ftp.hycom.org/datasets/GLBu0.08/expt_91.1/data/hindcasts/2015/hycom_glb_911_2015093000_t000_uv3z.nc

 Make an 80 layer geotiff from the u/v and then use mapserver as
Even suggested.

-kurt


On Sat, Jun 4, 2016 at 6:29 AM, Even Rouault
<even.roua...@spatialys.com> wrote:

Hi,


I want to convert some HYCOM data into GTiff format and I'm a
little lost on how to get started, whether I can do what I need
just using a vrt or if I'm going to have to write code to process
the data.

To start with I have placed gdalinfo on one file here:
http://imaptools.com:8080/dl/hycom-gdalinfo.txt

The data comes from http://hycom.org/dataserver/glb-analysis and I
will need to be using GLBa0.08 dataset.

I have two use cases:

1) The simple one is to extract, say, Band 1 into a georeferenced
GTiff and apply color using color ranges without losing the
NODATA. I think I can do that later using a .vrt, correct?


gdaldem color-relief can output to a VRT, but given that this netCDF
file is georeferenced through a geolocation array, you'll likely have
first to run gdalwarp -geoloc to have something useful at the end.



I'm not sure how to deal with the georeferencing, because the
download site says: "Native hycom .[ab] data converted to NetCDF on
native Mercator-curvilinear HYCOM horizontal grid" but that does
not seem in sync with the Metadata in the file.

2) The more complicated case is that there are two files: a)
eastward_sea_water_velocity (aka: u, u-velocity) b)
northward_sea_water_velocity (aka: v, v-velocity) that I would like
to use to create a grid of vectors with arrowheads using u and v to
define the vector and where the color of the vector is related to
the magnitude of its length. Ultimately this data will get rendered
via mapserver so I'm wondering if I can do this via a GTiff or if I
will have to resort to creating a massive point shapefile with
attributes of angle and magnitude and use a symbol. It seems like
it would be best if I can keep the data as a GTiff and then sample
the points and render the vectors on the fly for when you zoom
in/out.


See http://mapserver.org/input/vector/vector_field.html

You don't need to create a vector file.

You'll have to preprocess your file to get the u and v bands in the
same raster.
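Even's preprocessing step (getting u and v into the same raster) can be sketched like this; `u.tif` and `v.tif` are placeholder names I invented for the two extracted velocity files:

```shell
# Sketch: stack the two single-band files into one two-band raster,
# band 1 = u, band 2 = v, which is what MapServer's vector-field
# rendering expects.
cat > make_uv.sh <<'EOF'
#!/bin/sh
gdalbuildvrt -separate uv.vrt u.tif v.tif   # virtual 2-band stack
gdal_translate uv.vrt uv.tif                # optionally materialize as GTiff
EOF
chmod +x make_uv.sh
```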

Even

-- Spatialys - Geospatial professional services
http://www.spatialys.com
___ gdal-dev mailing
list gdal-dev@lists.osgeo.org 
http://lists.osgeo.org/mailman/listinfo/gdal-dev




-- http://schwehr.org



---
This email has been checked for viruses by Avast antivirus software.
https://www.avast.com/antivirus


[gdal-dev] Questions on working with HYCOM data in netcdf format

2016-06-03 Thread Stephen Woodbridge

Hi All,

I want to convert some HYCOM data into GTiff format and I'm a little 
lost on how to get started, whether I can do what I need just using a 
vrt or if I'm going to have to write code to process the data.


To start with I have placed gdalinfo on one file here:
http://imaptools.com:8080/dl/hycom-gdalinfo.txt

The data comes from http://hycom.org/dataserver/glb-analysis
and I will need to be using GLBa0.08 dataset.

I have two use cases:

1) The simple one is to extract say Band 1 into a georeferenced GTiff 
and apply color to it using color ranges without losing the NODATA. I 
think I can do this later using a .vrt, correct?


I'm not sure how to deal with the georeferencing, because the download 
site says: "Native hycom .[ab] data converted to NetCDF on native 
Mercator-curvilinear HYCOM horizontal grid" but that does not seem in 
sync with the Metadata in the file.


2) The more complicated case is that there are two files:
   a)  eastward_sea_water_velocity (aka: u, u-velocity)
   b)  northward_sea_water_velocity (aka: v, v-velocity)
that I would like to use to create a grid of vectors with arrowheads 
using u and v to define the vector and where the color of the vector is 
related to the magnitude of its length. Ultimately this data will get 
rendered via mapserver so I'm wondering if I can do this via a GTiff or 
if I will have to resort to creating a massive point shapefile with 
attributes of angle and magnitude and use a symbol. It seems like it 
would be best if I can keep the data as a GTiff and then sample the 
points and render the vectors on the fly for when you zoom in/out.


Interesting data and I'm really pleased to see gdal can access it! Kudos 
on that.


Any guidance would be greatly appreciated.

Thanks,
  -Steve


Re: [gdal-dev] Reading DNC nautical charts

2015-11-22 Thread Stephen Woodbridge
It looks like the DNC charts you are looking at are in Vector Product Format, 
so google "gdal vpf" and you will get some links that might help with that.


-Steve

On 11/22/2015 4:06 AM, Hans Rijsdijk wrote:

Hi,

Here is a link:
https://www.nga.mil/ProductsServices/NauticalHydrographicBathymetricProduct/Pages/DigitalNauticalChart.aspx

I have also attached 2 of the charts.

Hope this is useful.

Cheers

Hans


*From: *Nicolas Cadieux
*Sent: *Sunday, 22 November 2015 4:49 PM
*To: *Hans Rijsdijk
*Cc: *gdal-dev@lists.osgeo.org
*Subject: *Re: [gdal-dev] Reading DNC nautical charts

Hi,

If you have a link to the charts, I could look at them to see the format
and whether they are georeferenced (i.e. whether coordinates like
longitude and latitude are included in the image file). They should be.
If so, I would recommend you use GIS software like QGIS to visualise and
manipulate the maps. QGIS has plugins that will let you plug in a GPS
and track your movement for navigation.

QGIS uses the GDAL library to open, save and manipulate map images
(rasters) and points, lines and polygons (vector data), so you get all
the power of GDAL with a user-friendly interface.

Nicolas Cadieux M.Sc.

Les Entreprises Archéotec inc.

8548, rue Saint-Denis Montréal H2P 2H2

Téléphone: 514.381.5112  Fax: 514.381.4995

www.archeotec.ca

On Nov 20, 2015 4:13 PM, Hans Rijsdijk wrote:

 >

 > Can anyone advise if DNC nautical charts can be used on the gdal
software and, if so even better with a gps? In other words for navigation.

 > I am not familiar with gdal software so a 'fatherly' response will be
greatly appreciated.

 >

 > Cheers

 >

 > Hans

 > ___

 > gdal-dev mailing list

 > gdal-dev@lists.osgeo.org

 > http://lists.osgeo.org/mailman/listinfo/gdal-dev





Re: [gdal-dev] Reading DNC nautical charts

2015-11-22 Thread Stephen Woodbridge

There are two versions of nautical charts from what I can see:

1. DNC from www.nga.mil that you pointed at in VPF format
2. NOAA ENC charts

Here is a link explaining the difference:
http://www.nauticalcharts.noaa.gov/mcd/learn_diffENC_DNC.html

The ENC charts are available in S57 format and maybe as shapefiles. I 
played with the ENC charts a while ago with some success.


I thought OGR supported reading VPF files for VMAP0 data using the OGDI 
vector driver (VPF, VMAP, DCW), but I have no idea whether it would work 
for DNC data.


You might be able to read these with openmap.bbn.com
http://openmap.bbn.com/doc/OpenMapDevGuide.pdf
see section 8.5

-Steve

On 11/22/2015 4:14 PM, Hans Rijsdijk wrote:

Thanks for your response.
One can get the DNC charts for free from the US website (although they may
not make all details available).
For most of my charts I use OpenCPN, but it doesn't handle the DNC charts.
Any suggestions of other open source marine software?

Cheers

Hans

On 22 Nov 2015, at 9:40 PM, Brent Wood wrote:


I think you'll find the Digital Nautical Chart format is not supported
by GDAL.

However, there are open source marine navigation software applications
that will read them, but apart from the USA and New Zealand, I'm not
aware of any countries that provide their charts for free.

Brent Wood






Re: [gdal-dev] Adopt RFC48: Geographical networks support

2015-07-31 Thread Stephen Woodbridge

On 7/31/2015 2:51 PM, Dmitry Baryshnikov wrote:

Hi everybody,

The motion of RFC48 has been adopted with support from PSC members
JukkaR, TamasS and EvenR.

The code is merged in trunk now (r29585). Let me know if issues arise.

Also the RFC
(https://trac.osgeo.org/gdal/wiki/rfc48_geographical_networks_support) and
RFC common page (https://trac.osgeo.org/gdal/wiki/RfcList) were corrected.



This looks very interesting. I have a few questions which might get 
added to the future plans if they are not already supported.


Are there any drivers yet? Which?
  * like read/write OSM data or pgRouting tables

Are there plans to support turn restrictions?
  * OSM defines edge-node-edge restrictions AND edge-edge-edge-... 
restrictions

  * currently OSRM only supports edge-node-edge restrictions
  * pgRouting supports edge-edge-edge-... restrictions

There is a real need to be able to move network data between pgRouting 
and OSRM. The osm2pgrouting tool is undergoing some enhancements during 
the current GSoC. There is also a need to be able to move pgRouting 
tables to OSM PBF files so we can load them into OSRM, and this seems 
like an excellent use case for this RFC.


Thanks,
  -Steve


[gdal-dev] How is NODATA interpreted with a color table?

2015-06-24 Thread Stephen Woodbridge
I have a geotiff with one band of Byte data and a color table, but it does 
not have NODATA set. How would I set its NODATA to be entry 0 
in the color table?


$ gdalinfo A2015173174000.L0_LAC.L2_OC.tif
Driver: GTiff/GeoTIFF
Files: A2015173174000.L0_LAC.L2_OC.tif
Size is 2433, 1727
Coordinate System is:
GEOGCS["WGS 84",
    DATUM["WGS_1984",
        SPHEROID["WGS 84",6378137,298.257223563,
            AUTHORITY["EPSG","7030"]],
        AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0],
    UNIT["degree",0.0174532925199433],
    AUTHORITY["EPSG","4326"]]
Origin = (-84.000,45.000)
Pixel Size = (0.012741471640766,-0.012738853693008)
Metadata:
  AREA_OR_POINT=Area
Image Structure Metadata:
  INTERLEAVE=BAND
Corner Coordinates:
Upper Left  ( -84.000,  45.000) ( 84d 0' 0.00W, 45d 0' 0.00N)
Lower Left  ( -84.000,  22.997) ( 84d 0' 0.00W, 23d 0' 0.00N)
Upper Right ( -52.995,  45.000) ( 53d 0' 0.00W, 45d 0' 0.00N)
Lower Right ( -52.995,  22.997) ( 53d 0' 0.00W, 23d 0' 0.00N)
Center  ( -68.497,  33.998) ( 68d30' 0.00W, 34d 0' 0.00N)
Band 1 Block=2433x3 Type=Byte, ColorInterp=Palette
  Color Table (RGB with 256 entries)
0: 147,0,108,255
...
  255: 0,0,0,255

I tried the following:

$ gdal_translate -a_nodata 0 -co TILED=YES A2015173174000.L0_LAC.tif 
test-tc.tif

Input file size is 2433, 1727
0...10...20...30...40...50...60...70...80...90...100 - done.
[] ~/work/oceandata/test$ gdalinfo test-tc.tif
Driver: GTiff/GeoTIFF
Files: test-tc.tif
Size is 2433, 1727
Coordinate System is:
GEOGCS["WGS 84",
    DATUM["WGS_1984",
        SPHEROID["WGS 84",6378137,298.257223563,
            AUTHORITY["EPSG","7030"]],
        AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0],
    UNIT["degree",0.0174532925199433],
    AUTHORITY["EPSG","4326"]]
Origin = (-84.000,45.000)
Pixel Size = (0.012741471640766,-0.012738853693008)
Metadata:
  AREA_OR_POINT=Area
Image Structure Metadata:
  INTERLEAVE=PIXEL
Corner Coordinates:
Upper Left  ( -84.000,  45.000) ( 84d 0' 0.00W, 45d 0' 0.00N)
Lower Left  ( -84.000,  22.997) ( 84d 0' 0.00W, 23d 0' 0.00N)
Upper Right ( -52.995,  45.000) ( 53d 0' 0.00W, 45d 0' 0.00N)
Lower Right ( -52.995,  22.997) ( 53d 0' 0.00W, 23d 0' 0.00N)
Center  ( -68.497,  33.998) ( 68d30' 0.00W, 34d 0' 0.00N)
Band 1 Block=256x256 Type=Byte, ColorInterp=Red
  NoData Value=0
Band 2 Block=256x256 Type=Byte, ColorInterp=Green
  NoData Value=0
Band 3 Block=256x256 Type=Byte, ColorInterp=Blue
  NoData Value=0

But that does not look correct, so I'm checking. Also, I don't see any 
indication in the output that the tif is TILED.


Thanks,
  -Steve
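An in-place alternative that may avoid the band expansion above (a sketch: gdal_edit.py only rewrites metadata, so the palette should survive, but check that your GDAL 1.10 build ships it):

```shell
# Sketch: tag entry 0 as nodata without copying pixels or touching the palette.
cat > set_nodata.sh <<'EOF'
#!/bin/sh
gdal_edit.py -a_nodata 0 A2015173174000.L0_LAC.L2_OC.tif
EOF
chmod +x set_nodata.sh
```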


Re: [gdal-dev] gdal_calc.py loses the color lookup table

2015-06-24 Thread Stephen Woodbridge

On 6/24/2015 8:59 AM, Even Rouault wrote:

On Wednesday 24 June 2015 03:49:15, Stephen Woodbridge wrote:

Hi All,

I have a geotiff file like:

[] ~/work/oceandata/test$ gdalinfo A2015173174000.L0_LAC.L2_OC.tif -noct
Driver: GTiff/GeoTIFF
Files: A2015173174000.L0_LAC.L2_OC.tif
 A2015173174000.L0_LAC.L2_OC.tif.aux.xml
Size is 2433, 1727
Coordinate System is:
GEOGCS["WGS 84",
    DATUM["WGS_1984",
        SPHEROID["WGS 84",6378137,298.257223563,
            AUTHORITY["EPSG","7030"]],
        AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0],
    UNIT["degree",0.0174532925199433],
    AUTHORITY["EPSG","4326"]]
Origin = (-84.000,45.000)
Pixel Size = (0.012741471640766,-0.012738853693008)
Metadata:
AREA_OR_POINT=Area
Image Structure Metadata:
INTERLEAVE=BAND
Corner Coordinates:
Upper Left  ( -84.000,  45.000) ( 84d 0' 0.00W, 45d 0' 0.00N)
Lower Left  ( -84.000,  22.997) ( 84d 0' 0.00W, 23d 0' 0.00N)
Upper Right ( -52.995,  45.000) ( 53d 0' 0.00W, 45d 0' 0.00N)
Lower Right ( -52.995,  22.997) ( 53d 0' 0.00W, 23d 0' 0.00N)
Center  ( -68.497,  33.998) ( 68d30' 0.00W, 34d 0' 0.00N)
Band 1 Block=2433x3 Type=Byte, ColorInterp=Palette
Min=0.000 Max=250.000
Minimum=0.000, Maximum=250.000, Mean=0.850, StdDev=9.542
Metadata:
  STATISTICS_MAXIMUM=250
  STATISTICS_MEAN=0.85033548789076
  STATISTICS_MINIMUM=0
  STATISTICS_STDDEV=9.542216344358
Color Table (RGB with 256 entries)

It has multiple entries in the color lookup table that should be treated
as NODATA, for example entries 251-255.

So I run gdal_calc.py like:

[] ~/work/oceandata/test$ gdal_calc.py -A
A2015173174000.L0_LAC.L2_OC.tif --outfile=test.tif --calc="A*(A<251)"
--NoDataValue=0 --co=TILED=YES
0 .. 10 .. 20 .. 30 .. 40 .. 50 .. 60 .. 70 .. 80 .. 90 .. 100 - Done

and it creates a gray-scale image with no color LUT.

[] ~/work/oceandata/test$ gdalinfo test.tif
Driver: GTiff/GeoTIFF
Files: test.tif
Size is 2433, 1727
Coordinate System is:
GEOGCS["WGS 84",
    DATUM["WGS_1984",
        SPHEROID["WGS 84",6378137,298.257223563,
            AUTHORITY["EPSG","7030"]],
        AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0],
    UNIT["degree",0.0174532925199433],
    AUTHORITY["EPSG","4326"]]
Origin = (-84.000,45.000)
Pixel Size = (0.012741471640766,-0.012738853693008)
Metadata:
AREA_OR_POINT=Area
Image Structure Metadata:
INTERLEAVE=BAND
Corner Coordinates:
Upper Left  ( -84.000,  45.000) ( 84d 0' 0.00W, 45d 0' 0.00N)
Lower Left  ( -84.000,  22.997) ( 84d 0' 0.00W, 23d 0' 0.00N)
Upper Right ( -52.995,  45.000) ( 53d 0' 0.00W, 45d 0' 0.00N)
Lower Right ( -52.995,  22.997) ( 53d 0' 0.00W, 23d 0' 0.00N)
Center  ( -68.497,  33.998) ( 68d30' 0.00W, 34d 0' 0.00N)
Band 1 Block=256x256 Type=Byte, ColorInterp=Gray
NoData Value=0


[] ~/work/oceandata/test$ gdalinfo --version
GDAL 1.10.1, released 2013/08/26

So is there a way to get gdal_calc.py to copy the LUT to the output file?
Is there a better or different way to set these pixels to be NODATA?


Steve,

https://svn.osgeo.org/gdal/trunk/gdal/swig/python/samples/attachpct.py could
be used as post processing to re-apply the color table.

Even
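The combined workflow, as a hedged sketch. I am assuming attachpct.py takes the palette source first, then the input, then the output; check its usage string before running.

```shell
# Sketch: mask out palette entries >= 251, then re-attach the original
# color table with the attachpct.py sample script.
cat > mask_and_recolor.sh <<'EOF'
#!/bin/sh
gdal_calc.py -A A2015173174000.L0_LAC.L2_OC.tif --outfile=masked.tif \
  --calc="A*(A<251)" --NoDataValue=0 --co=TILED=YES
# Argument order assumed: file-with-palette, file-to-color, output.
python attachpct.py A2015173174000.L0_LAC.L2_OC.tif masked.tif final.tif
EOF
chmod +x mask_and_recolor.sh
```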


Even,

Thank you! you have come to my rescue again.

Best regards,
  -Steve



Thanks,
-Steve


Re: [gdal-dev] ogr2ogr: Stuck trying to append lots of gml files to postgis tables

2015-03-22 Thread Stephen Woodbridge

On 3/22/2015 3:32 PM, Even Rouault wrote:

On Sunday 22 March 2015 20:18:12, Stephen Woodbridge wrote:

Hi all,

I have a directory of gml files. I can load any of them into postgis
without a problem. What I'm having a problem with is appending each file
to the existing tables. I'm using : GDAL 1.10.1, released 2013/08/26

$ ogrinfo raw_itn/6360199_sn6103_2c7157.gz
Had to open data source read-only.
INFO: Open of `raw_itn/6360199_sn6103_2c7157.gz'
using driver `GML' successful.
1: Road (None)
2: RoadLink (Line String)
3: RoadNode (Point)

$ ogrinfo raw_itn/6360199_sn6101_2c5532.gz
Had to open data source read-only.
INFO: Open of `raw_itn/6360199_sn6101_2c5532.gz'
using driver `GML' successful.
1: RoadLinkInformation (Point)
2: Road (None)
3: RoadLink (Line String)
4: RoadNodeInformation (None)
5: RoadNode (Point)

Here is want I am trying:

dropdb -U postgres -h localhost data_itn
createdb -U postgres -h localhost data_itn
psql -U postgres -h localhost data_itn -c "create extension postgis"
psql -U postgres -h localhost data_itn -c "create schema itn"
psql -U postgres -h localhost data_itn -c "alter database data_itn set
search_path to itn, public, pg_catalog"

# this loads fine

ogr2ogr -append -f PostgreSQL "PG:host=localhost user=postgres
dbname=data_itn" raw_itn/6360199_sn6101_2c5532.gz

# all successive tries to append to the existing tables fails

ogr2ogr -append -f PostgreSQL "PG:host=localhost user=postgres
dbname=data_itn" raw_itn/6360199_sn6103_2c7157.gz
ERROR 1: Layer road already exists, CreateLayer failed.
Use the layer creation option OVERWRITE=YES to replace it.
ERROR 1: Terminating translation prematurely after failed
translation of layer Road (use -skipfailures to skip errors)

If I don't install the postgis extension then there is no problem reported
and it appears to load all the data by appending to the tables, with the
geometry in a column wkb_geometry::bytea.

How can I load the data into postgis. I have over 700 gml files to load.


Defining PG_LIST_ALL_TABLES=YES as an environment variable should solve this. By
default non-spatial tables are not listed when opening a PG database, and
ogr2ogr is confused as it doesn't see the existing table as existing, so it
tries to recreate it. With trunk, ogr2ogr and the PG driver have been improved
so that specifying PG_LIST_ALL_TABLES=YES is not needed in that scenario.

Wondering if we shouldn't list all tables (excluding system tables) by
default...


Thank you, this is a very helpful response. I saw 
PG_LIST_ALL_TABLES=YES in the documentation, but never connected that 
with this problem.


If there is not a huge downside to listing all tables I can't see that 
it is a problem because it only happens once after a connection. Or 
maybe it only needs to be done if -append is set.


Anyway, Thanks for the help with this.

-Steve



Thanks,
-Steve



Re: [gdal-dev] ogr2ogr: Stuck trying to append lots of gml files to postgis tables

2015-03-22 Thread Stephen Woodbridge
No, I tried that also, but thank you for the suggestion. Running the 
command like this resolved the problem as Even suggested.


PG_LIST_ALL_TABLES=YES ogr2ogr -update -append -f PostgreSQL 
"PG:host=localhost user=postgres dbname=data_itn" 
raw_itn/6360199_sn6103_2c7157.gz


Thanks,
  -Steve
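With that working, the 700 files can be loaded in one pass; a bash sketch using the same connection parameters as above:

```shell
# Sketch: append every GML file in the directory, logging any failures.
cat > load_all.sh <<'EOF'
#!/bin/bash
for f in raw_itn/*.gz; do
  PG_LIST_ALL_TABLES=YES ogr2ogr -update -append -f PostgreSQL \
    "PG:host=localhost user=postgres dbname=data_itn" "$f" \
    || echo "failed: $f" >> load_errors.log
done
EOF
chmod +x load_all.sh
```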

On 3/22/2015 9:10 PM, Saulteau Don wrote:

I wonder if part of the problem is that you haven't included the -update
flag as well in the ogr2ogr command.

ogr2ogr -update -append -f PostgreSQL PG:host=localhost user=postgres
dbname=data_itn raw_itn/6360199_sn6103_2c7157.gz

Then like the warning says, there's also the lco option of -lco
OVERWRITE=YES
Which also overwrites an existing layer, but it will retain any VIEWS in
the postgresql database that are built using the layer you're appending
or updating.



Donovan




Re: [gdal-dev] a postgis table to another

2014-11-16 Thread Stephen Woodbridge

On 11/16/2014 8:01 AM, Ahmet Temiz wrote:

Hello

Is it possible to export (CREATE) a PostGIS table to another in a
different projection?
regards


There are a few ways that I can think of to do this:

1. add another column to the table with the projection you want, then 
export using that column, maybe via a view.


2. just define a view that does the reprojection you want and 
export that.


select st_addgeometrycolumn('mytable', 'geom_4326', 4326,
'MULTILINESTRING', 2);

update mytable set geom_4326 = st_transform(geom, 4326);

pgsql2shp ... -g geom_4326 -f outfile.shp mydb mytable

create view table_4326 as select ..., st_transform(geom, 4326) as geom;
pgsql2shp ... -f outfile.shp mydb table_4326

3. you can export a table using a SQL command and you could do the 
reprojection using that.


pgsql2shp ... -f outfile.shp mydb "select ..., st_transform(geom, 4326) as geom"

-Steve


[gdal-dev] Question on scaling between tif and vrt

2014-05-19 Thread Stephen Woodbridge

Hi all,

I have a .tif file with -ot Int16 and an associated .vrt file that is 
applying a color palette and is type Byte.


How does it decide how to scale the pixel values?

Does it compute the range and then just scale and offset the values to 
the new range?


Below are the gdalinfo for each.

Thanks,
  -Steve

gdalinfo -stats MODIS_SST_EAST-201405190900.tif
Driver: GTiff/GeoTIFF
Files: MODIS_SST_EAST-201405190900.tif
   MODIS_SST_EAST-201405190900.tif.aux.xml
Size is 8800, 6600
Coordinate System is:
GEOGCS["WGS 84",
    DATUM["WGS_1984",
        SPHEROID["WGS 84",6378137,298.257223563,
            AUTHORITY["EPSG","7030"]],
        AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0],
    UNIT["degree",0.0174532925199433],
    AUTHORITY["EPSG","4326"]]
Origin = (-125.000,54.931818190930159)
Pixel Size = (0.009090548548341,-0.009058565004476)
Metadata:
  AREA_OR_POINT=Area
Image Structure Metadata:
  INTERLEAVE=BAND
Corner Coordinates:
Upper Left  (-125.000,  54.9318182) (125d 0' 0.00W, 54d55'54.55N)
Lower Left  (-125.000,  -4.8547108) (125d 0' 0.00W,  4d51'16.96S)
Upper Right ( -45.0031728,  54.9318182) ( 45d 0'11.42W, 54d55'54.55N)
Lower Right ( -45.0031728,  -4.8547108) ( 45d 0'11.42W,  4d51'16.96S)
Center  ( -85.0015864,  25.0385537) ( 85d 0' 5.71W, 25d 2'18.79N)
Band 1 Block=8800x1 Type=Int16, ColorInterp=Gray
  Min=-10004.000 Max=6035.000
  Minimum=-10004.000, Maximum=6035.000, Mean=-673.234, StdDev=268.660
  Overviews: 4400x3300, 2200x1650, 1100x825, 550x413, 275x207
  Metadata:
STATISTICS_MAXIMUM=6035
STATISTICS_MEAN=-673.23378700069
STATISTICS_MINIMUM=-10004
STATISTICS_STDDEV=268.66013989253

gdalinfo -stats -noct MODIS_SST_EAST-201405190900.vrt
Driver: VRT/Virtual Raster
Files: /maps/wms/data/MODIS_SST_EAST/MODIS_SST_EAST-201405190900.vrt
   /maps/wms/data/MODIS_SST_EAST/MODIS_SST_EAST-201405190900.tif
Size is 8800, 6600
Coordinate System is `'
Origin = (-125.000,54.93657929998)
Pixel Size = (0.009090548056818,-0.009058564515152)
Corner Coordinates:
Upper Left  (-125.000,  54.9365793)
Lower Left  (-125.000,  -4.8499465)
Upper Right ( -45.0031771,  54.9365793)
Lower Right ( -45.0031771,  -4.8499465)
Center  ( -85.0015886,  25.0433164)
Band 1 Block=128x128 Type=Byte, ColorInterp=Palette
  Minimum=0.000, Maximum=255.000, Mean=1.885, StdDev=21.797
  Metadata:
STATISTICS_MAXIMUM=255
STATISTICS_MEAN=1.8849891356749
STATISTICS_MINIMUM=0
STATISTICS_STDDEV=21.796961177215
  Color Table (RGB with 256 entries)


Re: [gdal-dev] Question on scaling between tif and vrt

2014-05-19 Thread Stephen Woodbridge

Even,

Thanks!

I didn't set this up; I just inherited the maintenance of it and I'm 
trying to understand how it works so I can eventually make changes and 
fix some of the processing that does not seem to be grounded in a good 
process, which is looking like more and more of the code. :(


I get the point that the pixel values need to be in the range 0 to n and 
you need entries 0 to n in the color table. Or you need to use a LUT 
like you explained before.
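For reference, the LUT mechanism can be declared directly in a VRT through a ComplexSource. This is a sketch only; the breakpoints below are invented for illustration, not derived from the real SST range:

```xml
<VRTDataset rasterXSize="8800" rasterYSize="6600">
  <VRTRasterBand dataType="Byte" band="1">
    <ComplexSource>
      <SourceFilename relativeToVRT="1">MODIS_SST_EAST-201405190900.tif</SourceFilename>
      <SourceBand>1</SourceBand>
      <!-- piecewise-linear lookup "input:output" pairs -->
      <LUT>-10004:0,-5:1,35:254,6035:255</LUT>
    </ComplexSource>
  </VRTRasterBand>
</VRTDataset>
```

Inputs below the first breakpoint clamp to its output and inputs above the last clamp likewise, so the nodata value can be pinned to a dedicated palette entry.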


What I was trying to understand is more along the lines of

input tif is Int16
vrt is declared as Byte

How does GDAL deal with this? There are no errors or warnings. gdalinfo 
reports a Byte range of values for the vrt. I'm wondering how it 
converts an arbitrary Int16 value to a Byte.


So some of the obvious guesses are:

1. it is scaling them (you say no on this)
2. it is reading the Int16 and using only the low- or high-order 
byte as the value?

3. something else?

Thank you for your patience with all my questions.

-Steve

On 5/19/2014 5:48 PM, Even Rouault wrote:

On Monday 19 May 2014 23:42:06, Stephen Woodbridge wrote:

Hi all,

I have a .tif file with -ot Int16 and an associated .vrt file that is
applying a color palette and is type Byte.

How does it decide how to scale the pixel values?

Does it compute the range and then just scale and offset the values to
the new range?


Not completely sure I understand what you really mean, but I'd say no scaling
or offsetting.
Color table is plain and stupid. You need to provide as many entries in the
color tables as values in the input file, and the index of the color table must
match the value of the pixel in the source band. And this is not possible in
your case since the minimum value is negative (as said in previous related
discussion), and the color table entries are indexed starting at 0.

-- use LUT



Below are the gdalinfo for each.

Thanks,
-Steve

gdalinfo -stats MODIS_SST_EAST-201405190900.tif
Driver: GTiff/GeoTIFF
Files: MODIS_SST_EAST-201405190900.tif
 MODIS_SST_EAST-201405190900.tif.aux.xml
Size is 8800, 6600
Coordinate System is:
GEOGCS["WGS 84",
    DATUM["WGS_1984",
        SPHEROID["WGS 84",6378137,298.257223563,
            AUTHORITY["EPSG","7030"]],
        AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0],
    UNIT["degree",0.0174532925199433],
    AUTHORITY["EPSG","4326"]]
Origin = (-125.000,54.931818190930159)
Pixel Size = (0.009090548548341,-0.009058565004476)
Metadata:
AREA_OR_POINT=Area
Image Structure Metadata:
INTERLEAVE=BAND
Corner Coordinates:
Upper Left  (-125.000,  54.9318182) (125d 0' 0.00W, 54d55'54.55N)
Lower Left  (-125.000,  -4.8547108) (125d 0' 0.00W,  4d51'16.96S)
Upper Right ( -45.0031728,  54.9318182) ( 45d 0'11.42W, 54d55'54.55N)
Lower Right ( -45.0031728,  -4.8547108) ( 45d 0'11.42W,  4d51'16.96S)
Center  ( -85.0015864,  25.0385537) ( 85d 0' 5.71W, 25d 2'18.79N)
Band 1 Block=8800x1 Type=Int16, ColorInterp=Gray
Min=-10004.000 Max=6035.000
Minimum=-10004.000, Maximum=6035.000, Mean=-673.234, StdDev=268.660
Overviews: 4400x3300, 2200x1650, 1100x825, 550x413, 275x207
Metadata:
  STATISTICS_MAXIMUM=6035
  STATISTICS_MEAN=-673.23378700069
  STATISTICS_MINIMUM=-10004
  STATISTICS_STDDEV=268.66013989253

gdalinfo -stats -noct MODIS_SST_EAST-201405190900.vrt
Driver: VRT/Virtual Raster
Files: /maps/wms/data/MODIS_SST_EAST/MODIS_SST_EAST-201405190900.vrt
 /maps/wms/data/MODIS_SST_EAST/MODIS_SST_EAST-201405190900.tif
Size is 8800, 6600
Coordinate System is `'
Origin = (-125.000,54.93657929998)
Pixel Size = (0.009090548056818,-0.009058564515152)
Corner Coordinates:
Upper Left  (-125.000,  54.9365793)
Lower Left  (-125.000,  -4.8499465)
Upper Right ( -45.0031771,  54.9365793)
Lower Right ( -45.0031771,  -4.8499465)
Center  ( -85.0015886,  25.0433164)
Band 1 Block=128x128 Type=Byte, ColorInterp=Palette
Minimum=0.000, Maximum=255.000, Mean=1.885, StdDev=21.797
Metadata:
  STATISTICS_MAXIMUM=255
  STATISTICS_MEAN=1.8849891356749
  STATISTICS_MINIMUM=0
  STATISTICS_STDDEV=21.796961177215
Color Table (RGB with 256 entries)


Re: [gdal-dev] How to map Int32 or Float32 GTiff to color map?

2014-05-15 Thread Stephen Woodbridge

On 5/15/2014 1:56 PM, Even Rouault wrote:

On Thursday 15 May 2014 16:25:12, Stephen Woodbridge wrote:

Hi all,

I'm trying to create a VRT file for a GTiff that is Int32 or Float32 to
add a ColorTable but I seem to be missing a key piece of information.
How are the pixel values mapped to the color table entries? via the
histogram?

My GTiff has noData=-32767 and good values the range from say -5 to 100.
So do I create a color table that is based on the 256 buckets in the
histogram?


No way you can use a GDAL color table to map negative values...
Perhaps you should use gdaldem color-relief to generate a RGB output. Note
that gdaldem color-relief is compatible with VRT output : it uses the LUT
mechanism to do so.


OK, I will look at gdaldem, but not wanting to give up on this yet: is 
there a way to change the NoData pixels from -32767 to, say, -10, keep a 
scale of 1.0, and offset the values by 10?


This would make NoData=0

old   = new
 -5      5
  0     10
  1     11
100    110

then I would have all positive values and the range would be manageable?

The scale and offset are already supported in VRT, but I'm not sure if 
it is possible to remap -32767 values to a new value.
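The remapping described above, expressed as plain per-pixel arithmetic (a sketch using the proposed offset of 10 and the old NoData collapsed to the new one; the constants are from the message, not from any GDAL API):

```python
NODATA_OLD = -32767
NODATA_NEW = 0
OFFSET = 10

def remap(value):
    """Shift valid pixels by OFFSET; collapse the old NoData to the new one."""
    return NODATA_NEW if value == NODATA_OLD else value + OFFSET

# The remap table from the message above:
for old in (-5, 0, 1, 100, NODATA_OLD):
    print(old, "->", remap(old))
```

A plain VRT ScaleOffset/ScaleRatio cannot express the NoData special case, which is why the question is whether the -32767 values can be remapped separately.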


Thanks for your response.

-Steve



So for example I get:

Band 1 Block=256x256 Type=Float32, ColorInterp=Gray
Min=-32767.000 Max=33.980   Computed Min/Max=-54.985,34.320
Minimum=-32767.000, Maximum=33.980, Mean=-15887.345, StdDev=16386.592
256 buckets from -32831.1 to 98.0444:
1545169 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 27375 1924356
NoData Value=-32767
Metadata:
  STATISTICS_MAXIMUM=33.97542236
  STATISTICS_MEAN=-15887.344614501
  STATISTICS_MINIMUM=-32767
  STATISTICS_STDDEV=16386.59208246


It seems that the NoData value is included in the stats which really
skews the numbers compressing all my values into two buckets! How can I
avoid that? I want to ignore the NoData values and spread my good values
into the remaining buckets. How can I do this?
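The compression into two buckets follows directly from how linear histogram buckets are assigned between the reported bounds (a sketch of the bucket formula, assuming 256 uniform buckets from -32831.1 to 98.0444 as in the gdalinfo output above; this is an illustration, not GDAL's source):

```python
LO, HI, N = -32831.1, 98.0444, 256
WIDTH = (HI - LO) / N  # each bucket spans ~128.6 units

def bucket(value):
    """Index of the histogram bucket a pixel value falls into."""
    return min(max(int((value - LO) / WIDTH), 0), N - 1)

print(bucket(-32767))   # 0    -> all NoData pixels land here
print(bucket(-54.985))  # 254  -> the real data is squeezed into
print(bucket(33.98))    # 255     the last two buckets
```

With the NoData sentinel included in the range, the ~128-unit bucket width swallows the entire -55..34 data range in the final two buckets, matching the 1545169 / 27375 / 1924356 counts above.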


You can't. There's a ticket, I think, about the fact that we should ignore the
nodata value in stats.
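Until stats skip nodata, the workaround is to compute them yourself over the valid pixels only (a minimal sketch with made-up pixel values):

```python
def stats_ignoring_nodata(values, nodata):
    """Min/max/mean over pixels, skipping the nodata sentinel."""
    valid = [v for v in values if v != nodata]
    return min(valid), max(valid), sum(valid) / len(valid)

pixels = [-32767, -5, 0, 10, 100, -32767]
print(stats_ignoring_nodata(pixels, -32767))  # (-5, 100, 26.25)
```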



Thanks,
-Steve







Re: [gdal-dev] GSoC proposal looking for mentors and suggestions

2014-03-20 Thread Stephen Woodbridge
I believe that OSGeo is expecting every approved student to have a 
mentor and a co-mentor this year.


I have been a mentor for the last 5+ years for pgRouting. We have had two 
students most of those years and two mentors, with each mentor being the 
co-mentor for the other project. This has worked well for us: it allows 
one of us to be away while still providing coverage for both projects.


The thing that I have found over the years is that it is important to 
help your students set realistic and conservative goals, especially if 
they have not done previous development on the project. If you don't 
know the issues then everything seems trivial, and students are 
wonderfully optimistic, but they rapidly get behind and overwhelmed. We 
combat this by having them set minimum goals and stretch goals. The 
minimum goals are required to get a passing grade. Think of it as 
"have to have" vs "nice to have".


Hope this helps. Feel free to contact me off list if you want to discuss 
mentoring more.


-Steve

On 3/20/2014 4:13 PM, Even Rouault wrote:

On Thursday, 20 March 2014 at 05:02:33, Zhang, Shuai wrote:

Hi All,

I think I need a mentor to work with me and help me add MongoDB support to
GDAL. Below is the proposal I wrote; hopefully you find it worth a
trial.


This is something I may potentially mentor, but there are already 2 students
interested in other subjects. I'm not sure how many will eventually get
selected by the GSoC program, but I won't be able to mentor 3 people for sure!



Thanks,
shuai


Title: OGR Driver for MongoDB

Short description:
MongoDB, a document database that provides high performance, high
availability, and easy scalability, can be a good platform for storing
extremely large spatial datasets, to support high-performance
geo-computation and real-time spatial analysis at a large scale. This
project aims at developing an OGR driver for MongoDB to help applications
or software based on GDAL, such as QGIS, GeoServer, MapServer, and so on,
read and write the spatial data in it, and thus enable an Open Source GIS
ecosystem powered by this advanced NoSQL database.

Describe your idea
1. Introduction
MongoDB, a document database that provides high performance, high
availability, and easy scalability, can be a good platform for storing
extremely large spatial datasets, to support high-performance
geo-computation and real-time spatial analysis at a large scale. Yet,
little attention has been paid so far by the GIS field to making the
most of its strengths. This project aims at developing an OGR driver for
MongoDB to help applications or software based on GDAL read and write the
spatial data in it, and thus enable an Open Source GIS ecosystem powered
by this advanced NoSQL database.

  2. Background
Since we are living in the era of big data, tools and equipment today for
capturing spatial data both at the mega-scale and the milli-scale are just
dreadful. The magnitude of this data volume is well beyond the capability
of any mainstream geographic information system. Yet we, the GIS field,
have no off-the-shelf solutions to manage these massive spatial data.
Relational spatial databases have been in charge for decades, but now the
situation seems a little different.

A computing pattern shift can be seen throughout the IT industry in recent
years and GIS would be no exception. Especially, data analytics may not be
achievable within a reasonable amount of time without resorting to
high-performance computing strategies. However, relational spatial
databases are kind of slow to support these high-performance computing
scenarios, and often lack of flexible scalability to handle a growing
amount of work in a capable manner.

Fortunately, there are several groups trying to address the problem, and
MongoDB is an apparent leader in this direction. MongoDB, which has native
support for maintaining geospatial data, using a document-oriented model,
lies in fifth place in the DB-Engines Ranking of database management
systems classed according to popularity and the highest rated
non-relational system. From version 2.4 (released on March 19, 2013),
MongoDB introduces support for a subset of GeoJSON geometries including
basic shapes like points, linestrings, polygons.


Good to know. Last time I looked, MongoDB had only support for point
geometries.


And quite a number of
partners related with big data, NoSQL, cloud, mobile and high performance
computing join the MongoDB ecosystem. Foursquare is featured one of them
which benefits from MongoDB’s support for geospatial indexing, allowing it
to easily query for large location-based data.

3. The idea
MongoDB employs GeoJSON to store spatial data and concurrently GDAL
supports for access to features encoded in GeoJSON format, which can be
reusable.


As far as I remember, the interface with MongoDB is (was?) a kind of binary
JSON format. Has this changed ?


This project is trying to implement a MongoDB Driver according
to the OGR format driver interfaces with subclasses of 

[gdal-dev] Fwd: Re: Problem with results on two different versions of gdal_wrap

2014-03-09 Thread Stephen Woodbridge

Sorry meant to send this to the list.


 Original Message 
Subject: Re: [gdal-dev] Problem with results on two different versions 
of gdal_wrap

Date: Sun, 09 Mar 2014 15:22:23 -0400
From: Stephen Woodbridge wood...@swoodbridge.com
To: Even Rouault even.roua...@mines-paris.org

Even, Etienne,

I'm still working on this. Thank you both for your suggestions. I spent
10-12 hours yesterday trying your suggestions and variations on them to make
things work (on top of past efforts), with little progress.

Etienne, I tried your suggestion but it did not make a difference so we
can eliminate the hdf reader as the problem.

I'm dealing with a multi-step process, which I inherited, of downloading
MODIS SST (sea surface temperatures) and mapping it to GeoTIFF for display
and query in MapServer. I have decided to try to recreate
the process from scratch, starting with trying to get documentation from
the USF IMaRS site so I can better understand what I'm working with.

1. download sst hdf
2. convert to geoTiff
3. add color LUT as .vrt
4. render image in mapserver
5. query image and report temps in degrees C and F
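For step 5, the C-to-F conversion itself is trivial (the scale/offset from raw HDF counts to degrees C depends on metadata not shown here, so this sketch assumes temperatures already in degrees C):

```python
def c_to_f(temp_c):
    """Convert Celsius to Fahrenheit for the query report."""
    return temp_c * 9.0 / 5.0 + 32.0

for c in (0.0, 25.0, 30.5):
    print(f"{c:.1f} C = {c_to_f(c):.1f} F")
```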

I'm waiting on some email responses from USF with documentation for the hdf
files. And I need a break from banging my head against a wall :)

To be continued ...

Thanks,
  -Steve

On 3/8/2014 5:22 AM, Even Rouault wrote:



This is the problem step:

gdalwarp -rcs -ts 8800 6600 -s_srs EPSG:32662 -t_srs EPSG:4326 temp.tif
target.tif

gdalinfo -mm -stats target.tif

is showing that the range of values in the image are dramatically
different on the two servers!

summary old:
Band 1 Block=8800x1 Type=Int16, ColorInterp=Gray
  Computed Min/Max=-3877.000,32767.000
Minimum=-3877.000, Maximum=32767.000, Mean=25235.731, StdDev=10612.642


summary new:
Band 1 Block=8800x1 Type=Int16, ColorInterp=Gray
Min=-9314.000 Max=32561.000   Computed Min/Max=-9314.000,32561.000
Minimum=-9314.000, Maximum=32561.000, Mean=19166.800, StdDev=7786.806


Ok, so you can see that the values are radically different. My question
is how do I get values like the old system? These values represent
temperatures and I need to get the same values.

My one thought on this is that it is another side effect of proj4
behaving differently, as I had to adjust the position above to get it to
align. So maybe gdalwarp is messing up the pixel values when it
reprojects as well. But I'm totally lost on how to make this work correctly.

Any thoughts on how to fix this?


Stephen,

I think we already had a discussion some time ago about differences between
spherical or ellipsoidal projections, or am I confusing with someone else ?

Well, it is not clear from your experiment if the difference is due to
reprojection or to the resampling method.

There's a difference between both GDAL versions, but is the new result worse
than the previous one (from visual inspection) ?

Cubic spline resampling seems to produce overshoot artifacts in both
situations (since -3877.000 or -9314.000 in the output < 377 in the input). That's
probably due to the maths behind it.
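The overshoot is inherent to cubic kernels rather than a GDAL bug. A Catmull-Rom spline (a common cubic resampling kernel; this is a generic illustration, not GDAL's exact implementation) interpolating across a step edge produces values outside the input range:

```python
def catmull_rom(p0, p1, p2, p3, t):
    """Interpolate between samples p1 and p2 at fraction t (0..1)."""
    return 0.5 * (2 * p1
                  + (-p0 + p2) * t
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t * t
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * t * t * t)

# Samples 0, 0, 0, 32767: every input is >= 0, yet the interpolated
# value halfway between the two middle samples undershoots below zero.
v = catmull_rom(0, 0, 0, 32767, 0.5)
print(v)  # -2047.9375
```

This is why a land/sea edge next to a 32767 fill value can yield output values far below the input minimum.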

Maybe just try with the default nearest resampling to see if it is due to the
resampling kernel or the reprojection.

I'm also wondering if your data doesn't have a nodata value that you should
explicitly set. As I can only guess, 32767 would be a good candidate given
that the data type is Int16. But the _FillValue=[65535] in the metadata
makes me wonder if the datatype shouldn't be UInt16 rather than Int16 in your
initial conversion from netCDF to GeoTIFF, and the nodata would rather be 65535
?
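The Int16-vs-UInt16 ambiguity is easy to see with a two-byte round trip (a sketch):

```python
import struct

# _FillValue=[65535] is only representable as UInt16; reinterpreting the
# same two bytes as signed Int16 wraps it around to -1.
raw = struct.pack("<H", 65535)        # 0xFFFF as an unsigned 16-bit word
(as_int16,) = struct.unpack("<h", raw)
print(as_int16)  # -1

# 32767 is the largest value Int16 can hold, consistent with
# valid_range=[0 32767] in the metadata.
print(struct.unpack("<h", struct.pack("<H", 32767))[0])  # 32767
```

So a 65535 fill value written into an Int16 band cannot survive intact, which supports reading the data as UInt16 with nodata 65535.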

Even







Re: [gdal-dev] Problem with results on two different versions of gdal_wrap

2014-03-09 Thread Stephen Woodbridge

On 3/8/2014 5:22 AM, Even Rouault wrote:



This is the problem step:

gdalwarp -rcs -ts 8800 6600 -s_srs EPSG:32662 -t_srs EPSG:4326 temp.tif
target.tif

gdalinfo -mm -stats target.tif

is showing that the range of values in the image are dramatically
different on the two servers!

summary old:
Band 1 Block=8800x1 Type=Int16, ColorInterp=Gray
  Computed Min/Max=-3877.000,32767.000
Minimum=-3877.000, Maximum=32767.000, Mean=25235.731, StdDev=10612.642


summary new:
Band 1 Block=8800x1 Type=Int16, ColorInterp=Gray
Min=-9314.000 Max=32561.000   Computed Min/Max=-9314.000,32561.000
Minimum=-9314.000, Maximum=32561.000, Mean=19166.800, StdDev=7786.806


Ok, so you can see that the values are radically different. My question
is how do I get values like the old system? These values represent
temperatures and I need to get the same values.

My one thought on this is that it is another side effect of proj4
behaving differently, as I had to adjust the position above to get it to
align. So maybe gdalwarp is messing up the pixel values when it
reprojects as well. But I'm totally lost on how to make this work correctly.

Any thoughts on how to fix this?


Stephen,

I think we already had a discussion some time ago about differences between
spherical or ellipsoidal projections, or am I confusing with someone else ?


Yes, this is probably related to the previous discussion since this is 
the same process that we discussed before. Before, I was dealing with the 
projection being misaligned, and I fixed that by changing the bbox 
defined for the hdf file so it was aligned.


But now I am realizing that the pixel values are also messed up. So 
maybe changing the bbox was not the right thing to do.



Well, it is not clear from your experiment if the difference is due to
reprojection or to the resampling method.


Yeah, I am totally lost on this. My experiment was to compare the 
process steps on each system to see where things were different, in the 
hope of understanding what is happening.



There's a difference between both GDAL versions, but is the new result worse
than the previous one (from visual inspection) ?


The images do look similar.


Cubic spline resampling seems to produce overshoot artifacts in both
situations (since -3877.000 or -9314.000 in the output < 377 in the input). That's
probably due to the maths behind it.


Right, but turning that off and using the default does not resolve the 
issue.



Maybe just try with the default nearest resampling to see if it is due to the
resampling kernel or the reprojection.


Tried that, no joy.


I'm also wondering if your data doesn't have a nodata value that you should
explicitly set. As I can only guess, 32767 would be a good candidate given
that the data type is Int16. But the _FillValue=[65535] in the metadata
makes me wonder if the datatype shouldn't be UInt16 rather than Int16 in your
initial conversion from netCDF to GeoTIFF, and the nodata would rather be 65535
?


I tried setting nodata and tried UInt16. I noticed in the hdf 
metadata that there was a valid_range=[0-32767], which might have been 
why Int16 was being used.


Anyway, as I mentioned in the prior email, I'm waiting on some docs for 
the hdf files and I will try to reconstruct the process from that.


-Steve


Even





[gdal-dev] Problem with results on two different versions of gdal_wrap

2014-03-07 Thread Stephen Woodbridge

Hi All,

First off, sorry this is really long with all the gdalinfo output pasted into 
it, but I thought it might be more useful to have real information to 
look at.


I have been running down a problem caused by upgrading a server to a 
newer version of gdal. Here are the details:


old server: GDAL 1.6.0dev, FWTools 2.0.6, released 2008/02/03
new server: GDAL 1.9.2, released 2012/10/08

My process is:

wget 
'http://cyclops.marine.usf.edu/modis/level3/husf/fullpass/2014/061/1km/pass/final/MODIS.2014061.194420.fullpass.rgb.hdf'


old server:
gdal_translate -of GTiff -ot Int16 -a_srs EPSG:32662 -a_ullr 
-13914936.3491592 6094982.02936303 -5009377.08569731 -560262.454244465 
HDF4_SDS:UNKNOWN:MODIS.2014061.194420.fullpass.rgb.hdf:0 temp.tif


new server:
gdal_translate -of GTiff -ot Int16 -a_srs EPSG:32662 -a_ullr 
-13914936.3491592 6115512.02936303 -5009377.08569731 -539731.454244465 
HDF4_SDS:UNKNOWN:MODIS.2014061.194420.fullpass.rgb.hdf:0 temp.tif


I have verified that gdalinfo -mm -stats temp.tif looks equivalent for 
the two generated temp.tif images. The -a_ullr values are different because 
proj4 is shifting the image 20530 m south, so I'm correcting for that here.
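As a quick sanity check, the two upper-left Y values passed to -a_ullr differ by exactly the stated shift:

```python
old_uly = 6094982.02936303   # -a_ullr upper-left Y on the old server
new_uly = 6115512.02936303   # -a_ullr upper-left Y on the new server

shift_m = new_uly - old_uly
print(round(shift_m, 3))     # 20530.0 metres, matching the observed offset
```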


summary old:
Band 1 Block=8800x1 Type=Int16, ColorInterp=Gray
  Min=377.000 Max=32767.000   Computed Min/Max=377.000,32767.000
  Minimum=377.000, Maximum=32767.000, Mean=27526.311, StdDev=11176.835

summary new:
Band 1 Block=8800x1 Type=Int16, ColorInterp=Gray
  Min=377.000 Max=32767.000   Computed Min/Max=377.000,32767.000
  Minimum=377.000, Maximum=32767.000, Mean=27526.311, StdDev=11176.835

-- full output 
old server:
# /opt/FWTools/bin_safe/gdalinfo -mm -stats temp.tif
Driver: GTiff/GeoTIFF
Files: temp.tif
Size is 8800, 6600
Coordinate System is:
PROJCS["WGS 84 / Plate Carree",
    GEOGCS["WGS 84",
        DATUM["WGS_1984",
            SPHEROID["WGS 84",6378137,298.2572235629972,
                AUTHORITY["EPSG","7030"]],
            AUTHORITY["EPSG","6326"]],
        PRIMEM["Greenwich",0],
        UNIT["degree",0.0174532925199433],
        AUTHORITY["EPSG","4326"]],
    UNIT["metre",1,
        AUTHORITY["EPSG","9001"]],
    AUTHORITY["EPSG","32662"]]
Origin = (-13914936.349159199744463,6094982.029363029636443)
Pixel Size = (1011.995370847942013,-1008.370376304165802)
Metadata:
  AREA_OR_POINT=Area
  Collection Location=USF Institute for Marine Remote Sensing
  Data Source Type=Direct Broadcast
  Input 
File(s)=/tmp/s4p_run_map_modis_2.3_2HwoEbWy/MYD021KM.2014061.194420.hdf, 
/tmp/s4p_run_map_modis_2.3_2HwoEbWy/MYD021KM.2014061.194650.hdf, 
/tmp/s4p_run_map_modis_2.3_2HwoEbWy/MYD021KM.2014061.194920.hdf, 
/tmp/s4p_run_map_modis_2.3_2HwoEbWy/MYD021KM.2014061.195150.hdf, 
/tmp/s4p_run_map_modis_2.3_2HwoEbWy/MYD021KM.2014061.195420.hdf, 
/tmp/s4p_run_map_modis_2.3_2HwoEbWy/MYD03.2014061.194420.hdf, 
/tmp/s4p_run_map_modis_2.3_2HwoEbWy/MYD03.2014061.194650.hdf, 
/tmp/s4p_run_map_modis_2.3_2HwoEbWy/MYD03.2014061.194920.hdf, 
/tmp/s4p_run_map_modis_2.3_2HwoEbWy/MYD03.2014061.195150.hdf, 
/tmp/s4p_run_map_modis_2.3_2HwoEbWy/MYD03.2014061.195420.hdf

  L3 Software Name=/usr/bin/map_modis.pl
  L3 Software Version=4.2
  Map Limit=-5, -125, 55, -45
  Map Projection Category=Proj 4.4.7
  Map Projection Name=eqc
  Proj Projection Parameters=#Equidistant Cylindrical (Plate Caree)
#   Cyl, Sph
#   lat_ts=
# +proj=eqc +ellps=WGS84
  Processing Status=final
  Satellite=aqua
  Spatial Resolution=1km
  Temporal Resolution=pass
  _FillValue=[65535]
  band_name=1
  corrected_counts_offset=-0
  corrected_counts_scale=0.124973
  corrected_counts_units=counts
  long_name=Earth View 250M Aggregated 1km Reflective Solar Bands 
Scaled Integers Band 1

  radiance_offset=-0
  radiance_scale=0.0246418
  radiance_units=Watts/m^2/micrometer/steradian
  reflectance_offset=-0
  reflectance_scale=4.72967e-05
  reflectance_units=none
  units=none
  valid_range=[0 32767]
Image Structure Metadata:
  INTERLEAVE=BAND
Corner Coordinates:
Upper Left  (-13914936.349, 6094982.029)
Lower Left  (-13914936.349, -560262.454)
Upper Right (-5009377.086, 6094982.029)
Lower Right (-5009377.086, -560262.454)
Center  (-9462156.717, 2767359.788)
Band 1 Block=8800x1 Type=Int16, ColorInterp=Gray
Computed Min/Max=377.000,32767.000
  Minimum=377.000, Maximum=32767.000, Mean=27526.311, StdDev=11176.835
  Metadata:
STATISTICS_MINIMUM=377
STATISTICS_MAXIMUM=32767
STATISTICS_MEAN=27526.311392114
STATISTICS_STDDEV=11176.834697473


new server:
# gdalinfo -mm -stats temp.tif
Driver: GTiff/GeoTIFF
Files: temp.tif
   temp.tif.aux.xml
Size is 8800, 6600
Coordinate System is:
LOCAL_CS["WGS 84 / Plate Carree",
    GEOGCS["WGS 84",
        DATUM["WGS_1984",
            SPHEROID["WGS 84",6378137,298.257223563,
                AUTHORITY["EPSG","7030"]],
            AUTHORITY["EPSG","6326"]],
        PRIMEM["Greenwich",0],
        UNIT["degree",0.0174532925199433],
        AUTHORITY["EPSG","4326"]],
    AUTHORITY["EPSG","32662"],
    UNIT["metre",1]]
Origin = 

Re: [gdal-dev] GSoC 2014

2014-02-17 Thread Stephen Woodbridge

Mikhail,

This is a very interesting idea.

You might want to add specific support pgRouting which already supports 
building graphs, creating routing topology and solving various graph 
problem using postgis for geometry and tables for building and linking 
the topology.


You might also want to look at:

https://github.com/woodbri/osrm-tools
  - a tool to move pgrouting data to Project-OSRM

https://github.com/DennisOSRM/Project-OSRM
  - a high performance routing engine

A tool like you are describing that can load data and prepare a graph or 
move data between these projects would be extremely useful.


I look forward to hearing more about your ideas.

Thanks,
  -Steve Woodbridge

On 2/17/2014 5:13 AM, Mikhail Gusev wrote:

Hello everyone.

I am a last year student at Moscow Power Engineering Institute, Russia.
For GSoC 2014 I would like to work on networking capabilities in GDAL/OGR.


_Overall idea_

I would like to try to implement a universal network model. The
universality of the model would be reflected not only in the ability to use
different GIS formats to store and transfer network data (for which OGR is
a great basis), but also in the ability to design and simulate different
types of network applications (engineering, natural, etc). I understand
that it is rather ambitious to consider all possible aspects of all
network types. But as a first step I would like to cover only the basic
aspects that can be generalized, and to provide a "platform" which can
be used by other developers to create their own extensions specific
to a concrete network type.


_Target scope for GSoC 2014_

Today none of the OGR drivers supports network functionality. The idea is to
implement a new OGR driver which will deal with networks built over
spatial data. The spatial data, together with the network data, will be
stored in one of the OGR-supported formats, which would be specified
during the creation of the network.

Planned features of the new driver:

1. Reading/writing data from/to the source, creating layers, editing
attributes, etc (as any new OGR driver must provide);

2. The user would also be able to do the following, assuming that all
the special network data is the data of the current GIS-format:


- Read/Write the information about the whole network (network metadata);

- Edit special network objects parameters such as blocking state, or
direction of the flow;

- Import objects from the external sources (the driver adds missing
network parameters to them);

- Set/unset connections among network objects;

- Edit the sets of the network rules.

3. Each object in the network will have a set of relations with other
objects, which are stored separately and form a network graph;

4. The whole network will have a set of rules that describes the
connection possibilities of different object types in the network. The
network will also have a set of rules that describes the influence of
each type of object to the state of the whole network and to the other
types of objects. It can also be called as a behavior of the object in
the context of the network specialization (engineering purpose).


_Current status_

https://github.com/MikhanGusev/gnm

As a matter of fact I've been working on this for a while, and I've
already completed a few things. Here is what I've accomplished so far:

1. Complete. The driver and most of the required virtual interfaces are
already implemented;

2. Complete. All data from the external source is wrapped with the
network proxy and the user can edit it.

3. Currently, I'm thinking about how the network graph will be stored.


Best regards,

Mikhail Gusev.







Re: [gdal-dev] gdalwarp EPSG:32662 problem

2014-02-08 Thread Stephen Woodbridge

On 2/8/2014 2:55 AM, Jean-Claude Repetto wrote:

On 08/02/2014 07:11, Stephen Woodbridge wrote:


I've run into a problem that I think is related to a change in the EPSG
definition for EPSG:32662

The symptom is that the image after gdalwarp reprojection is about 20km
south of where it should be.


Please post the gdalwarp command you have used. (I mean, with both CRS).
The problem is likely on the other CRS.



gdalwarp -rcs -ts 8800 6600 -s_srs EPSG:32662 -t_srs EPSG:4326 
$temptiff1 $target


And the image in question is:

root@u17269306:/maps/images.tmp# gdalinfo 
MODIS.2014038.193758.fullpass.seadas_sst.hdf

Driver: HDF4/Hierarchical Data Format Release 4
Files: MODIS.2014038.193758.fullpass.seadas_sst.hdf
Size is 512, 512
Coordinate System is `'
Metadata:
  Collection Location=USF Institute for Marine Remote Sensing
  Data Source Type=Direct Broadcast
  Input File(s)=/tmp/s4p_run_map_modis_2.3_WSthzYwz/A2014038193749.GEO, 
/tmp/s4p_run_map_modis_2.3_WSthzYwz/A2014038193749.L2_LAC_SST, 
/tmp/s4p_run_map_modis_2.3_WSthzYwz/A2014038194019.GEO, 
/tmp/s4p_run_map_modis_2.3_WSthzYwz/A2014038194019.L2_LAC_SST, 
/tmp/s4p_run_map_modis_2.3_WSthzYwz/A2014038194249.GEO, 
/tmp/s4p_run_map_modis_2.3_WSthzYwz/A2014038194249.L2_LAC_SST, 
/tmp/s4p_run_map_modis_2.3_WSthzYwz/A2014038194519.GEO, 
/tmp/s4p_run_map_modis_2.3_WSthzYwz/A2014038194519.L2_LAC_SST, 
/tmp/s4p_run_map_modis_2.3_WSthzYwz/A2014038194749.GEO, 
/tmp/s4p_run_map_modis_2.3_WSthzYwz/A2014038194749.L2_LAC_SST

  L3 Software Name=/usr/bin/map_modis.pl
  L3 Software Version=4.2
  Map Limit=-5, -125, 55, -45
  Map Projection Category=Proj 4.4.7
  Map Projection Name=eqc
  Processing Status=intermediate
  Proj Projection Parameters=#Equidistant Cylindrical (Plate Caree)
#   Cyl, Sph
#   lat_ts=
# +proj=eqc +ellps=WGS84
  Satellite=aqua
  Spatial Resolution=1km
  Temporal Resolution=pass
Subdatasets:

SUBDATASET_1_NAME=HDF4_SDS:UNKNOWN:MODIS.2014038.193758.fullpass.seadas_sst.hdf:0
  SUBDATASET_1_DESC=[6600x8800] seadas_sst (16-bit integer)

SUBDATASET_2_NAME=HDF4_SDS:UNKNOWN:MODIS.2014038.193758.fullpass.seadas_sst.hdf:1
  SUBDATASET_2_DESC=[6600x8800] l2_flags (32-bit integer)
Corner Coordinates:
Upper Left  (0.0,0.0)
Lower Left  (0.0,  512.0)
Upper Right (  512.0,0.0)
Lower Right (  512.0,  512.0)
Center  (  256.0,  256.0)


Re: [gdal-dev] How to fix gdalwarp warning?

2014-02-08 Thread Stephen Woodbridge

On 2/8/2014 6:44 PM, Even Rouault wrote:

On Friday, 7 February 2014 at 16:07:46, Jukka Rahkonen wrote:

Stephen Woodbridge woodbri at swoodbridge.com writes:

How does one do "Converting the dataset prior to 24/32 bit is advised"?


For example with http://www.gdal.org/pct2rgb.html


gdal_translate -expand rgb or rgba is an alternative.


Thanks Even, I found that after Jukka's post and used that instead.
I appreciate all the help from the list. You guys are great!

-Steve




[gdal-dev] How to fix gdalwarp warning?

2014-02-07 Thread Stephen Woodbridge

How does one do "Converting the dataset prior to 24/32 bit is advised"?

Warning: Input file /maps/images.tmp/844219fb7ca86093d82b14376b56b875/modis-chlora-201402071850.tif
has a color table, which will likely lead to bad results when using a
resampling method other than nearest neighbour. Converting the dataset
prior to 24/32 bit is advised.

Thanks,
  -Steve


[gdal-dev] gdalwarp EPSG:32662 problem

2014-02-07 Thread Stephen Woodbridge

Hi,

Sorry for cross posting. This seems like a proj4 issue but might be gdal 
related.


I've run into a problem that I think is related to a change in the EPSG 
definition for EPSG:32662


The symptom is that the image after gdalwarp reprojection is about 20km 
south of where it should be.


old system is correctly aligned:

GDAL 1.4.2.0, released 2007/06/27
proj Rel. 4.5.0, 22 Oct 2006

# WGS 84 / Plate Carree
32662 +proj=eqc +lat_ts=0 +lon_0=0 +x_0=0 +y_0=0 +ellps=WGS84 
+datum=WGS84 +units=m +no_defs  


new system is shifted about 20km south:

GDAL 1.9.2, released 2012/10/08
proj Rel. 4.8.0, 6 March 2012

# WGS 84 / Plate Carree (deprecated)
32662 +proj=eqc +lat_ts=0 +lat_0=0 +lon_0=0 +x_0=0 +y_0=0 +datum=WGS84 
+units=m +no_defs  



http://spatialreference.org/ref/epsg/32662/
This indicates that the definition in proj 4.8.0 is wrong. Would the 
missing +ellps=WGS84 account for this shift?


Any thoughts on this?

Thanks,
  -Steve


Re: [gdal-dev] Large shapefile issues

2013-11-25 Thread Stephen Woodbridge
Unless something has changed, I have never been able to work with a dbf 
file over 2GB using shapelib.


-Steve W

On 11/25/2013 5:52 PM, Even Rouault wrote:

On Monday, 25 November 2013 at 11:42:23, CARMAN, Darren wrote:

Hi List



I notice on the OGR formats page for ESRI Shapefile the following is
mentioned:



Size Issues

Geometry: The Shapefile format explicitly uses 32bit offsets and so
cannot go over 8GB (it actually uses 32bit offsets to 16bit words).
Hence, it is not recommended to use a file size over 4GB.

Attributes: The dbf format does not have any offsets in it, so it can be
arbitrarily large.





Yet on the ESRI website:



Geometry limitations

There is a 2 GB size limit for any shapefile component file, which
translates to a maximum of roughly 70 million point features. The actual
number of line or polygon features you can store in a shapefile depends
on the number of vertices in each line or polygon (a vertex is
equivalent to a point).





I assume the OGR web page is wrong, or has a different meaning outside
of ESRI S/W use.


Darren,

Yes, as underlined by Chaitanya the actual limit depends on the software
implementation. Actually the limit in OGR was 4 GB for the .SHP, and AFAICS
unlimited for DBF.
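The 8 GB and 4 GB figures follow from the offset encoding (a sketch of the arithmetic; shapefile offsets are expressed in 16-bit words, and a signed 32-bit reader can only address half the unsigned space):

```python
WORD_BYTES = 2               # shapefile offsets count 16-bit words, not bytes
unsigned_words = 2 ** 32     # full 32-bit offset space
signed_words = 2 ** 31       # what a signed 32-bit reader can address

print(unsigned_words * WORD_BYTES // 2 ** 30)  # 8  (GiB, the hard format limit)
print(signed_words * WORD_BYTES // 2 ** 30)    # 4  (GiB, the recommended limit)
```

ESRI's stated 2 GB per component file is a further software limit below what the format itself can encode.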

I've added in http://trac.osgeo.org/gdal/changeset/26657 a 2GB_LIMIT=YES layer
creation option (and SHAPE_2GB_LIMIT configuration option) that will enforce
the 2GB limit. And in http://trac.osgeo.org/gdal/changeset/26658 a change so
that when the limit is reached the file is properly closed with the valid
information.

Splitting data over several files would probably need to be done outside of
this, in a script, by restarting from the source layer at the index next to
the one that was last written in the shapefile.

Best regards,

Even




Re: [gdal-dev] Slow convertion from OSM to PG with -skipfailures

2013-05-27 Thread Stephen Woodbridge

Even,

Would it make sense to create and load a stored procedure that you use 
to load each feature, so that you can trap exceptions and ignore them? 
I would think that this would be much faster. Well, the stored procedure 
is run in a transaction, but you could do something like:

begin;
select loadfeatureskipfailures(args) from features where gid between 
gid_from and gid_to;
commit;

And run this in a loop incrementing gid_from and gid_to over your range 
of features.
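The loop over gid ranges can be sketched as a chunk generator (hypothetical helper; emitting or executing the begin/select/commit batches is left to the caller):

```python
def gid_chunks(gid_min, gid_max, chunk_size):
    """Yield inclusive (gid_from, gid_to) ranges covering gid_min..gid_max."""
    lo = gid_min
    while lo <= gid_max:
        hi = min(lo + chunk_size - 1, gid_max)
        yield lo, hi
        lo = hi + 1

# Each range becomes one begin/select/commit batch:
for gid_from, gid_to in gid_chunks(1, 10, 4):
    print(f"select loadfeatureskipfailures(args) from features "
          f"where gid between {gid_from} and {gid_to};")
```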


I would think that something along these lines would be faster than 
doing every insert in a separate transaction.


-Steve W

On 5/27/2013 11:12 AM, Even Rouault wrote:

According to Jukka Rahkonen jukka.rahko...@mmmtike.fi:


Hi,

I have measured a huge speed difference by running the same conversion from
OSM data file into PostGIS either with or without using the -skipfailures
parameter.

Without -skipfailures conversion takes about two minutes but if I add the
parameter it takes at least two hours. The command I used is this

ogr2ogr -f PostgreSQL PG:dbname='gis' host='server' port='4326' user='user'
password='passwd' finland.osm.pbf -gt 2 -progress
--config OSM_COMPRESS_NODES YES  -lco DIM=2 -lco geometry_name=geoloc
-lco fid=fid -skipfailures  --config PG_USE_COPY YES

I wonder if it really needs to be so slow. My guess is that -skipfailures
somehow invalidates my -gt 2 parameter.


Yes, it does. In -skipfailures mode, the transaction must be per feature, so
that a failure on a feature doesn't impact other features.



-Jukka Rahkonen-










Re: [gdal-dev] Slow convertion from OSM to PG with -skipfailures

2013-05-27 Thread Stephen Woodbridge

On 5/27/2013 3:06 PM, Even Rouault wrote:

On Monday, May 27, 2013 at 20:52:57, Stephen Woodbridge wrote:

Even,

Would it make sense to create and load a stored procedure that you use
to load the features, so you can trap exceptions and ignore them?
I would think that this would be much faster. The stored procedure
is run in a transaction, but you could do something like:

begin;
select loadfeatureskipfailures(args) from features where gid between
gid_from and gid_to;
commit;

And run this in a loop incrementing gid_from and gid_to over your range
of features.

I would think that something along these lines would be faster than
doing every insert in a separate transaction.


I'm afraid I'm not familiar enough with PG stored procedures to really assess
the feasibility of your suggestion. What is the features table in your example?
How is it populated? And I'm a bit skeptical that this could catch and
recover errors in a better way than doing it from the outside.



Hi Even,

My reference to features was a generic reference to the incoming stream 
of features to be loaded.


In a plpgsql function you can do something like this:

...
BEGIN
  INSERT ...
EXCEPTION
  WHEN SQLSTATE <> '0' THEN
RAISE NOTICE 'Exception % caught, ignoring', SQLSTATE;
-- NOOP
END;
...

http://www.postgresql.org/docs/9.0/interactive/plpgsql-control-structures.html#PLPGSQL-ERROR-TRAPPING

Anyway, my idea is that the function can recover from the error, so if 
you can feed your data in chunks through the function, which can ignore 
errors, then this might be faster than


BEGIN;
INSERT ...
COMMIT;

for every feature, but it might not be because a plpgsql function 
effectively runs inside of an implicit transaction. I think it would be 
faster if you could do something like:


BEGIN:
SELECT loadfeature(args);
SELECT loadfeature(args);
SELECT loadfeature(args);
...
SELECT loadfeature(args);
SELECT loadfeature(args);
SELECT loadfeature(args);
COMMIT;

None of the loadfeature() calls will fail if you trap the 
exceptions that you are interested in ignoring. This would allow you to 
insert a block of features in a transaction and ignore any failures.


Without some testing I can't be sure of the performance implications.
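
The trap-per-call, commit-per-block pattern above can be simulated outside the database; a minimal Python sketch, where all names are hypothetical stand-ins for the plpgsql function and the batch transaction, not GDAL or PostgreSQL APIs:

```python
# Simulation of "trap per feature, commit per block": a failure in
# load_one is swallowed (like the plpgsql EXCEPTION handler), so the
# surrounding batch always completes.

def load_batch(features, load_one):
    loaded, skipped = [], 0
    for f in features:
        try:
            loaded.append(load_one(f))   # analogous to SELECT loadfeature(f)
        except ValueError:               # the error class we choose to ignore
            skipped += 1                 # NOOP, like the plpgsql handler
    return loaded, skipped               # the whole batch "commits" regardless

def parse_int(f):
    # Stand-in loader that fails on bad input, like a bad feature insert.
    return int(f)

print(load_batch(["1", "x", "3"], parse_int))   # → ([1, 3], 1)
```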

-Steve W
___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev


Re: [gdal-dev] PostGIS date field output

2013-02-28 Thread Stephen Woodbridge

On 2/28/2013 4:40 PM, Kralidis,Tom [Ontario] wrote:

Hi: we are using 1.9.2 via MapServer WFS to serve out PostGIS data in
GeoJSON, which works very well -- great feature!

I was having issues with GeoJSON output on date fields when hour is 
< 10.

Digging deeper:

The column in question is of PostgreSQL type 'timestamp without time
zone'.  When querying the data in psql, I get:

mydatetime: 2013-02-19 01:03:34

When I do an ogrinfo the same is shown as:

mydatetime (DateTime) = 2013/02/19  1:03:34

The month is zero padded, but not the hour.  I would have expected it to
be consistent one way or the other.   Naturally, any format which OGR
writes to carries this format.

Our downstream UI (in this case) has to implement a workaround, as it
expects zero padded hour values.

Is this by design in GDAL/OGR?  Is there anything we can do to output hour
values zero padded in this case?  I'd rather not have the front end have
to fix this, and using MapServer WFS out of the box leaves us few
options for custom scripting.


Tom,

Can you cast it to a string in sql so it gets passed through as text?
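
The padding inconsistency can be reproduced and worked around with explicit formatting; a Python sketch (the first format string is an illustrative reconstruction of the 1.9-era output, not OGR's actual code):

```python
from datetime import datetime

dt = datetime(2013, 2, 19, 1, 3, 34)

# Roughly what ogrinfo 1.9.2 shows: month/day zero padded, hour blank padded.
ogr_style = "%04d/%02d/%02d %2d:%02d:%02d" % (
    dt.year, dt.month, dt.day, dt.hour, dt.minute, dt.second)

# Fully zero-padded output, the equivalent of casting/formatting in SQL.
padded = dt.strftime("%Y/%m/%d %H:%M:%S")

print(ogr_style)   # 2013/02/19  1:03:34
print(padded)      # 2013/02/19 01:03:34
```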

-Steve W
___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev


[gdal-dev] Virtual Filesystem Question(s)

2013-02-09 Thread Stephen Woodbridge

Hi Even,

Frank pointed me to your blog entry on the GDAL virtual file system at:

http://erouault.blogspot.com/2012/05/new-gdal-virtual-file-system-to-read.html

This is most excellent. I am trying to get this to work using the 
/vsicurl/ option to connect to a php page that generates CSV output, but 
it is not successful :(


I was hoping you could help me sort out how to get this to work.

Running GDAL 1.9.0

woodbri@mappy:/u/software/gdal-1.9.0$ ./config.status -V
config.status
configured by ./configure, generated by GNU Autoconf 2.68,
  with options '--prefix=/usr/local' '--mandir=/usr/local/share/man' 
'--includedir=/usr/local/include/gdal' '--with-threads' '--with-hdf5=no' 
'--with-grass=no' '--with-ecw=no' '--with-mrsid=no' '--with-jp2mrsid=no' 
'--with-libtiff=internal' '--with-geotiff=internal' '--with-jasper' 
'--with-netcdf' '--with-xerces' '--with-geos' '--with-sqlite3' 
'--with-curl' '--with-pg' '--with-mysql' '--with-perl' '--with-python' 
'--with-ecw' '--with-mrsid=/u/software/Geo_DSDK-7.0.0.2167' 
'--with-fgdb=/u/software/FileGDB_API' '--with-cfitsio=no'


I have created a file test.csv, copied the output of the php script 
into it, and created a test.vrt, and ogrinfo is happy with it.


I change test.vrt to

<SrcDataSource>/vsicurl/http://example.com/ws/test-pg.php</SrcDataSource>

This is happy:

woodbri@mappy:~/work/mongo/ogr$ ogrinfo -ro test-pg.vrt
INFO: Open of `test-pg.vrt'
  using driver `VRT' successful.
1: test (Point)

But this is not:

woodbri@mappy:~/work/mongo/ogr$ ogrinfo -ro test-pg.vrt test
INFO: Open of `test-pg.vrt'
  using driver `VRT' successful.
ERROR 1: Failed to open datasource 
`/vsicurl/http://example.com/ws/test-pg.php'.


Layer name: test
Geometry: Point
Feature Count: 0
Layer SRS WKT:
GEOGCS["WGS 84",
    DATUM["WGS_1984",
        SPHEROID["WGS 84",6378137,298.257223563,
            AUTHORITY["EPSG","7030"]],
        TOWGS84[0,0,0,0,0,0,0],
        AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0,
        AUTHORITY["EPSG","8901"]],
    UNIT["degree",0.0174532925199433,
        AUTHORITY["EPSG","9108"]],
    AUTHORITY["EPSG","4326"]]

Doing some additional work, like finding a php script that simulates byte 
range downloading and turning on some debugging, it looks like I have 
run into a bug in OGR.


So if you notice below, OGR gets the file size and then asks for more 
bytes than are in the file. The whole file is sent and OGR gets the 200 
response code, but (I'm guessing) because the size returned is smaller than 
the size requested, it repeats the request over and over.


I'll grab the 1.9.2 tarball and see if I can build that and get better 
results. If your php request is doing a sql query (like I'm doing) then 
this is going to be problematic if the results exceed a single request, 
because you cannot be assured that a subsequent request for the next 
part of the file will generate the exact same query results. It would be 
very helpful if there were a mode or virtual driver where OGR reads the 
whole data stream and buffers it into a tmp file or whatever and then deals 
with it, because there is no way for the server to put the result into 
a tempfile, relate it to subsequent requests from that client 
versus other clients, and know when to clean things up. It makes more 
sense for the client to handle these issues because it knows when it is 
done with the resource.
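
The server-side Range handling at issue can be sketched in a few lines of Python; this is an illustrative, hypothetical helper (not the PHP script in question) showing the clamping behavior RFC 2616 section 14.35 calls for when the client asks for more bytes than the file holds, as in the 0-16383 request against the 13053-byte file in the log below:

```python
# When the requested end is past the last byte, clamp the range to the
# file instead of looping; a start past the end is unsatisfiable (416).

def content_range(range_header, file_size):
    """Return (start, end, Content-Range value), or None when the range
    is unsatisfiable. Only simple 'bytes=start-end' ranges are handled
    in this sketch; open-ended ranges like 'bytes=0-' are not."""
    start, end = (int(x) for x in range_header.replace("bytes=", "").split("-"))
    if start >= file_size:
        return None
    end = min(end, file_size - 1)        # clamp to the last available byte
    return start, end, "bytes %d-%d/%d" % (start, end, file_size)

print(content_range("bytes=0-16383", 13053))   # → (0, 13052, 'bytes 0-13052/13053')
```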


Any thoughts or ideas on all of this?

Thanks,
  -Steve

woodbri@mappy:~/work/mongo/ogr$ ogrinfo --debug on -ro test-pg.vrt test
OGR: OGROpen(test-pg.vrt/0x177c3a0) succeeded as VRT.
INFO: Open of `test-pg.vrt'
  using driver `VRT' successful.
OGR: GetLayerCount() = 1

VSICURL: GetFileList(/vsicurl/http://imaptools.com:8080/ws)
VSICURL: File[0] = geoip.php, is_dir = 0, size = 0, time = 2013/02/08 
15:44:00
VSICURL: File[1] = test-pg.php, is_dir = 0, size = 0, time = 2013/02/09 
15:55:00

VSICURL: Downloading 0-16383 (http://imaptools.com:8080/ws/test-pg.php)...
VSICURL: Got reponse_code=200
VSICURL: GetFileSize(http://imaptools.com:8080/ws/test-pg.php)=13053 
response_code=200

VSICURL: Downloading 0-16383 (http://imaptools.com:8080/ws/test-pg.php)...
VSICURL: Got reponse_code=200
VSICURL: Downloading 0-16383 (http://imaptools.com:8080/ws/test-pg.php)...
VSICURL: Got reponse_code=200
VSICURL: Downloading 0-16383 (http://imaptools.com:8080/ws/test-pg.php)...
VSICURL: Got reponse_code=200
VSICURL: Downloading 0-16383 (http://imaptools.com:8080/ws/test-pg.php)...
VSICURL: Got reponse_code=200
VSICURL: Downloading 0-16383 (http://imaptools.com:8080/ws/test-pg.php)...
VSICURL: Got reponse_code=200
VSICURL: Downloading 0-16383 (http://imaptools.com:8080/ws/test-pg.php)...
VSICURL: Got reponse_code=200
VSICURL: Downloading 0-16383 (http://imaptools.com:8080/ws/test-pg.php)...
^C

I logged the php code and here is a partial output:

header: Accept-Ranges: 0-13053
request: HTTP_RANGE: bytes=0-16383
header: Content-Range: bytes 0-13052/13053
header: Content-Length: 13053

header: Accept-Ranges: 0-13053
request: 

Re: [gdal-dev] Virtual Filesystem Question(s)

2013-02-09 Thread Stephen Woodbridge
Just a followup to this, I have downloaded gdal-1.9.2 and get the same 
results. So no joy there.


-Steve

On 2/9/2013 4:28 PM, Stephen Woodbridge wrote:

Hi Even,

Frank pointed me to your blog entry on the GDAL virtual file system at:

http://erouault.blogspot.com/2012/05/new-gdal-virtual-file-system-to-read.html


This is most excellent. I am trying to get this to work using the
/vsicurl/ option to connect to a php page that generates CSV output, but
it is not successful :(

I was hoping you could help me sort out how to get this to work.

Running GDAL 1.9.0

woodbri@mappy:/u/software/gdal-1.9.0$ ./config.status -V
config.status
configured by ./configure, generated by GNU Autoconf 2.68,
   with options '--prefix=/usr/local' '--mandir=/usr/local/share/man'
'--includedir=/usr/local/include/gdal' '--with-threads' '--with-hdf5=no'
'--with-grass=no' '--with-ecw=no' '--with-mrsid=no' '--with-jp2mrsid=no'
'--with-libtiff=internal' '--with-geotiff=internal' '--with-jasper'
'--with-netcdf' '--with-xerces' '--with-geos' '--with-sqlite3'
'--with-curl' '--with-pg' '--with-mysql' '--with-perl' '--with-python'
'--with-ecw' '--with-mrsid=/u/software/Geo_DSDK-7.0.0.2167'
'--with-fgdb=/u/software/FileGDB_API' '--with-cfitsio=no'

I have created a file test.csv, copied the output of the php script
into it, and created a test.vrt, and ogrinfo is happy with it.

I change test.vrt to

<SrcDataSource>/vsicurl/http://example.com/ws/test-pg.php</SrcDataSource>

This is happy:

woodbri@mappy:~/work/mongo/ogr$ ogrinfo -ro test-pg.vrt
INFO: Open of `test-pg.vrt'
   using driver `VRT' successful.
1: test (Point)

But this is not:

woodbri@mappy:~/work/mongo/ogr$ ogrinfo -ro test-pg.vrt test
INFO: Open of `test-pg.vrt'
   using driver `VRT' successful.
ERROR 1: Failed to open datasource
`/vsicurl/http://example.com/ws/test-pg.php'.

Layer name: test
Geometry: Point
Feature Count: 0
Layer SRS WKT:
GEOGCS["WGS 84",
    DATUM["WGS_1984",
        SPHEROID["WGS 84",6378137,298.257223563,
            AUTHORITY["EPSG","7030"]],
        TOWGS84[0,0,0,0,0,0,0],
        AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0,
        AUTHORITY["EPSG","8901"]],
    UNIT["degree",0.0174532925199433,
        AUTHORITY["EPSG","9108"]],
    AUTHORITY["EPSG","4326"]]

Doing some additional work, like finding a php script that simulates byte
range downloading and turning on some debugging, it looks like I have
run into a bug in OGR.

So if you notice below, OGR gets the file size and then asks for more
bytes than are in the file. The whole file is sent and OGR gets the 200
response code, but (I'm guessing) because the size returned is smaller than
the size requested, it repeats the request over and over.

I'll grab the 1.9.2 tarball and see if I can build that and get better
results. If your php request is doing a sql query (like I'm doing) then
this is going to be problematic if the results exceed a single request,
because you cannot be assured that a subsequent request for the next
part of the file will generate the exact same query results. It would be
very helpful if there were a mode or virtual driver where OGR reads the
whole data stream and buffers it into a tmp file or whatever and then deals
with it, because there is no way for the server to put the result into
a tempfile, relate it to subsequent requests from that client
versus other clients, and know when to clean things up. It makes more
sense for the client to handle these issues because it knows when it is
done with the resource.

Any thoughts or ideas on all of this?

Thanks,
   -Steve

woodbri@mappy:~/work/mongo/ogr$ ogrinfo --debug on -ro test-pg.vrt test
OGR: OGROpen(test-pg.vrt/0x177c3a0) succeeded as VRT.
INFO: Open of `test-pg.vrt'
   using driver `VRT' successful.
OGR: GetLayerCount() = 1

VSICURL: GetFileList(/vsicurl/http://imaptools.com:8080/ws)
VSICURL: File[0] = geoip.php, is_dir = 0, size = 0, time = 2013/02/08
15:44:00
VSICURL: File[1] = test-pg.php, is_dir = 0, size = 0, time = 2013/02/09
15:55:00
VSICURL: Downloading 0-16383 (http://imaptools.com:8080/ws/test-pg.php)...
VSICURL: Got reponse_code=200
VSICURL: GetFileSize(http://imaptools.com:8080/ws/test-pg.php)=13053
response_code=200
VSICURL: Downloading 0-16383 (http://imaptools.com:8080/ws/test-pg.php)...
VSICURL: Got reponse_code=200
VSICURL: Downloading 0-16383 (http://imaptools.com:8080/ws/test-pg.php)...
VSICURL: Got reponse_code=200
VSICURL: Downloading 0-16383 (http://imaptools.com:8080/ws/test-pg.php)...
VSICURL: Got reponse_code=200
VSICURL: Downloading 0-16383 (http://imaptools.com:8080/ws/test-pg.php)...
VSICURL: Got reponse_code=200
VSICURL: Downloading 0-16383 (http://imaptools.com:8080/ws/test-pg.php)...
VSICURL: Got reponse_code=200
VSICURL: Downloading 0-16383 (http://imaptools.com:8080/ws/test-pg.php)...
VSICURL: Got reponse_code=200
VSICURL: Downloading 0-16383 (http://imaptools.com:8080/ws/test-pg.php)...
^C

I logged the php code and here is a partial output:

header: Accept-Ranges: 0-13053
request: HTTP_RANGE: bytes

Re: [gdal-dev] Virtual Filesystem Question(s)

2013-02-09 Thread Stephen Woodbridge

On 2/9/2013 6:03 PM, Jukka Rahkonen wrote:

Stephen Woodbridge woodbri at swoodbridge.com writes:



Just a followup to this, I have downloaded gdal-1.9.2 and get the same
results. So no joy there.

-Steve




Perhaps vsicurl_streaming that is also mentioned in the same blog is something
worth trying.


Yeah, I saw that. I may or may not be able to build trunk on this 
machine depending on the dependencies it needs, as many of the 
libraries are fairly dated.


But more importantly, I think this is a bug. I'm happy to write it up 
but it would be nice to get some confirmation. At the moment, I'm reviewing 
the source code to see if I can see a problem there.


Thank you for your suggestion,
  -Steve

___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev


Re: [gdal-dev] Virtual Filesystem Question(s)

2013-02-09 Thread Stephen Woodbridge

On 2/9/2013 6:41 PM, Even Rouault wrote:

On Sunday, February 10, 2013 at 00:23:37, Stephen Woodbridge wrote:

On 2/9/2013 6:03 PM, Jukka Rahkonen wrote:

Stephen Woodbridge woodbri at swoodbridge.com writes:

Just a followup to this, I have downloaded gdal-1.9.2 and get the same
results. So no joy there.

-Steve


Perhaps vsicurl_streaming that is also mentioned in the same blog is
something worth trying.


Yeah, I saw that. I may or may not be able to build trunk on this
machine depending on the dependencies it needs, as many of the
libraries are fairly dated.

But more importantly, I think this is a bug. I'm happy to write it up
but it would be nice to get some confirmation. At the moment, I'm reviewing
the source code to see if I can see a problem there.


Well, I rather think that there's a bug in your PHP script, because curl -r
0-16383 http://www.google.fr works even if the returned file is < 16383
bytes,

whereas:

curl -r 0-16383 http://example.com/ws/test-pg.php hangs

Or it is a bug in libcurl itself.

I haven't checked the RFCs, but all tests I have done before against Apache or
IIS servers show that asking for more than the file size works.

Thanks, I will try 1.10 beta and also look at the php script. It is 
possible that I'm not issuing all the headers correctly.


I'm trying to find an easy-to-read client-server dialog of what should be 
happening with regard to the range requests. RFC 2616 covers it but it 
is hard for me to piece together all the bits and pieces from this:


http://tools.ietf.org/html/rfc2616
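
The exchange described in RFC 2616 boils down to a small decision per request; a hedged Python sketch of one request/response step (illustrative only, not a complete implementation; open-ended ranges like 'bytes=0-' are not handled):

```python
# One Range request: 200 without a Range header, 206 with a satisfiable
# range (clamped to the file), 416 when the range starts past the end.

def range_response(range_header, file_size):
    if range_header is None:
        return 200, {"Content-Length": str(file_size)}
    start, end = (int(x) for x in range_header.split("=")[1].split("-"))
    if start >= file_size:
        return 416, {"Content-Range": "bytes */%d" % file_size}
    end = min(end, file_size - 1)
    return 206, {
        "Content-Range": "bytes %d-%d/%d" % (start, end, file_size),
        "Content-Length": str(end - start + 1),
    }

print(range_response("bytes=0-16383", 13053))
```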

I appreciate your quick response. I'll continue digging into this and 
let you know how it goes.


Thanks,
  -Steve

___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev


Re: [gdal-dev] Virtual Filesystem Question(s)

2013-02-09 Thread Stephen Woodbridge

On 2/9/2013 6:41 PM, Even Rouault wrote:

On Sunday, February 10, 2013 at 00:23:37, Stephen Woodbridge wrote:

On 2/9/2013 6:03 PM, Jukka Rahkonen wrote:

Stephen Woodbridge woodbri at swoodbridge.com writes:

Just a followup to this, I have downloaded gdal-1.9.2 and get the same
results. So no joy there.

-Steve


Perhaps vsicurl_streaming that is also mentioned in the same blog is
something worth trying.


Yeah, I saw that. I may or may not be able to build trunk on this
machine depending on the dependencies it needs, as many of the
libraries are fairly dated.

But more importantly, I think this is a bug. I'm happy to write it up
but it would be nice to get some confirmation. At the moment, I'm reviewing
the source code to see if I can see a problem there.


Well, I rather think that there's a bug in your PHP script, because curl -r
0-16383 http://www.google.fr works even if the returned file is < 16383
bytes,

whereas:

curl -r 0-16383 http://example.com/ws/test-pg.php hangs

Or it is a bug in libcurl itself.


OK, bug in my code. So using the above command now works. But ogrinfo 
alternately hangs and then runs with this error:


ERROR 1: Failed to open datasource 
`/vsicurl/http://example.com/ws/test-pg.php'.


In gdb, it appears to be hanging; if I ^C and bt, it is in poll():

(gdb) run -ro test-pg.vrt test
Starting program: /usr/local/bin/ogrinfo -ro test-pg.vrt test
[Thread debugging using libthread_db enabled]
warning: Lowest section in /usr/lib/libicudata.so.38 is .hash at 
0120

INFO: Open of `test-pg.vrt'
  using driver `VRT' successful.
[New Thread 0x7f35a95aa710 (LWP 13410)]
^C
Program received signal SIGINT, Interrupt.
[Switching to Thread 0x7f35a95aa710 (LWP 13410)]
0x7f35a32efb9f in poll () from /lib/libc.so.6
(gdb) bt
#0  0x7f35a32efb9f in poll () from /lib/libc.so.6
#1  0x7f35a51b7296 in ?? () from /usr/lib/libcurl.so.4
#2  0x7f35a51afbc1 in ?? () from /usr/lib/libcurl.so.4
#3  0x7f35a51afefd in ?? () from /usr/lib/libcurl.so.4
#4  0x7f35a519ea3b in ?? () from /usr/lib/libcurl.so.4
#5  0x7f35a51ac80b in ?? () from /usr/lib/libcurl.so.4
#6  0x7f35a8ca7f5c in VSICurlFilesystemHandler::GetFileList (
this=0x17b0310,
pszDirname=0x17b28e8 "/vsicurl/http://example.com/ws",
pbGotFileList=0x17b2850) at cpl_vsil_curl.cpp:2314
#7  0x7f35a8cab19b in VSICurlFilesystemHandler::ReadDir (this=0x17b0310,
pszDirname=<value optimized out>, pbGotFileList=0x7fff120d38f8)
at cpl_vsil_curl.cpp:2461
#8  0x7f35a8caa88b in VSICurlFilesystemHandler::Stat (this=0x17b0310,
pszFilename=0x17b2730 "/vsicurl/http://example.com/ws/test-pg.php",
pStatBuf=<value optimized out>, nFlags=3) at cpl_vsil_curl.cpp:2367
#9  0x7f35a8cb1c39 in VSIStatExL (
pszFilename=0x17b2730 "/vsicurl/http://example.com/ws/test-pg.php",
psStatBuf=0x7fff120d39b0, nFlags=3) at cpl_vsil.cpp:285
#10 0x7f35a8e73f44 in OGRShapeDataSource::Open (this=0x17b27a0,
pszNewName=0x17b2730 "/vsicurl/http://example.com/ws/test-pg.php",
bUpdate=0, bTestOpen=1, bForceSingleFileDataSource=0)
at ogrshapedatasource.cpp:109
#11 0x7f35a8e74e82 in OGRShapeDriver::Open (this=<value optimized out>,
pszFilename=0x17b2730 "/vsicurl/http://example.com/ws/test-pg.php",
bUpdate=0) at ogrshapedriver.cpp:70
#12 0x7f35a8e7272c in OGRSFDriverRegistrar::Open (
pszName=0x17b2730 "/vsicurl/http://example.com/ws/test-pg.php",
bUpdate=0, ppoDriver=0x0) at ogrsfdriverregistrar.cpp:226
#13 0x7f35a8e966f7 in OGRVRTLayer::FullInitialize (this=0x17b1530)
at ogrvrtlayer.cpp:322
#14 0x7f35a8e97450 in OGRVRTLayer::GetLayerDefn (this=0x17b1530)
at ogrvrtlayer.cpp:1864
#15 0x0040195d in ReportOnLayer (poLayer=0x7fff120d2f50,
pszWHERE=0x1 <Address 0x1 out of bounds>, poSpatialFilter=0x493c0)
at ogrinfo.cpp:360
#16 0x004023f6 in main (nArgc=<value optimized out>,
papszArgv=0x17b0ae0) at ogrinfo.cpp:316
(gdb) c
Continuing.

# after a very long wait 3-5 minutes it times out with the following

ERROR 1: Failed to open datasource 
`/vsicurl/http://example.com/ws/test-pg.php'.


Layer name: test
Geometry: 3D Point
Feature Count: 0
Layer SRS WKT:
GEOGCS["WGS 84",
    DATUM["WGS_1984",
        SPHEROID["WGS 84",6378137,298.257223563,
            AUTHORITY["EPSG","7030"]],
        TOWGS84[0,0,0,0,0,0,0],
        AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0,
        AUTHORITY["EPSG","8901"]],
    UNIT["degree",0.0174532925199433,
        AUTHORITY["EPSG","9108"]],
    AUTHORITY["EPSG","4326"]]

Program exited normally.

OK, got 1.10beta1 compiled. I'll give that a try. This might just be my 
very old version of curl, 7.18.2.


-Steve


I haven't checked the RFCs, but all tests I have done before against Apache or
IIS servers show that asking for more than the file size works.



Thank you for your suggestion,
-Steve

___
gdal-dev mailing list
gdal-dev@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/gdal-dev

Re: [gdal-dev] Virtual Filesystem Question(s)

2013-02-09 Thread Stephen Woodbridge

On 2/9/2013 8:24 PM, Stephen Woodbridge wrote:

On 2/9/2013 6:41 PM, Even Rouault wrote:

On Sunday, February 10, 2013 at 00:23:37, Stephen Woodbridge wrote:

On 2/9/2013 6:03 PM, Jukka Rahkonen wrote:

Stephen Woodbridge woodbri at swoodbridge.com writes:

Just a followup to this, I have downloaded gdal-1.9.2 and get the same
results. So no joy there.

-Steve


Perhaps vsicurl_streaming that is also mentioned in the same blog is
something worth trying.


Yeah, I saw that. I may or may not be able to build trunk on this
machine depending on the dependencies it needs, as many of the
libraries are fairly dated.

But more importantly, I think this is a bug. I'm happy to write it up
but it would be nice to get some confirmation. At the moment, I'm reviewing
the source code to see if I can see a problem there.


Well, I rather think that there's a bug in your PHP script, because
curl -r
0-16383 http://www.google.fr works even if the returned file is < 16383

whereas:

curl -r 0-16383 http://example.com/ws/test-pg.php hangs

Or it is a bug in libcurl itself.


OK, bug in my code. So using the above command now works. But ogrinfo
alternately hangs and then runs with this error:

ERROR 1: Failed to open datasource
`/vsicurl/http://example.com/ws/test-pg.php'.

In gdb, it appears to be hanging; if I ^C and bt, it is in poll():

(gdb) run -ro test-pg.vrt test
Starting program: /usr/local/bin/ogrinfo -ro test-pg.vrt test
[Thread debugging using libthread_db enabled]
warning: Lowest section in /usr/lib/libicudata.so.38 is .hash at
0120
INFO: Open of `test-pg.vrt'
   using driver `VRT' successful.
[New Thread 0x7f35a95aa710 (LWP 13410)]
^C
Program received signal SIGINT, Interrupt.
[Switching to Thread 0x7f35a95aa710 (LWP 13410)]
0x7f35a32efb9f in poll () from /lib/libc.so.6
(gdb) bt
#0  0x7f35a32efb9f in poll () from /lib/libc.so.6
#1  0x7f35a51b7296 in ?? () from /usr/lib/libcurl.so.4
#2  0x7f35a51afbc1 in ?? () from /usr/lib/libcurl.so.4
#3  0x7f35a51afefd in ?? () from /usr/lib/libcurl.so.4
#4  0x7f35a519ea3b in ?? () from /usr/lib/libcurl.so.4
#5  0x7f35a51ac80b in ?? () from /usr/lib/libcurl.so.4
#6  0x7f35a8ca7f5c in VSICurlFilesystemHandler::GetFileList (
 this=0x17b0310,
 pszDirname=0x17b28e8 "/vsicurl/http://example.com/ws",
 pbGotFileList=0x17b2850) at cpl_vsil_curl.cpp:2314
#7  0x7f35a8cab19b in VSICurlFilesystemHandler::ReadDir
(this=0x17b0310,
 pszDirname=<value optimized out>, pbGotFileList=0x7fff120d38f8)
 at cpl_vsil_curl.cpp:2461
#8  0x7f35a8caa88b in VSICurlFilesystemHandler::Stat (this=0x17b0310,
 pszFilename=0x17b2730 "/vsicurl/http://example.com/ws/test-pg.php",
 pStatBuf=<value optimized out>, nFlags=3) at cpl_vsil_curl.cpp:2367
#9  0x7f35a8cb1c39 in VSIStatExL (
 pszFilename=0x17b2730 "/vsicurl/http://example.com/ws/test-pg.php",
 psStatBuf=0x7fff120d39b0, nFlags=3) at cpl_vsil.cpp:285
#10 0x7f35a8e73f44 in OGRShapeDataSource::Open (this=0x17b27a0,
 pszNewName=0x17b2730 "/vsicurl/http://example.com/ws/test-pg.php",
 bUpdate=0, bTestOpen=1, bForceSingleFileDataSource=0)
 at ogrshapedatasource.cpp:109
#11 0x7f35a8e74e82 in OGRShapeDriver::Open (this=<value optimized out>,
 pszFilename=0x17b2730 "/vsicurl/http://example.com/ws/test-pg.php",
 bUpdate=0) at ogrshapedriver.cpp:70
#12 0x7f35a8e7272c in OGRSFDriverRegistrar::Open (
 pszName=0x17b2730 "/vsicurl/http://example.com/ws/test-pg.php",
 bUpdate=0, ppoDriver=0x0) at ogrsfdriverregistrar.cpp:226
#13 0x7f35a8e966f7 in OGRVRTLayer::FullInitialize (this=0x17b1530)
 at ogrvrtlayer.cpp:322
#14 0x7f35a8e97450 in OGRVRTLayer::GetLayerDefn (this=0x17b1530)
 at ogrvrtlayer.cpp:1864
#15 0x0040195d in ReportOnLayer (poLayer=0x7fff120d2f50,
 pszWHERE=0x1 <Address 0x1 out of bounds>, poSpatialFilter=0x493c0)
 at ogrinfo.cpp:360
#16 0x004023f6 in main (nArgc=<value optimized out>,
 papszArgv=0x17b0ae0) at ogrinfo.cpp:316
(gdb) c
Continuing.

# after a very long wait 3-5 minutes it times out with the following

ERROR 1: Failed to open datasource
`/vsicurl/http://example.com/ws/test-pg.php'.

Layer name: test
Geometry: 3D Point
Feature Count: 0
Layer SRS WKT:
GEOGCS["WGS 84",
    DATUM["WGS_1984",
        SPHEROID["WGS 84",6378137,298.257223563,
            AUTHORITY["EPSG","7030"]],
        TOWGS84[0,0,0,0,0,0,0],
        AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich",0,
        AUTHORITY["EPSG","8901"]],
    UNIT["degree",0.0174532925199433,
        AUTHORITY["EPSG","9108"]],
    AUTHORITY["EPSG","4326"]]

Program exited normally.

OK, got 1.10beta1 compiled. I'll give that a try. This might just be my
very old version of curl, 7.18.2.


Even,

Running with 1.10beta1 and the (hopefully fixed) test-pg.php

I am getting similar behavior, so maybe my curl is just broken.

woodbri@mappy:~/work/mongo/ogr$ ogrinfo -ro test-pg.vrt test
INFO: Open of `test-pg.vrt'
  using driver `VRT' successful.
ERROR 1: Failed to open
