Re: [JPP-Devel] OJ spoils shapefile if schema is edited

2013-03-01 Thread Rahkonen Jukka
Hi,

This one looks bad. I have tested only with r3277, not with any older
versions. I can reproduce it as follows:

- Start OJ
- Create a layer, add attribute attr of type DOUBLE
- Make a point, set attr=1.0
- Save as a shapefile
- Delete layer, read in the saved shapefile, everything OK
- Edit schema, change attr into type INTEGER
- Do Save selected dataset
- Shapefile is now corrupted

OJ cannot open the saved shapefile. The error is:

java.io.EOFException
    at java.io.DataInputStream.readFully(Unknown Source)
    at java.io.DataInputStream.readFully(Unknown Source)
    at com.vividsolutions.jump.io.EndianDataInputStream.readByteLEnum(EndianDataInputStream.java:75)
    at org.geotools.dbffile.DbfFile.GetDbfRec(DbfFile.java:230)
    at com.vividsolutions.jump.io.ShapefileReader.read(ShapefileReader.java:179)
    at com.vividsolutions.jump.io.datasource.DelegatingCompressedFileHandler.read(DelegatingCompressedFileHandler.java:80)
    at com.vividsolutions.jump.io.datasource.ReaderWriterFileDataSource$1.executeQuery(ReaderWriterFileDataSource.java:61)
    at org.openjump.core.ui.io.file.DataSourceFileLayerLoader.open(DataSourceFileLayerLoader.java:107)
    at org.openjump.core.ui.plugin.file.open.OpenFileWizard.run(OpenFileWizard.java:131)
    at org.openjump.core.ui.plugin.AbstractWizardPlugin.run(AbstractWizardPlugin.java:73)
    at com.vividsolutions.jump.workbench.ui.task.TaskMonitorManager$TaskWrapper.run(TaskMonitorManager.java:152)
    at java.lang.Thread.run(Unknown Source)

GDAL cannot open this shapefile either, which suggests that something is
wrong with the .dbf file. From ogrinfo:
Layer name: spoil
Geometry: Point
Feature Count: 1
Extent: (280.00, 127.00) - (280.00, 127.00)
Layer SRS WKT:
(unknown)
attr: Real (33.16)
ERROR 1: fread(34) failed on DBF file.
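
The numbers are consistent with a record-length mismatch between the .dbf
header and the data: the header still advertises the old DOUBLE field
(N 33.16), while the records themselves seem to have been written at the new
INTEGER width. A toy sketch of that arithmetic (my reading of the evidence
above, not OpenJUMP code):

// Toy arithmetic only; the widths are taken from the ogrinfo output above.
public class DbfRecordMismatch {
    public static void main(String[] args) {
        int deletionFlag = 1;   // every dBASE record starts with one flag byte
        int headerWidth = 33;   // header still says N 33.16 (the old DOUBLE field)
        int writtenWidth = 11;  // data apparently written as N 11.0 (INTEGER)

        // A reader trusts the header: 1 + 33 = 34 bytes per record,
        // which matches GDAL's "fread(34) failed".
        System.out.println("reader expects " + (deletionFlag + headerWidth) + " bytes/record");
        // If the records were actually written at the narrower width, the file
        // is too short and reading runs into the EOFException shown above.
        System.out.println("file may hold " + (deletionFlag + writtenWidth) + " bytes/record");
    }
}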

-Jukka Rahkonen-



Re: [JPP-Devel] OJ spoils shapefile if schema is edited

2013-03-01 Thread Michaël Michaud
Hi Jukka,

Thanks, I can reproduce it. I'll put it on my priority list.

Michaël


 Hi,

 I have been trying to mail this to the dev list since yesterday, but there
 seem to be some problems. I spoiled one dataset this way, but fortunately I
 found a copy and only lost the time I had spent on edits.

 -Jukka-

 -Original message-
 From: Rahkonen Jukka
 Sent: 1 March 2013 7:51
 To: 'OpenJump develop and use (jump-pilot-devel@lists.sourceforge.net)'
 Subject: RE: OJ spoils shapefile if schema is edited


Re: [JPP-Devel] OJ spoils shapefile if schema is edited

2013-03-01 Thread Larry Becker
A more general solution than my last post is:

if (columnType == AttributeType.INTEGER) {
    fields[f] = new DbfFieldDef(columnName, 'N', 11, 0);  //LDB: previously 16
    DbfFieldDef fromFile =
        overrideWithExistingCompatibleDbfFieldDef(fields[f], fieldMap);
    // keep the definition from the existing file only if it has no
    // decimals, i.e. it is still compatible with INTEGER
    if (fromFile.fieldnumdec == 0)
        fields[f] = fromFile;
    f++;
} else if (columnType == AttributeType.DOUBLE) {
    fields[f] = new DbfFieldDef(columnName, 'N', 33, 16);
    DbfFieldDef fromFile =
        overrideWithExistingCompatibleDbfFieldDef(fields[f], fieldMap);
    // keep the definition from the existing file only if it has
    // decimals, i.e. it is still compatible with DOUBLE
    if (fromFile.fieldnumdec > 0)
        fields[f] = fromFile;
    f++;

Larry


Re: [JPP-Devel] OJ spoils shapefile if schema is edited

2013-03-01 Thread Larry Becker
I can reproduce it in SkyJUMP.

I think this is the result of using
overrideWithExistingCompatibleDbfFieldDef in ShapefileWriter. My rationale
for introducing this method a few years ago (July 2010) was that the size of
the numbers entered during an edit session should not automatically lead to
a reduction of precision in the shapefile, which was the case before the
method existed. I had shapefiles for which there were published data
standards, but SkyJUMP would change the number of digits each time I saved
the file. The change achieved its goal, but it may not work correctly for
this particular case because of the type change.

I agree with Jukka.  It may be an unusual case, but the reliability of
shapefile IO is crucial to the JUMP family.  This must be fixed.  I will
work on it too.

BTW, the result is the same if you Save Dataset As instead of Save Selected
Datasets.

Larry



Re: [JPP-Devel] OJ spoils shapefile if schema is edited

2013-03-01 Thread Larry Becker
I have a fix that works in SkyJUMP and should work in OpenJUMP. It isn't
elegant, but it works by avoiding the call to
overrideWithExistingCompatibleDbfFieldDef when a type change is detected.
There may be a more direct solution, but I didn't find one.

The patch file for SkyJUMP is:

### Eclipse Workspace Patch 1.0
#P SkyJumpSVN
Index: com/vividsolutions/jump/io/ShapefileWriter.java
===================================================================
--- com/vividsolutions/jump/io/ShapefileWriter.java    (revision 2)
+++ com/vividsolutions/jump/io/ShapefileWriter.java    (working copy)
@@ -407,11 +407,13 @@
 
         if (columnType == AttributeType.INTEGER) {
             fields[f] = new DbfFieldDef(columnName, 'N', 11, 0);  //LDB: previously 16
-            fields[f] = overrideWithExistingCompatibleDbfFieldDef(fields[f], fieldMap);
+            if ((fieldMap != null) && !fieldMap.toString().endsWith("N 33.16}"))
+                fields[f] = overrideWithExistingCompatibleDbfFieldDef(fields[f], fieldMap);
             f++;
         } else if (columnType == AttributeType.DOUBLE) {
             fields[f] = new DbfFieldDef(columnName, 'N', 33, 16);
-            fields[f] = overrideWithExistingCompatibleDbfFieldDef(fields[f], fieldMap);
+            if ((fieldMap != null) && !fieldMap.toString().endsWith("N 11.0}"))
+                fields[f] = overrideWithExistingCompatibleDbfFieldDef(fields[f], fieldMap);
             f++;
         } else if (columnType == AttributeType.STRING) {
             int maxlength = findMaxStringLength(featureCollection, t);
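
The guard leans on the string form of fieldMap. A toy illustration of the
idea (assuming an old field definition prints like "attr N 33.16" inside the
map; the real DbfFieldDef format may differ):

import java.util.LinkedHashMap;
import java.util.Map;

// Toy version of the endsWith() check in the patch above.
public class FieldMapCheck {
    public static void main(String[] args) {
        Map<String, String> fieldMap = new LinkedHashMap<String, String>();
        fieldMap.put("attr", "attr N 33.16");  // assumed print format of the old field def
        // The map prints as "{attr=attr N 33.16}", so the old DOUBLE format
        // shows up at the very end of the map's string form; note the check
        // can only ever match the map's last entry.
        System.out.println(fieldMap.toString().endsWith("N 33.16}"));  // prints true
    }
}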

Larry


Re: [JPP-Devel] OJ spoils shapefile if schema is edited

2013-03-01 Thread Michaël Michaud

Hi Larry,

Thanks a lot for your patch.
It took me some time to understand how it works.

I noticed that overrideWithExistingCompatibleDbfFieldDef is used for
AttributeType.INTEGER and AttributeType.DOUBLE, but for DOUBLE I could not
find a case where the overriding method can change the dbf field. It seems
to me that currently the original number of decimals of a double will never
be preserved (I mean in the OpenJUMP code). What do you think?
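
To make the question concrete, here is roughly how I read the helper (my own
sketch pieced together from the patch context, not the actual OpenJUMP
source):

// A sketch of my mental model of the helper, not the real code: if the .dbf
// already contains a field with the same name and the same type char, keep
// the on-file definition so that a save never silently changes a published
// field format.
DbfFieldDef overrideWithExistingCompatibleDbfFieldDef(DbfFieldDef proposed, Map fieldMap) {
    if (fieldMap == null) {
        return proposed;
    }
    DbfFieldDef existing = (DbfFieldDef) fieldMap.get(proposed.fieldname.toString().trim());
    if (existing != null && existing.fieldtype == proposed.fieldtype) {
        return existing;  // width and decimal count come from the existing file
    }
    return proposed;
}

My question is whether, for DOUBLE, the compatible branch is ever actually
taken.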

Michaël


Re: [JPP-Devel] OJ spoils shapefile if schema is edited

2013-03-01 Thread Larry Becker
Hi Michaël,

I'm not sure what you are seeing, but I tested both Sky and OJ before the
mod, and both preserved an N 16.5 format when saving selected datasets.

regards,

Larry


Re: [JPP-Devel] OJ spoils shapefile if schema is edited

2013-03-01 Thread Michaël Michaud

Hi Larry,

My mistake: I thought that integers were encoded as 'N' and doubles as 'F',
but 'F' fields are only read by OpenJUMP, never written.
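
For the record, the dBASE field-type letters in play here (the standard
codes; the read/write notes are only what this thread establishes for
OpenJUMP):

// Standard dBASE field-type letters:
public class DbfTypeLetters {
    public static final char CHARACTER = 'C';
    public static final char NUMERIC   = 'N';  // OJ writes INTEGER (N 11.0) and DOUBLE (N 33.16) as 'N'
    public static final char FLOAT     = 'F';  // read by OpenJUMP's DbfFile, never written by it
    public static final char LOGICAL   = 'L';
    public static final char DATE      = 'D';
}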

Sorry for the noise,

Michaël

