I wrote the Windows app.
I export all the data to plain ASCII text with tab-delimited fields
and then run the file through a UTF-8 converter (convertcp_v8.3_x86).
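
As a sanity check after a test import, a query along these lines should show
whether any control characters survived the conversion (using the master table
and one of its columns as an example):

SELECT count(*) FROM master WHERE field01 ~ '[[:cntrl:]]';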

I will try the entire process on a Xeon E5-1620 and let it run overnight 
to see what happens. But the current i9 machine is only 4 years old and 
should have no issues.

On Sunday, April 14th, 2024 at 8:50 PM, Adrian Klaver 
<adrian.kla...@aklaver.com> wrote:

> 
> 
> On 4/14/24 14:50, jack wrote:
> 
> Reply to list also
> Ccing list
> 
> > Hello,
> > I am not sure what "locale" means.
> 
> 
> Go to the Settings app for whatever version of Windows you are on and
> search for locale.
> 
> > The Windows app is an inhouse application which uses Actian-Zen SQL.
> > The data is exported to simple ASCII in a tab delimited format similar to 
> > CSV.
> 
> 
> And you know it is ASCII for a fact?
> 
> > Those files are then imported into the PostgreSQL table using COPY.
> > Importing the data is not an issue.
> > I am able to load all the data without any problems, even into 1 table 
> > which ends up with about 1.2 billion records.
> > But when I try to update the data in that table I get many errors, 
> > essentially crashes.
> 
> 
> Repeating what has been asked and answered is not really going anywhere.
> 
> > There may be some control characters (garbage) in the data but that should 
> > not crash postgresql, especially if it can import the data without issues.
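> > 
> > If control characters do turn out to be the problem, I could strip them out
> > in SQL before running the updates, along these lines (using the master table
> > and one of its columns as an example):
> > 
> > UPDATE master SET field01 = regexp_replace(field01, '[[:cntrl:]]', '', 'g')
> > WHERE field01 ~ '[[:cntrl:]]';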
> 
> 
> Unless it does. That is the point of the questions, getting to what is
> actually causing the issue. Until the problem can be boiled down to a
> reproducible test case there really is not much hope of anything more
> than the 'yes you have a problem' answer. And there is a difference
> between dumping data into a table and then doing an UPDATE where the
> data strings are manipulated by functions.
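> 
> A minimal case can be very small. Something like this, with invented names
> and an invented bad value, would be enough if it reproduces the failure:
> 
> CREATE TABLE repro (val text);
> INSERT INTO repro VALUES (E'abc\x01def');
> UPDATE repro SET val = upper(trim(val));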
> 
> > Anyway, I hope I answered your questions.
> > Thanks for your help.
> > 
> > On Sunday, April 14th, 2024 at 4:28 PM, Adrian Klaver 
> > adrian.kla...@aklaver.com wrote:
> > 
> > > On 4/14/24 13:18, jack wrote:
> > > 
> > > > The CSV files are being produced by another system, a Windows app on a
> > > > Windows machine. I then copy them to a USB key and copy them onto the
> > > > ubuntu machine. The data is then imported via the COPY command.
> > > 
> > > The app?
> > > 
> > > The locale in use on the Windows machine?
> > > 
> > > The locale in use in the database?
> > > 
> > > > COPY master (field01,field02..fieldX) FROM '/data/file.text' DELIMITER 
> > > > E'\t'
> > > > The fields are tab delimited.
> > > > 
> > > > But importing the data works. I can get all the data into a single table
> > > > without any problems. The issue is only when I start to update the
> > > > single table. And that is why I started using smaller temporary tables
> > > > for each CSV file, to do the updates in the smaller tables before I move
> > > > them all to a single large table.
> > > 
> > > The import is just dumping the data in; my suspicion is that the problem
> > > is related to using string functions on the data.
> > > 
> > > > After all the data is loaded and updated, I run php programs on the
> > > > large table to generate reports. All of which works well EXCEPT for
> > > > performing the updates on the data. And I do not want to use perl or any
> > > > outside tool. I want it all done in SQL because I am required to document
> > > > all my steps so that someone else can take over, so everything needs to
> > > > be as simple as possible.
> > > 
> > > --
> > > Adrian Klaver
> > > adrian.kla...@aklaver.com
> 
> 
> --
> Adrian Klaver
> adrian.kla...@aklaver.com

