For the past few months we have been experimenting with MySQL as the database behind Apache OFBiz. However, we are not sure we can live with the performance.

We have a large dataset that we wanted to import into PostgreSQL, but the import fails no matter what we do. We have tried every variation we could think of to get this file to load.

Admin_=# COPY DATA_IMPORT_CUSTOMER FROM '/tmp/sold.pg' delimiter '|' null '';
ERROR:  missing data for column "processed_timestamp"
CONTEXT: COPY data_import_customer, line 1: "(xxx) xxx-xxxx||Mary|Smith|76 Crest Street||Jersey City|NJ|07302-0000|New Jersey|USA||XXX|XXX-1..."

The data looks like this:

(xxx) xxx-xxxx||Mary|Smith|76 Crest Street||Jersey City|NJ|07302-0000|New J
ersey|USA||XXX|XXX-XXXX|||||||40401234567890|0604|.00|0.00|||||||||||||

The problem is that I need nullable columns to load as NULL when the field is empty, and I can't get that to work.
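COPY reports the first column it could not fill, so one quick sanity check is to count the fields on each physical line of the dump: a record that got wrapped across two lines shows up as short lines. Here is a minimal Python sketch of that check (the path and the 40-column count come from the session above; the script itself is mine, not part of any existing tooling):

```python
# Count pipe-delimited fields on each physical line of the dump.
# data_import_customer has 40 columns, so every line should contain
# exactly 39 '|' delimiters; anything shorter is what COPY reports
# as "missing data for column ...".

EXPECTED_FIELDS = 40  # columns in data_import_customer

def short_lines(path, expected=EXPECTED_FIELDS):
    """Yield (line_number, field_count) for every line that is short or long."""
    with open(path) as f:
        for lineno, line in enumerate(f, start=1):
            fields = line.rstrip("\n").count("|") + 1
            if fields != expected:
                yield lineno, fields

# Usage against the file from the session above:
#   for lineno, n in short_lines("/tmp/sold.pg"):
#       print(f"line {lineno}: {n} fields, expected {EXPECTED_FIELDS}")
```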

                  Table "public.data_import_customer"
             Column              |           Type           | Modifiers
---------------------------------+--------------------------+-----------
 customer_id                     | character varying(20)    | not null
 company_name                    | character varying(100)   |
 first_name                      | character varying(100)   |
 last_name                       | character varying(100)   |
 address1                        | character varying(255)   |
 address2                        | character varying(255)   |
 city                            | character varying(100)   |
 state_province_geo_id           | character varying(20)    |
 postal_code                     | character varying(60)    |
 state_province_geo_name         | character varying(20)    |
 country_geo_id                  | character varying(20)    |
 primary_phone_country_code      | character varying(10)    |
 primary_phone_area_code         | character varying(10)    |
 primary_phone_number            | character varying(60)    |
 secondary_phone_country_code    | character varying(10)    |
 secondary_phone_area_code       | character varying(10)    |
 secondary_phone_number          | character varying(60)    |
 fax_country_code                | character varying(10)    |
 fax_area_code                   | character varying(10)    |
 fax_number                      | character varying(60)    |
 credit_card_number              | character varying(60)    |
 credit_card_exp_date            | character varying(60)    |
 outstanding_balance             | numeric(18,2)            |
 billing_account_limit           | numeric(18,2)            |
 ship_to_company_name            | character varying(100)   |
 ship_to_first_name              | character varying(100)   |
 ship_to_last_name               | character varying(100)   |
 ship_to_address1                | character varying(255)   |
 ship_to_address2                | character varying(255)   |
 ship_to_city                    | character varying(100)   |
 ship_to_state_province_geo_id   | character varying(20)    |
 ship_to_postal_code             | character varying(60)    |
 ship_to_state_province_geo_name | character varying(20)    |
 ship_to_country_geo_id          | character varying(20)    |
 processed_timestamp             | timestamp with time zone |
 primary_party_id                | character varying(20)    |
 company_party_id                | character varying(20)    |
 person_party_id                 | character varying(20)    |
 last_updated_stamp              | timestamp with time zone |
 last_updated_tx_stamp           | timestamp with time zone |
 created_stamp                   | timestamp with time zone |
 created_tx_stamp                | timestamp with time zone |
Indexes:
    "pk_data_import_customer" PRIMARY KEY, btree (customer_id)
    "dt_impt_cstr_txcrs" btree (created_tx_stamp)
    "dt_impt_cstr_txstp" btree (last_updated_tx_stamp)


Thanks!
--
Walter
