Re: [GENERAL] COPY: row is too big

2017-05-27 Thread doganmeh
Yes, csvkit is what I decided to go with. Thank you all! -- Sent from the PostgreSQL - general mailing list archive at Nabble.com.

Re: [GENERAL] COPY: row is too big

2017-05-27 Thread doganmeh
Yes, the delimiter was indeed ",". I fixed my original post; seems I carelessly copy/pasted from Excel.

Re: [GENERAL] COPY: row is too big

2017-05-26 Thread Tom Lane
doganmeh writes: > I tried varchar(12) also, nothing changed. My question is 1) I have > 672x12=8,064 characters in the first row (which are actually the headers), > why would it complain that it is 8760. No, you have 672*13, because each varchar value will require a length byte as well.
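The arithmetic behind Tom Lane's point, spelled out (the quoted reply is truncated; the 24-byte heap tuple header used to account for the remainder is an editor's assumption):

```sql
-- 672 varchar(12) values: 12 data bytes + 1 varlena length byte each,
-- plus a heap tuple header of roughly 24 bytes:
SELECT 672 * (12 + 1) + 24 AS estimated_row_bytes;
-- 8760, which exceeds the ~8160-byte per-page row limit
```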

Re: [GENERAL] COPY: row is too big

2017-05-26 Thread Andreas Kretschmer
On 26.05.2017 at 14:07, doganmeh wrote: I tried varchar(12) also, nothing changed. My question is 1) I have 672x12=8,064 characters in the first row (which are actually the headers), why would it complain that it is 8760? I am assuming here that type `text` occupies 1 byte per character.

Re: [GENERAL] COPY: row is too big

2017-05-26 Thread Charles Clavadetscher
Hello > -Original Message- > From: pgsql-general-ow...@postgresql.org > [mailto:pgsql-general-ow...@postgresql.org] On Behalf Of doganmeh > Sent: Friday, 26 May 2017 14:08 > To: pgsql-general@postgresql.org > Subject: Re: [GENERAL] COPY: row is too big >

Re: [GENERAL] COPY: row is too big

2017-05-26 Thread Adrian Klaver
On 05/26/2017 05:07 AM, doganmeh wrote: I am piggy-backing in this thread because I have the same issue as well. I need to import a csv file that is 672 columns long and each column consists of 12 alpha-numeric characters. Such as: SA03ARE1015DSA03ARE1S15NSB03ARE1015D ... 356412

Re: [GENERAL] COPY: row is too big

2017-05-26 Thread doganmeh
BTW, we have PostgreSQL 9.5 running on Ubuntu.

Re: [GENERAL] COPY: row is too big

2017-05-26 Thread doganmeh
I am piggy-backing on this thread because I have the same issue. I need to import a csv file that is 672 columns wide, and each column consists of 12 alphanumeric characters, such as: SA03ARE1015DSA03ARE1S15NSB03ARE1015D ... 356412 275812 43106 ... I am aware

Re: [GENERAL] COPY: row is too big

2017-01-05 Thread Rob Sargent
On 01/05/2017 11:46 AM, Adrian Klaver wrote: On 01/05/2017 08:31 AM, Rob Sargent wrote: On 01/05/2017 05:44 AM, vod vos wrote: I finally figured it out as follows: 1. modified the corresponding data type of the columns to the csv file 2. if null values existed, defined the data type to

Re: [GENERAL] COPY: row is too big

2017-01-05 Thread Adrian Klaver
On 01/05/2017 08:31 AM, Rob Sargent wrote: On 01/05/2017 05:44 AM, vod vos wrote: I finally figured it out as follows: 1. modified the corresponding data type of the columns to the csv file 2. if null values existed, defined the data type to varchar. The null values cause problem too. so

Re: [GENERAL] COPY: row is too big

2017-01-05 Thread Rob Sargent
On 01/05/2017 05:44 AM, vod vos wrote: I finally figured it out as follows: 1. modified the corresponding data type of the columns to the csv file 2. if null values existed, defined the data type to varchar. The null values cause problems too. So 1100 columns work well now. This problem

Re: [GENERAL] COPY: row is too big

2017-01-05 Thread Adrian Klaver
On 01/05/2017 04:44 AM, vod vos wrote: I finally figured it out as follows: 1. modified the corresponding data type of the columns to the csv file 2. if null values existed, defined the data type to varchar. The null values cause problem too. Did you change the NULLs to something else? As

Re: [GENERAL] COPY: row is too big

2017-01-05 Thread Pavel Stehule
2017-01-05 13:44 GMT+01:00 vod vos : > I finally figured it out as follows: > > 1. modified the corresponding data type of the columns to the csv file > > 2. if null values existed, defined the data type to varchar. The null > values cause problem too. > int, float, double can
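Pavel's point about type cost can be checked directly with `pg_column_size()` (a generic illustration added by the editor, not from the thread):

```sql
-- Fixed-width types carry less per-value overhead than varlena types:
SELECT pg_column_size(1.5::numeric) AS numeric_bytes,  -- varlena header + digits
       pg_column_size(1.5::float8)  AS float8_bytes,   -- always 8
       pg_column_size(1::int)       AS int_bytes;      -- always 4
```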

Re: [GENERAL] COPY: row is too big

2017-01-05 Thread vod vos
I finally figured it out as follows: 1. modified the corresponding data types of the columns to match the csv file 2. if null values existed, defined the data type as varchar. The null values cause problems too. So 1100 columns work well now. This problem wasted me three days. I have lots of

Re: [GENERAL] COPY: row is too big

2017-01-04 Thread Adrian Klaver
On 01/04/2017 08:32 AM, Steve Crawford wrote: ... Numeric is expensive type - try to use float instead, maybe double. If I am following the OP correctly the table itself has all the columns declared as varchar. The data in the CSV file is a mix of text, date and numeric,

Re: [GENERAL] COPY: row is too big

2017-01-04 Thread Steve Crawford
... > Numeric is expensive type - try to use float instead, maybe double. >> > > If I am following the OP correctly the table itself has all the columns > declared as varchar. The data in the CSV file is a mix of text, date and > numeric, presumably cast to text on entry into the table. > But a

Re: [GENERAL] COPY: row is too big

2017-01-04 Thread Adrian Klaver
On 01/04/2017 08:00 AM, rob stone wrote: Hello, On Wed, 2017-01-04 at 07:11 -0800, Adrian Klaver wrote: On 01/04/2017 06:54 AM, Pavel Stehule wrote: Hi 2017-01-04 14:00 GMT+01:00 vod vos: Now I am confused about I can create 1100 columns

Re: [GENERAL] COPY: row is too big

2017-01-04 Thread Peter J. Holzer
On 2017-01-04 06:53:31 -0800, Adrian Klaver wrote: > On 01/04/2017 05:00 AM, vod vos wrote: > >Now I am confused about I can create 1100 columns in a table in > >postgresql, but I can't copy 1100 values into the table. And I really > > As pointed out previously: > >

Re: [GENERAL] COPY: row is too big

2017-01-04 Thread rob stone
Hello, On Wed, 2017-01-04 at 07:11 -0800, Adrian Klaver wrote: > On 01/04/2017 06:54 AM, Pavel Stehule wrote: > > Hi > > 2017-01-04 14:00 GMT+01:00 vod vos: > > Now I am confused about I can create 1100 columns in a table in >

Re: [GENERAL] COPY: row is too big

2017-01-04 Thread Pavel Stehule
2017-01-04 16:11 GMT+01:00 Adrian Klaver: > On 01/04/2017 06:54 AM, Pavel Stehule wrote: >> Hi >> >> 2017-01-04 14:00 GMT+01:00 vod vos: >> >> Now I am confused about I can create 1100 columns in a table in

Re: [GENERAL] COPY: row is too big

2017-01-04 Thread vod vos
OK, maybe the final solution is to split it in half. On Wednesday, 04 January 2017 06:53:31 -0800 Adrian Klaver adrian.kla...@aklaver.com wrote On 01/04/2017 05:00 AM, vod vos wrote: Now I am confused about I can create 1100 columns in a table in postgresql, but I can't copy 1100

Re: [GENERAL] COPY: row is too big

2017-01-04 Thread Adrian Klaver
On 01/04/2017 06:54 AM, Pavel Stehule wrote: Hi 2017-01-04 14:00 GMT+01:00 vod vos: Now I am confused about I can create 1100 columns in a table in postgresql, but I can't copy 1100 values into the table. And I really dont want to

Re: [GENERAL] COPY: row is too big

2017-01-04 Thread Pavel Stehule
Hi 2017-01-04 14:00 GMT+01:00 vod vos : > Now I am confused about I can create 1100 columns in a table in > postgresql, but I can't copy 1100 values into the table. And I really dont > want to split the csv file to pieces to avoid mistakes after this action. > The PostgreSQL

Re: [GENERAL] COPY: row is too big

2017-01-04 Thread Adrian Klaver
On 01/04/2017 05:00 AM, vod vos wrote: Now I am confused about I can create 1100 columns in a table in postgresql, but I can't copy 1100 values into the table. And I really As pointed out previously: https://www.postgresql.org/about/ Maximum Columns per Table 250 - 1600 depending on

Re: [GENERAL] COPY: row is too big

2017-01-04 Thread vod vos
Now I am confused: I can create 1100 columns in a table in PostgreSQL, but I can't copy 1100 values into the table. And I really don't want to split the csv file into pieces, to avoid mistakes after this action. I created a table with 1100 columns with data type varchar, and hope the COPY

Re: [GENERAL] COPY: row is too big

2017-01-03 Thread John McKown
On Mon, Jan 2, 2017 at 2:57 PM, Rob Sargent wrote: > Perhaps this is your opportunity to correct someone else's mistake. You > need to show the table definition to convince us that it cannot be > improved. That it may be hard work really doesn't mean it's not the right >

Re: [GENERAL] COPY: row is too big

2017-01-02 Thread Rob Sargent
> On Jan 2, 2017, at 10:13 AM, Adrian Klaver wrote: > >> On 01/02/2017 09:03 AM, vod vos wrote: >> You know, the csv file was exported from other database of a machine, so >> I really dont want to break it for it is a hard work. Every csv file >> contains headers and

Re: [GENERAL] COPY: row is too big

2017-01-02 Thread Adrian Klaver
On 01/02/2017 09:03 AM, vod vos wrote: > You know, the csv file was exported from other database of a machine, so > I really dont want to break it for it is a hard work. Every csv file > contains headers and values. If I redesign the table, then I have to cut > all the csv files into pieces one by

Re: [GENERAL] COPY: row is too big

2017-01-02 Thread vod vos
You know, the csv file was exported from another database on another machine, so I really don't want to break it, for it is hard work. Every csv file contains headers and values. If I redesign the table, then I have to cut all the csv files into pieces one by one. On Monday, 02 January 2017 08:21:29

Re: [GENERAL] COPY: row is too big

2017-01-02 Thread Tom Lane
vod vos writes: > When I copy data from csv file, a very long values for many columns (about > 1100 columns). The errors appears: > ERROR: row is too big: size 11808, maximum size 8160 You need to rethink your table schema so you have fewer columns. Perhaps you can combine
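Tom Lane's suggestion is cut off here; one common way to combine many same-typed columns is a single array column, which PostgreSQL can also compress or move out of line via TOAST as one wide value. A hypothetical sketch (table and column names are the editor's, not the poster's actual schema):

```sql
-- Instead of 1100 individual varchar columns:
CREATE TABLE rius_combined (
    id   serial PRIMARY KEY,
    vals text[]   -- all values for one source row, as a single array
);
-- Individual values are then addressed by index (arrays are 1-based):
SELECT vals[42] FROM rius_combined;
```

Because the array is one column, it can be TOASTed, sidestepping the in-page row-size limit that 1100 separate small columns run into.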

Re: [GENERAL] COPY: row is too big

2017-01-02 Thread vod vos
Most of the data types are text or varchar, and I use: COPY rius FROM "/var/www/test/aa.csv" WITH DELIMITER ';' ; And some of the values in the csv file contain nulls; do these null values matter? Thanks. On Monday, 02 January 2017 03:11:14 -0800 vod vos vod...@zoho.com wrote
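A side note on the command quoted above: double quotes in PostgreSQL delimit identifiers, so the file path must go in single quotes. A corrected sketch (the path and delimiter are the poster's; the CSV and NULL options are editorial guesses at what the file needs):

```sql
COPY rius FROM '/var/www/test/aa.csv'
    WITH (FORMAT csv, DELIMITER ';', NULL '', HEADER true);
-- NULL '' maps empty fields to SQL NULL; HEADER true skips the header
-- row the poster says each file contains.
```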

Re: [GENERAL] COPY: row is too big

2017-01-02 Thread Adrian Klaver
On 01/02/2017 03:11 AM, vod vos wrote: Hi everyone, My PostgreSQL is 9.6.1. When I copy data from csv file, a very long values for many columns (about 1100 columns). The errors appears: My guess is you are tripping this: https://www.postgresql.org/about/ Maximum Columns per Table 250 -

Re: [GENERAL] COPY: row is too big

2017-01-02 Thread John McKown
On Mon, Jan 2, 2017 at 5:11 AM, vod vos wrote: > Hi everyone, > > My PostgreSQL is 9.6.1. > > When I copy data from csv file, a very long values for many columns (about > 1100 columns). The errors appears: > > > ERROR: row is too big: size 11808, maximum size 8160 CONTEXT: > >

[GENERAL] COPY: row is too big

2017-01-02 Thread vod vos
Hi everyone, My PostgreSQL version is 9.6.1. When I copy data from a csv file with very long rows (about 1100 columns), this error appears: ERROR: row is too big: size 11808, maximum size 8160 CONTEXT: COPY rius, line 2. rius is the table. I have searched the mailing
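For context on the numbers in the error: a heap row must fit within a single page (8192 bytes by default, minus page and tuple headers, hence the 8160 cap); individual oversized values can be moved out of line by TOAST, but fixed per-column overhead across ~1100 columns cannot. The page size can be confirmed from any session:

```sql
SHOW block_size;  -- 8192 on a default build; usable tuple space is about 8160 bytes
```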