Re: [GENERAL] import CSV file to a table

2017-03-08 Thread Günce Kaya
Hi, when I opened this thread I didn't know exactly which words to search for. I've read all of your posts and I think CSV parsing is the key point for me. I've created my script and I'm sharing it for the record. #cat cargo2.sh #!/bin/bash while IFS=, read uor_desc crime_type zip_code ucr_hierarchy
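
A minimal sketch of where such a loop typically goes from here (this is not the original cargo2.sh; the table name "cargo", database "mydb", and the loop body are assumptions, and the comma split is only safe if the file has no quoted commas):

#!/bin/bash
# Read a simple comma-only CSV (no quoted commas or embedded newlines)
# and insert two of its columns per row.  Table/database names are placeholders.
tail -n +2 cargo.csv |                 # skip the header row
while IFS=, read -r uor_desc crime_type zip_code ucr_hierarchy; do
    psql -d mydb -c \
      "INSERT INTO cargo (crime_type, zip_code) VALUES ('$crime_type', '$zip_code');"
done

One psql call per row is slow and fragile (a stray apostrophe in a value breaks the INSERT), which is why most of the replies below steer toward a single \copy instead.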

Re: [GENERAL] import CSV file to a table

2017-03-08 Thread Karl Czajkowski
On Mar 08, Rob Sargent modulated: > Yes Karl, I agree. I admitted as much. But if it's clean, as in > free of quoted commas, life is much simpler. I've lost sight of > whether or not the OP knows his situation w.r.t. this. The awk > line will tell him, and for a one-off load this can

Re: [GENERAL] import CSV file to a table

2017-03-08 Thread Rob Sargent
On 03/08/2017 09:52 AM, Karl Czajkowski wrote: On Mar 08, Rob Sargent modulated: Yes Karl, I agree. I admitted as much. But if it's clean, as in free of quoted commas, life is much simpler. I've lost sight of whether or not the OP knows his situation w.r.t. this. The awk line will

Re: [GENERAL] import CSV file to a table

2017-03-08 Thread Bret Stern
I'll throw in: if tab-delimited is available, perhaps that option will work better... or use Access to find the violations of the quote/comma-delimited assumptions, then export from Access and import. Bret On Wed, 2017-03-08 at 08:36 -0800, Karl Czajkowski wrote: > I believe that in its full
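
For the tab-delimited route, psql's \copy accepts an explicit delimiter, so no extra tooling is needed once the export contains just the wanted columns. A hedged sketch (table, column and file names are placeholders):

# Load a tab-separated export that already holds only the two wanted columns.
psql -d mydb -c "\copy main_table (crime_type, zip_code) FROM 'cargo.tsv' WITH (FORMAT csv, DELIMITER E'\t', HEADER true)"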

Re: [GENERAL] import CSV file to a table

2017-03-08 Thread Rob Sargent
On 03/08/2017 09:36 AM, Karl Czajkowski wrote: I believe that in its full glory, you cannot reliably locate CSV record boundaries except by parsing each field in order, including quote processing. Individual records may have arbitrary numbers of field and record separator characters within

Re: [GENERAL] import CSV file to a table

2017-03-08 Thread Karl Czajkowski
I believe that in its full glory, you cannot reliably locate CSV record boundaries except by parsing each field in order, including quote processing. Individual records may have arbitrary numbers of field and record separator characters within the values. Karl On Mar 08, Rob Sargent modulated:
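
A small illustration of that failure mode (the data is made up for the example): one logical CSV record whose quoted field contains both a comma and a newline, so naive line/comma splitting miscounts records and fields.

cat > demo.csv <<'EOF'
id,description,zip
1,"stolen cargo, recovered
at warehouse",32801
EOF

# awk treats the embedded comma and newline as separators:
awk -F, '{print NF" fields on this line"}' demo.csv
# -> reports 3, 3 and 2 "fields" across three physical lines, even though
#    the file holds a header plus ONE logical record.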

Re: [GENERAL] import CSV file to a table

2017-03-08 Thread Rob Sargent
Since bash has been bandied about in this thread, I presume awk is available. Here's how I would check just how 'csv'ish the incoming file is: awk -F"," '{a[NF]++}END{for(i in a){printf "%d lines have %d fields(columns)\n", a[i], i}}' csvfilename If this doesn't produce one line you
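
For reference, a run of that check against a well-behaved file would look something like this (the file name and counts are hypothetical):

awk -F"," '{a[NF]++} END {for (i in a) printf "%d lines have %d fields(columns)\n", a[i], i}' cargo.csv
# 1000 lines have 12 fields(columns)    <- a single output line: safe to split on plain commas
# More than one output line means some rows contain quoted commas or are ragged,
# and a real CSV parser (or COPY ... FORMAT csv) is the safer route.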

Re: [GENERAL] import CSV file to a table

2017-03-08 Thread David G. Johnston
On Wed, Mar 8, 2017 at 9:13 AM, Karl Czajkowski wrote: > > With the temporary table, you can use SQL for most validation or data > interrogation, but you need to know at least enough schema information > in advance to form the COPY statement. Parsing the CSV header row to > plan
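
When the staging-table route is acceptable, the whole thing fits in one psql session. A sketch under assumed names (mydb, staging, main_table; the column list would need extending to the real 12-column layout):

psql -d mydb <<'SQL'
-- one column per CSV field; extend to match the actual file
CREATE TEMP TABLE staging (uor_desc text, crime_type text, zip_code text, ucr_hierarchy text);

\copy staging FROM 'cargo.csv' WITH (FORMAT csv, HEADER true)

-- validate or inspect in SQL here, then keep only the wanted columns
INSERT INTO main_table (crime_type, zip_code)
SELECT crime_type, zip_code FROM staging;
SQL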

Re: [GENERAL] import CSV file to a table

2017-03-08 Thread Karl Czajkowski
On Mar 08, John McKown modulated: ... > I agree. I went with a "pure BASH" approach because it is what the > user asked for & I wasn't sure what language she might be comfortable > with. I use PERL a lot. Or maybe I should say that I abuse PERL a lot. > Such as a PERL script which writes out

Re: [GENERAL] import CSV file to a table

2017-03-08 Thread John McKown
On Wed, Mar 8, 2017 at 8:45 AM, vinny wrote: > On 2017-03-08 10:13, Günce Kaya wrote: > >> Hi all, >> >> I want to import the content of a CSV file into a table via a bash script >> without creating a temporary table, and I also want to skip some columns >> in the CSV file (for instance, the CSV

Re: [GENERAL] import CSV file to a table

2017-03-08 Thread vinny
On 2017-03-08 10:13, Günce Kaya wrote: Hi all, I want to import the content of a CSV file into a table via a bash script without creating a temporary table, and I also want to skip some columns in the CSV file (for instance, the CSV file has 12 columns and the main table has only 2 columns; if possible I would use only 2

Re: [GENERAL] import CSV file to a table

2017-03-08 Thread Martijn Tonies (Upscene Productions)
, MS SQL Server, PostgreSQL, SQL Anywhere, MySQL, InterBase, NexusDB and Firebird. From: Günce Kaya Sent: Wednesday, March 08, 2017 10:13 AM To: pgsql-general@postgresql.org Subject: [GENERAL] import CSV file to a table Hi all, I want to import the content of a CSV file into a table via a bash script

Re: [GENERAL] import CSV file to a table

2017-03-08 Thread John McKown
On Wed, Mar 8, 2017 at 3:13 AM, Günce Kaya wrote: > Hi all, > > I want to import the content of a CSV file into a table via a bash script without > creating a temporary table, and I also want to skip some columns in the CSV file > (for instance, the CSV file has 12 columns and the main table has

Re: [GENERAL] import CSV file to a table

2017-03-08 Thread Vick Khera
Since you're using bash, I will assume you are not averse to using a slightly complicated pipeline. First, install this: https://github.com/wireservice/csvkit Then use that to cut out the columns you want, and pipe the result into psql with an appropriate \copy command. On Wed, Mar 8, 2017 at
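
A hedged sketch of that pipeline (table, column and database names are placeholders); csvkit's csvcut understands CSV quoting, so it also copes with commas inside values:

# keep only the two wanted columns, then stream them into the target table
csvcut -c crime_type,zip_code cargo.csv \
  | psql -d mydb -c "\copy main_table (crime_type, zip_code) FROM STDIN WITH (FORMAT csv, HEADER true)"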

[GENERAL] import CSV file to a table

2017-03-08 Thread Günce Kaya
Hi all, I want to import the content of a CSV file into a table via a bash script without creating a temporary table, and I also want to skip some columns in the CSV file (for instance, the CSV file has 12 columns and the main table has only 2 columns; if possible I would use only 2 columns from the CSV file). Is there any way to
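
For the record, a no-temporary-table sketch with plain coreutils (table, column and database names are placeholders, and cut's comma splitting is only safe if the file has no quoted commas; see the discussion above):

# keep CSV columns 2 and 3, drop the header, and load them
tail -n +2 cargo.csv \
  | cut -d, -f2,3 \
  | psql -d mydb -c "\copy main_table (crime_type, zip_code) FROM STDIN WITH (FORMAT csv)"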