Re: [SQL] SQL doubt - Date Add
Gaetano Mendola wrote:
> sreejith s wrote:
>> Hi friends,
>> My problem is that I want to add a specified number of years to a given
>> date, i.e. 12/12/2004 + 5 = 12/12/2009. This has to be done via SQL. Is
>> there a predefined function for this, or do we have to split out the
>> year and then add?
>> Sreejith
>
> # select '2004-12-12'::date + 5 * '1 year'::interval;
>       ?column?
> ---------------------
>  2009-12-12 00:00:00
> (1 row)

But be aware of...

richardh=# SELECT '2004-02-28'::date + '1 year'::interval;
      ?column?
---------------------
 2005-02-28 00:00:00
(1 row)

richardh=# SELECT '2004-02-29'::date + '1 year'::interval;
      ?column?
---------------------
 2005-02-28 00:00:00
(1 row)

richardh=# SELECT '2004-02-29'::date + '4 years'::interval;
      ?column?
---------------------
 2008-02-29 00:00:00
(1 row)

--
  Richard Huxton
  Archonet Ltd

---(end of broadcast)---
TIP 9: the planner will ignore your desire to choose an index scan if your
       joining column's datatypes do not match
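The clamping behaviour Richard demonstrates (Feb 29 + 1 year lands on Feb 28 in a non-leap year) can be mimicked outside the database. A minimal Python sketch of the same rule, purely for illustration; `add_years` is a hypothetical helper, not a PostgreSQL function:

```python
import calendar
from datetime import date

def add_years(d: date, years: int) -> date:
    """Add whole years to a date, clamping the day to the last valid
    day of the target month (e.g. Feb 29 -> Feb 28 in non-leap years),
    mirroring PostgreSQL's date + interval arithmetic."""
    year = d.year + years
    # monthrange()[1] is the number of days in that month/year.
    day = min(d.day, calendar.monthrange(year, d.month)[1])
    return date(year, d.month, day)

print(add_years(date(2004, 12, 12), 5))  # 2009-12-12
print(add_years(date(2004, 2, 29), 1))   # 2005-02-28
print(add_years(date(2004, 2, 29), 4))   # 2008-02-29
```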
Re: [SQL] Export tab delimited from mysql to postgres.
In article <[EMAIL PROTECTED]>, Theo Galanakis <[EMAIL PROTECTED]> writes:

> Has anyone been able to export and import a tab delimited file from mysql
> to postgres successfully?
>
> This is my dilemma, which I have posted on mysql sites with no response;
> however, postgres users may have come across the same issue:
>
> Trying to export a tab delimited file with the following command; however,
> it does not appear to strip out the CR/LF within a varchar column.
>
>   select * into outfile '/tmp/users.txt' FIELDS ESCAPED BY '\\'
>   LINES TERMINATED BY '\r\n' from users;

It's a MySQL bug/misfeature that it doesn't translate CRs into '\r'.
Nevertheless, you can import a MySQL tab delimited file into PostgreSQL -
just pipe it through a small sed script.
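A minimal sketch of the sed step Harald alludes to, assuming GNU sed; the `/tmp/users.txt` path is from the post, while the demo input line, cleaned filename, and table name are made up for illustration:

```shell
# Demo input standing in for the mysql dump: a field containing a
# literal carriage-return byte (hypothetical data, not from the thread).
printf '1\tWhat a\rday!\n' > /tmp/users.txt

# Rewrite literal CR bytes as the two-character escape \r that
# PostgreSQL's COPY understands (GNU sed assumed).
sed 's/\r/\\r/g' /tmp/users.txt > /tmp/users_clean.txt
cat /tmp/users_clean.txt

# Then, inside psql:  \copy users FROM '/tmp/users_clean.txt'
```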
[SQL] table update using result from join
Hi folks.

I'm annoyed that I can't suss out something this simple. The join below
gives the output I want, but I can't work out how to convert it into an
update on the requests table. I want to append the customer name to the
fuel (delivery details) field and then update the customer field to be
'retail', but I can't work out the description-update bit.

goole=# select r.r_fuel||' - Customer: '||c.c_des from requests r, customers c
goole-# where r.r_c_id = c.c_id and c.c_id in (
goole(# 9, 10, 13, 27, 35, 36, 39, 42, 44, 51, 53, 54, 55, 56, 57, 58, 60, 65, 67, 69,
goole(# 74, 75, 77, 81, 82, 84, 88, 89, 90, 91, 92, 96, 98, 99, 100, 101, 102, 103,
goole(# 105, 108, 113, 114, 117, 118, 124, 125, 126, 131, 132, 135, 136, 137, 144,
goole(# 145, 148, 149, 150, 151, 154, 11, 37, 40, 41, 43, 48, 52, 59, 62,
goole(# 63, 68, 70, 71, 83, 86, 93, 104, 119, 120, 121, 123, 128, 129, 130,
goole(# 134, 138, 142, 146, 147, 152, 153, 19, 38, 85, 87, 94, 106, 112, 116,
goole(# 141, 143, 18, 110, 111, 115, 140, 24, 50, 133, 47, 64, 76, 95, 107,
goole(# 109, 127, 33, 46, 97);
                            ?column?
-----------------------------------------------------------------
 RINGWAYS - Customer: SUBSCAN
 RINGWAYS - Customer: MARTIN
 ? - Customer: N G BAILEY
 ? - Customer: CHECRON SITE SERVICES LTD
 10 litres - Customer: CONSULTANT SERVICES
 ? - Customer: ALD AUTOMOTIVE
 eskrigg yo19 6ez 9am - Customer: TRANSPORT MANAGEMENT
 TRAILER / PE18 9UH / HALF ON DELIVERY - Customer: ALD AUTOMOTIVE
 DRIVEN / WF16 0NF - Customer: CONSULTANT SERVICES
 TRAILER / HALF TANK ON DELIVERY - Customer: LEX / SWINTON
 DN14 0HR - Customer: CHECRON SITE SERVICES LTD
 . - Customer: other fleet
[snip]
(256 rows)

goole=#

What I want is something like

  update requests set r_fuel=, r_c_id = 7 where r_c_id in (...
--
Gary Stainburn

This email does not contain private or confidential material as it may be
snooped on by interested government parties for unknown and undisclosed
purposes - Regulation of Investigatory Powers Act, 2000
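For the archives: the pattern being asked for is PostgreSQL's UPDATE ... FROM. A minimal sketch; the SET expression is inferred from the SELECT in the post, and the abbreviated IN list is illustrative:

```sql
UPDATE requests
   SET r_fuel = r_fuel || ' - Customer: ' || c.c_des,
       r_c_id = 7
  FROM customers c
 WHERE requests.r_c_id = c.c_id
   AND c.c_id IN (9, 10, 13);  -- abbreviated; substitute the full id list
```

Note that `||` yields NULL if `r_fuel` is NULL; wrap it in `COALESCE(r_fuel, '')` if any rows might have a NULL fuel field.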
Re: [SQL] How do FKs work?
On Sunday, 10 October 2004 15:01, Marc G. Fournier wrote:
> On Sun, 10 Oct 2004, Janning Vygen wrote:
>> On Sunday, 10 October 2004 02:43, Marc G. Fournier wrote:
>>> On Sat, 9 Oct 2004, Tom Lane wrote:
>>>> "Marc G. Fournier" <[EMAIL PROTECTED]> writes:
>>>>> Have a table with two FKs on it ... 2 different fields in the table
>>>>> point to the same field in another table ...
>>>>>
>>>>> When I do an 'EXPLAIN ANALYZE DELETE FROM table WHERE field = #;', it
>>>>> never comes back ... or, at least, takes a *very* long time ...
>>>>
>>>> Do you have indexes on the referencing columns? Are they exactly the
>>>> same datatype as the referenced column? You can get really awful
>>>> plans for the FK-checking queries if not.
>>>
>>> Yup, that was my first thought ... running SELECTs joining the two
>>> tables on the FK fields shows indices being used, and fast times ...

Could you please show me your schema design for those two tables? I had
this problem too, and it turned out to be simply a missing index on the
foreign key column.

janning
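For the archives, the usual fix Janning is describing, sketched with an invented schema (the real one was never posted): PostgreSQL indexes the referenced primary key automatically, but not the referencing columns, so each FK column needs its own index for the DELETE-time FK checks to be fast.

```sql
-- Hypothetical schema: two FKs in "jobs" pointing at the same column.
CREATE TABLE users (id integer PRIMARY KEY);
CREATE TABLE jobs (
    owner_id    integer REFERENCES users(id),
    reviewer_id integer REFERENCES users(id)
);

-- Without these, DELETE FROM users forces sequential scans of jobs.
CREATE INDEX jobs_owner_id_idx    ON jobs (owner_id);
CREATE INDEX jobs_reviewer_id_idx ON jobs (reviewer_id);
```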
Re: [SQL] Export tab delimited from mysql to postgres.
Could you provide an example of how to do this?

I actually ended up exporting the data as Insert statements, which strips
out the CR/LF within varchars. However, it takes an eternity to import
200,000 records... 24 hours in fact. Is this normal?

Theo

-----Original Message-----
From: Harald Fuchs [mailto:[EMAIL PROTECTED]]
Sent: Monday, 11 October 2004 7:34 PM
To: [EMAIL PROTECTED]
Subject: Re: [SQL] Export tab delimited from mysql to postgres.

In article <[EMAIL PROTECTED]>, Theo Galanakis <[EMAIL PROTECTED]> writes:

> Has anyone been able to export and import a tab delimited file from
> mysql to postgres successfully? This is my dilemma, which I have posted
> on mysql sites with no response; however, postgres users may have come
> across the same issue:
>
> Trying to export a tab delimited file with the following command;
> however, it does not appear to strip out the CR/LF within a varchar
> column.
>
>   select * into outfile '/tmp/users.txt' FIELDS ESCAPED BY '\\'
>   LINES TERMINATED BY '\r\n' from users;

It's a MySQL bug/misfeature that it doesn't translate CRs into '\r'.
Nevertheless, you can import a MySQL tab delimited file into PostgreSQL -
just pipe it through a small sed script.
Re: [SQL] Export tab delimited from mysql to postgres.
On Tue, Oct 12, 2004 at 09:23:41 +1000, Theo Galanakis <[EMAIL PROTECTED]> wrote:
>
> Could you provide an example of how to do this?
>
> I actually ended up exporting the data as Insert statements, which
> strips out the CR/LF within varchars. However it takes an eternity to
> import 200,000 records... 24 hours in fact. Is this normal?

Are you doing the load in one transaction?
Re: [SQL] Export tab delimited from mysql to postgres.
Quoth [EMAIL PROTECTED] (Theo Galanakis):
> Could you provide an example of how to do this?
>
> I actually ended up exporting the data as Insert statements,
> which strips out the CR/LF within varchars. However it takes an eternity
> to import 200,000 records... 24 hours in fact. Is this normal?

I expect that this results from each INSERT being a separate transaction.

If you put a BEGIN at the start and a COMMIT at the end, you'd doubtless
see an ENORMOUS improvement.

That's not even the _big_ improvement, either. The _big_ improvement would
involve reformatting the data so that you could use the COPY statement,
which is _way_ faster than a bunch of INSERTs.

Take a look at the documentation to see the formatting that is needed:
  http://techdocs.postgresql.org/techdocs/usingcopy.php
  http://www.faqs.org/docs/ppbook/x5504.htm
  http://www.postgresql.org/docs/7.4/static/sql-copy.html
--
output = ("cbbrowne" "@" "ntlug.org")
http://www3.sympatico.ca/cbbrowne/lsf.html
Question: How many surrealists does it take to change a light bulb?
Answer: Two, one to hold the giraffe, and the other to fill the bathtub
with brightly colored machine tools.
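The two suggestions above, sketched concretely; the table name `users` and the cleaned-file path are carried over from earlier in the thread, and the column values are placeholders:

```sql
-- 1. One transaction around all the INSERTs instead of 200,000
--    implicit single-statement transactions:
BEGIN;
INSERT INTO users VALUES (1, 'What a day!');
-- ... remaining 199,999 inserts ...
COMMIT;

-- 2. Better still: a single COPY from the cleaned tab-delimited dump,
--    which bypasses per-statement parse/plan overhead entirely:
COPY users FROM '/tmp/users_clean.txt';
```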
[SQL]
unsubscribe pgsql-sql

Message sent using Onda Mail. http://www.onda.com.br
Onda Provedor de Servicos S/A
Re: [SQL] Export tab delimited from mysql to postgres.
Thanks for all your comments. I have been trying the insert within a
transaction block; however, it does not seem to reduce the time it takes
to process each record. Mind you, there are 80 columns, and the insert
statement explicitly defines the columns to insert into.

I need any tip I can get to help me transform the text file into a format
postgres COPY will successfully read. Here is a sample of the current
format of a mysql tab delimited dump:

  columnA columnB
  1       What a day!
  2       What a week it has
  been!
  3       What the!

As you can see, row 2 has a value that holds a CR, which ends up wrapping
around onto the third line. Postgres' COPY command does not like this, and
mysql is unable to replace the value with another type of delimiter, like
a \r. So I gather I have to somehow manually replace the carriage return
with something postgres understands, \r:

  columnA columnB
  1       What a day!
  2       What a week it has \r been!
  3       What the!

How do I do this without getting a text file that looks like this?

  1 What a day! \r\n2 What a week it has \r been!\r\n3 What the!\r\n

Any help would be appreciated.

Theo

-----Original Message-----
From: Christopher Browne [mailto:[EMAIL PROTECTED]]
Sent: Tuesday, 12 October 2004 10:46 AM
To: [EMAIL PROTECTED]
Subject: Re: [SQL] Export tab delimited from mysql to postgres.

Quoth [EMAIL PROTECTED] (Theo Galanakis):
> Could you provide an example of how to do this?
>
> I actually ended up exporting the data as Insert statements,
> which strips out the CR/LF within varchars. However it takes an eternity
> to import 200,000 records... 24 hours in fact. Is this normal?

I expect that this results from each INSERT being a separate transaction.
If you put a BEGIN at the start and a COMMIT at the end, you'd doubtless
see an ENORMOUS improvement. That's not even the _big_ improvement,
either.
The _big_ improvement would involve reformatting the data so that you
could use the COPY statement, which is _way_ faster than a bunch of
INSERTs. Take a look at the documentation to see the formatting that is
needed:
  http://techdocs.postgresql.org/techdocs/usingcopy.php
  http://www.faqs.org/docs/ppbook/x5504.htm
  http://www.postgresql.org/docs/7.4/static/sql-copy.html
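A sketch of the transformation Theo is asking for, assuming GNU sed: strip the CR belonging to each `\r\n` record terminator first, then escape whatever carriage returns remain embedded inside fields. The demo record and filenames are illustrative, not from the actual dump.

```shell
# Demo record standing in for the mysql dump: a CR embedded in a field,
# plus a \r\n record terminator (hypothetical data).
printf '2\tWhat a week it has\rbeen!\r\n' > users.txt

# Pass 1 (s/\r$//) removes the terminator CR at end of line; pass 2
# (s/\r/\\r/g) turns the remaining embedded CRs into the \r escape
# that COPY understands.  Record boundaries stay as plain newlines.
sed -e 's/\r$//' -e 's/\r/\\r/g' users.txt > users_clean.txt
cat users_clean.txt
```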