On Fri, 2020-03-27 at 16:15 +0100, Ekaterina Amez wrote:
> > You should define primary and foreign keys if you can, but I guess
> > I don't have to tell you that.
>
> Excuse me if this is a silly question, but I've read (or understood) that it's
> better to remove constraints to improve delete performance?
>
>
>
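For the constraints question, a common pattern is to drop a foreign key before the bulk delete and recreate it afterwards, so each deleted row doesn't trigger a constraint check. A minimal sketch, with hypothetical table and constraint names (check the real definitions with `\d child_table` before dropping anything):

```sql
BEGIN;

-- Hypothetical names: child_table, fk_child_parent, parent_table.
ALTER TABLE child_table DROP CONSTRAINT fk_child_parent;

DELETE FROM child_table WHERE created_at < DATE '2015-01-01';

-- Recreating the FK revalidates every remaining row in one pass,
-- which is usually cheaper than per-row checks during the delete.
ALTER TABLE child_table
    ADD CONSTRAINT fk_child_parent
    FOREIGN KEY (parent_id) REFERENCES parent_table (id);

COMMIT;
```

Note that dropping the constraint takes a lock on the table, and any bad rows created in the meantime would make the final `ADD CONSTRAINT` fail, so this is only safe when nothing else is writing to the table.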
> As none of the columns of the joined table are used, most probably
> this should be re-written as an EXISTS condition.
> Then neither GROUP BY nor DISTINCT is needed.
>
>
I need the columns from the joined tables. To keep it simple, I didn't include
them in the query. The EXISTS solution won't work.
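For readers following along, the rewrite Laurenz is suggesting looks like this (hypothetical table and column names): when no columns from the joined table appear in the output, the join only acts as a filter, and an EXISTS semi-join avoids the row duplication that forced the DISTINCT/GROUP BY.

```sql
-- Original shape: the join can multiply rows, so DISTINCT is needed.
SELECT DISTINCT t.*
FROM   big_table   t
JOIN   other_table o ON o.big_id = t.id;

-- Equivalent semi-join: each row of big_table is returned at most once,
-- so neither DISTINCT nor GROUP BY is required.
SELECT t.*
FROM   big_table t
WHERE  EXISTS (SELECT 1 FROM other_table o WHERE o.big_id = t.id);
```

As noted in the reply above, this only applies when the query doesn't actually need columns from other_table.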
Hi Laurenz,
On Fri, Mar 27, 2020 at 15:46, Laurenz Albe ()
wrote:
> On Fri, 2020-03-27 at 15:13 +0100, Ekaterina Amez wrote:
> > I'm trying to clean up a database with millions of records of
> > useless-but-don't-remove-just-in-case data. [...]
> >
> > But also I'm cleaning tables with 150 million records [...]
On Fri, Mar 27, 2020 at 08:41:04AM -0600, Michael Lewis wrote:
> 2) If you are deleting/moving most of the table (91 of 150 million),
> consider moving only the records you are keeping to a new table, renaming
> the old table, and renaming the new table back to the original name. Then
> you can do what you want with the old data.
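The copy-and-swap approach above could be sketched roughly like this (hypothetical table name and keep-predicate; the exact WHERE clause depends on which 59 million rows are being kept):

```sql
BEGIN;

-- Copy only the rows to keep; writing ~59M rows is cheaper than
-- deleting ~91M and leaving the table bloated.
CREATE TABLE big_table_new AS
    SELECT * FROM big_table
    WHERE  keep_condition;          -- hypothetical predicate

ALTER TABLE big_table     RENAME TO big_table_old;
ALTER TABLE big_table_new RENAME TO big_table;

COMMIT;
```

Caveat: `CREATE TABLE AS` copies only the data, so indexes, constraints, defaults, triggers, and grants have to be recreated on the new table afterwards, and the swap takes an exclusive lock during the renames.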
On Fri, Mar 27, 2020 at 10:14 AM Ekaterina Amez
wrote:
>
> is there a better way to do this? I'm testing on version 9.2 BUT the
> production server is 8.4 (legacy application, supposed to be on at least
> 9.2, but we recently discovered it was 8.4; planning an upgrade, but not
> now). Config parameters are
Hi Michael,
On Fri, Mar 27, 2020 at 15:41, Michael Lewis ()
wrote:
> If you can afford the time, I am not sure the reason for the question.
> Just run it and be done with it, yes?
>
I've been working with other RDBMSs all my life and I'm quite new to the PG
world, and I'm learning to do th
Sorry, I sent my response only to you, I'm sending it again to the group in
a minute...
On Fri, Mar 27, 2020 at 15:41, Michael Lewis ()
wrote:
> If you can afford the time, I am not sure the reason for the question.
> Just run it and be done with it, yes?
>
> A couple of thoughts-
> 1) That is a big big transaction if you are doing all the cleanup in a
> single function call.
On Fri, 2020-03-27 at 15:13 +0100, Ekaterina Amez wrote:
> I'm trying to clean up a database with millions of records of
> useless-but-don't-remove-just-in-case data. [...]
>
> But also I'm cleaning tables with 150 million records where I'm going to
> remove 60% of existing data and after a few t
If you can afford the time, I am not sure the reason for the question. Just
run it and be done with it, yes?
A couple of thoughts-
1) That is a big big transaction if you are doing all the cleanup in a
single function call. Will this be a production system that is still online
for this archiving?
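If the system does have to stay online, one common alternative to a single huge transaction is deleting in batches, so each transaction stays short and autovacuum can keep up between rounds. A sketch with hypothetical names (runs on 8.4 and 9.2; repeat until it reports 0 rows deleted):

```sql
-- Delete at most 10,000 matching rows per statement.  The subquery
-- picks a batch of ids; the outer DELETE removes them.
DELETE FROM big_table
WHERE  id IN (SELECT id
              FROM   big_table
              WHERE  is_useless          -- hypothetical predicate
              LIMIT  10000);
```

Looping this from a client script (rather than one server-side function call) keeps each transaction small; an index supporting the predicate, and an occasional `VACUUM` between batches, help keep it fast.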
Hello list,
I'm trying to clean up a database with millions of records of
useless-but-don't-remove-just-in-case data. This database has all tables
in public schema so I've created a new schema "old_data" to move there
all this data. I have several tables with 20 million records or so
that I
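As background for the schema approach described above: if a whole table is being archived (rather than a subset of its rows), moving it between schemas is a catalog-only change and doesn't rewrite any data. A sketch with a hypothetical table name:

```sql
CREATE SCHEMA old_data;

-- Catalog-only: no rows are copied, so this is fast even for
-- tables with millions of records.
ALTER TABLE public.some_big_table SET SCHEMA old_data;
```

This only helps when the entire table moves; archiving a subset of rows still needs an INSERT ... SELECT into the old_data table followed by a DELETE from the original.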