Re: Using COPY to import large xml file

2018-06-24 Thread Tim Cross
On Mon, 25 Jun 2018 at 11:38, Anto Aravinth wrote:
> On Mon, Jun 25, 2018 at 3:44 AM, Tim Cross wrote:
>> Anto Aravinth writes:
>>> Thanks for the response. I'm not sure, how long does this tool takes for the 70GB data.
>>>
>>> I used node to stream the xml files into

Re: Using COPY to import large xml file

2018-06-24 Thread Anto Aravinth
On Mon, Jun 25, 2018 at 3:44 AM, Tim Cross wrote:
> Anto Aravinth writes:
>
> > Thanks for the response. I'm not sure, how long does this tool takes for the 70GB data.
> >
> > I used node to stream the xml files into inserts.. which was very slow..
> > Actually the xml contains 40 million

Re: Using COPY to import large xml file

2018-06-24 Thread Tim Cross
Anto Aravinth writes:
> Thanks for the response. I'm not sure, how long does this tool takes for the 70GB data.
>
> I used node to stream the xml files into inserts.. which was very slow..
> Actually the xml contains 40 million records, out of which 10Million took around 2 hrs using

Re: Using COPY to import large xml file

2018-06-24 Thread Christoph Moench-Tegeder
## Adrien Nayrat (adrien.nay...@anayrat.info):
> I used this tool:
> https://github.com/Networks-Learning/stackexchange-dump-to-postgres

That will be awfully slow: this tool commits each INSERT on its own, see the loop in
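[Editor's note: the cost Christoph points out is the one-commit-per-row loop. A minimal sketch of the difference, using Python's bundled sqlite3 as a stand-in so it runs without a server; against PostgreSQL each commit additionally costs a network round trip and an fsync, so the gap is far larger in practice. Table and column names are made up for illustration.]

```python
# Per-row commits vs. one transaction for the whole batch.
import sqlite3
import time

ROWS = [(i, f"post {i}") for i in range(5000)]

def insert_per_commit(conn):
    # Anti-pattern: one transaction per row, like the dump tool's loop.
    for row in ROWS:
        conn.execute("INSERT INTO posts VALUES (?, ?)", row)
        conn.commit()

def insert_one_txn(conn):
    # One transaction for the whole batch.
    conn.executemany("INSERT INTO posts VALUES (?, ?)", ROWS)
    conn.commit()

counts = {}
for load in (insert_per_commit, insert_one_txn):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE posts (id INTEGER, title TEXT)")
    start = time.perf_counter()
    load(conn)
    elapsed = time.perf_counter() - start
    counts[load.__name__] = conn.execute(
        "SELECT count(*) FROM posts").fetchone()[0]
    print(f"{load.__name__}: {counts[load.__name__]} rows, {elapsed:.3f}s")
    conn.close()
```

Both loaders end up with identical data; only the commit granularity differs.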

Re: Using COPY to import large xml file

2018-06-24 Thread Adrian Klaver
On 06/24/2018 08:25 AM, Anto Aravinth wrote:
> Hello Everyone, I have downloaded the Stackoverflow posts xml (contains all SO questions till date).. the file is around 70GB.. I wanna import the data in those xml to my table.. is there a way to do so in postgres?

It is going to require some

Re: Using COPY to import large xml file

2018-06-24 Thread Anto Aravinth
Thanks for the response. I'm not sure how long this tool takes for the 70GB data. I used node to stream the xml files into inserts, which was very slow. Actually the xml contains 40 million records, out of which 10 million took around 2 hrs using nodejs. Hence, I thought I would use COPY
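[Editor's note: a hedged sketch of the COPY approach Anto is moving toward: serialize records into PostgreSQL's COPY text format in memory, then load the whole buffer in one round trip. The table and column names are invented for illustration; the psycopg2 call is shown commented out since it needs a live server.]

```python
# Build a COPY-format buffer: tab-separated columns, one row per line,
# with the characters COPY treats specially escaped.
import io

def to_copy_buffer(records):
    def esc(value):
        return (str(value).replace("\\", "\\\\")
                          .replace("\t", "\\t")
                          .replace("\n", "\\n"))
    buf = io.StringIO()
    for rec in records:
        buf.write("\t".join(esc(v) for v in rec) + "\n")
    buf.seek(0)
    return buf

records = [(1, "First post", "line one\nline two"),
           (2, "Second post", "tab\there")]
buf = to_copy_buffer(records)
print(buf.getvalue(), end="")

# With a live server, the buffer loads in a single statement:
#   import psycopg2
#   conn = psycopg2.connect("dbname=so")
#   with conn, conn.cursor() as cur:
#       cur.copy_from(buf, "posts", columns=("id", "title", "body"))
```

Streaming batches of a few thousand rows through such a buffer avoids both per-row round trips and holding 70GB in memory.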

Re: Using COPY to import large xml file

2018-06-24 Thread Adrien Nayrat
On 06/24/2018 06:07 PM, Anto Aravinth wrote:
> Thanks for the response. I'm not sure, how long does this tool takes for the 70GB data.

As far as I remember, it took several hours. I can't remember whether it was the xml conversion or the inserts that took longer.

> I used node to stream the xml files into

Re: Using COPY to import large xml file

2018-06-24 Thread Adrien Nayrat
On 06/24/2018 05:25 PM, Anto Aravinth wrote:
> Hello Everyone,
>
> I have downloaded the Stackoverflow posts xml (contains all SO questions till date).. the file is around 70GB.. I wanna import the data in those xml to my table.. is there a way to do so in postgres?
>
> Thanks,
> Anto.

Using COPY to import large xml file

2018-06-24 Thread Anto Aravinth
Hello Everyone, I have downloaded the Stackoverflow posts xml (contains all SO questions till date). The file is around 70GB. I want to import the data from those xml files into my table. Is there a way to do so in Postgres? Thanks, Anto.
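[Editor's note: the Stack Overflow data-dump files store each record as a self-closing `<row .../>` element with the data in attributes. A sketch of how a 70GB file like this can be walked row by row with `xml.etree.ElementTree.iterparse`, clearing each element after use so memory stays flat; the inline sample stands in for the real Posts.xml.]

```python
# Stream rows out of a Stack Overflow-style XML dump without loading it.
import io
import xml.etree.ElementTree as ET

SAMPLE = b"""<?xml version="1.0"?>
<posts>
  <row Id="1" PostTypeId="1" Title="How do I COPY?" />
  <row Id="2" PostTypeId="2" ParentId="1" />
</posts>"""

def iter_rows(source):
    # source: a file path or file-like object; yields one dict per <row>.
    for _event, elem in ET.iterparse(source, events=("end",)):
        if elem.tag == "row":
            yield dict(elem.attrib)
            elem.clear()  # release the parsed element; memory stays flat

rows = list(iter_rows(io.BytesIO(SAMPLE)))
print(rows[0]["Id"], rows[0].get("Title"))
```

The dicts this yields can then be batched into multi-row transactions or a COPY stream rather than one INSERT per record.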

Re: Suggestion about logging only every n-th statement

2018-06-24 Thread Adrien Nayrat
On 06/20/2018 03:06 PM, Janning Vygen wrote:
>> FYI I made this patch which seems to do what you want:
>> https://www.postgresql.org/message-id/flat/c30ee535-ee1e-db9f-fa97-146b9f62caed%40anayrat.info#c30ee535-ee1e-db9f-fa97-146b9f62c...@anayrat.info
>>
>> I will add an entry in september's