Re: [GENERAL] Using the GPU

2007-06-16 Thread Tom Allison
On Jun 11, 2007, at 4:31 AM, Alban Hertroys wrote: Alexander Staubo wrote: On 6/8/07, Billings, John [EMAIL PROTECTED] wrote: If so which part of the database, and what kind of parallel algorithms would be used? GPUs are parallel vector processing pipelines, which as far as I can tell ...

Re: [GENERAL] Using the GPU

2007-06-16 Thread Alexander Staubo
On 6/16/07, Tom Allison [EMAIL PROTECTED] wrote: It might make an interesting project, but I would be really depressed if I had to go buy an NVidia card instead of investing in more RAM to optimize my performance! <g> Why does it matter what kind of hardware you can (not have to) buy to give ...

Re: [GENERAL] Using the GPU

2007-06-16 Thread Tom Lane
Alexander Staubo [EMAIL PROTECTED] writes: On 6/16/07, Tom Allison [EMAIL PROTECTED] wrote: It might make an interesting project, but I would be really depressed if I had to go buy an NVidia card instead of investing in more RAM to optimize my performance! <g> Why does it matter what kind of ...

Re: [GENERAL] Using the GPU

2007-06-16 Thread Gregory Stark
Tom Lane [EMAIL PROTECTED] writes: So you can't just claim that using a GPU might be interesting; you have to persuade people that it's more interesting than other places where we could spend our performance-improvement efforts. I have a feeling something as sexy as that could attract new ...

Re: [GENERAL] Using the GPU

2007-06-16 Thread Tom Allison
Tom Lane wrote: Alexander Staubo [EMAIL PROTECTED] writes: On 6/16/07, Tom Allison [EMAIL PROTECTED] wrote: It might make an interesting project, but I would be really depressed if I had to go buy an NVidia card instead of investing in more RAM to optimize my performance! <g> Why does it ...

Re: [GENERAL] Using the GPU

2007-06-16 Thread Alexander Staubo
On 6/16/07, Tom Lane [EMAIL PROTECTED] wrote: Alexander Staubo [EMAIL PROTECTED] writes: On 6/16/07, Tom Allison [EMAIL PROTECTED] wrote: It might make an interesting project, but I would be really depressed if I had to go buy an NVidia card instead of investing in more RAM to optimize my ...

Re: [GENERAL] Using the GPU

2007-06-13 Thread Alejandro Torras
Billings, John wrote: Does anyone think that PostgreSQL could benefit from using the video card as a parallel computing device? I'm working on a project using Nvidia's CUDA with an 8800 series video card to handle non-graphical algorithms. I'm curious if anyone thinks that this technology ...

Re: [GENERAL] Using the GPU

2007-06-13 Thread Alejandro Torras
Alejandro Torras wrote: Billings, John wrote: Does anyone think that PostgreSQL could benefit from using the video card as a parallel computing device? I'm working on a project using Nvidia's CUDA with an 8800 series video card to handle non-graphical algorithms. I'm curious if anyone ...

Re: [GENERAL] Using the GPU

2007-06-11 Thread Alban Hertroys
Alexander Staubo wrote: On 6/8/07, Billings, John [EMAIL PROTECTED] wrote: If so which part of the database, and what kind of parallel algorithms would be used? GPUs are parallel vector processing pipelines, which as far as I can tell do not lend themselves right away to the data ...
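
For illustration of the "parallel vector processing pipeline" point above, here is a minimal CUDA sketch. It is not from the thread; the kernel, names, and sizes are all invented. It shows the shape of work a GPU pipeline handles well: every thread applies the same arithmetic to its own array element, with none of the per-row branching or pointer chasing a row-oriented executor tends to do.

    // Minimal sketch (hypothetical): uniform element-wise work, the kind of
    // computation a GPU's vector pipeline is built for. Compile with nvcc.
    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    __global__ void scale(const float *in, float *out, float factor, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            out[i] = in[i] * factor;   /* identical work per element */
    }

    int main()
    {
        const int n = 1 << 20;          /* invented array size */
        size_t bytes = n * sizeof(float);

        float *h_in = (float *) malloc(bytes);
        float *h_out = (float *) malloc(bytes);
        for (int i = 0; i < n; i++)
            h_in[i] = (float) i;

        float *d_in, *d_out;
        cudaMalloc((void **) &d_in, bytes);
        cudaMalloc((void **) &d_out, bytes);
        cudaMemcpy(d_in, h_in, bytes, cudaMemcpyHostToDevice);

        /* one thread per element, 256 threads per block */
        scale<<<(n + 255) / 256, 256>>>(d_in, d_out, 2.0f, n);
        cudaMemcpy(h_out, d_out, bytes, cudaMemcpyDeviceToHost);

        printf("out[10] = %f\n", h_out[10]);

        cudaFree(d_in); cudaFree(d_out);
        free(h_in); free(h_out);
        return 0;
    }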

[GENERAL] Using the GPU

2007-06-08 Thread Billings, John
Does anyone think that PostgreSQL could benefit from using the video card as a parallel computing device? I'm working on a project using Nvidia's CUDA with an 8800 series video card to handle non-graphical algorithms. I'm curious if anyone thinks that this technology could be used to speed up a ...
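
As a purely hypothetical illustration of the non-graphical, database-flavored work being proposed here (again, not from the thread; the column data, threshold, and sizes are invented), the CUDA sketch below evaluates a simple WHERE-style predicate over a column of integers, one row per thread, and copies the match flags back to the host. The host/device copies over the bus are part of the trade-off the rest of the thread weighs against simply buying more RAM.

    // Hypothetical sketch: GPU evaluation of a WHERE-style predicate
    // (col > threshold) over a column of integers, one row per thread.
    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    __global__ void eval_predicate(const int *col, unsigned char *match,
                                   int threshold, int nrows)
    {
        int row = blockIdx.x * blockDim.x + threadIdx.x;
        if (row < nrows)
            match[row] = (col[row] > threshold) ? 1 : 0;
    }

    int main()
    {
        const int nrows = 1 << 20;      /* invented row count */

        int *h_col = (int *) malloc(nrows * sizeof(int));
        unsigned char *h_match = (unsigned char *) malloc(nrows);
        for (int i = 0; i < nrows; i++)
            h_col[i] = i % 1000;        /* dummy column data */

        int *d_col;
        unsigned char *d_match;
        cudaMalloc((void **) &d_col, nrows * sizeof(int));
        cudaMalloc((void **) &d_match, nrows);
        cudaMemcpy(d_col, h_col, nrows * sizeof(int), cudaMemcpyHostToDevice);

        /* one thread per row, 256 threads per block */
        eval_predicate<<<(nrows + 255) / 256, 256>>>(d_col, d_match, 500, nrows);
        cudaMemcpy(h_match, d_match, nrows, cudaMemcpyDeviceToHost);

        long hits = 0;
        for (int i = 0; i < nrows; i++)
            hits += h_match[i];
        printf("%ld of %d rows match\n", hits, nrows);

        cudaFree(d_col); cudaFree(d_match);
        free(h_col); free(h_match);
        return 0;
    }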

Re: [GENERAL] Using the GPU

2007-06-08 Thread Alexander Staubo
On 6/8/07, Billings, John [EMAIL PROTECTED] wrote: Does anyone think that PostgreSQL could benefit from using the video card as a parallel computing device? I'm working on a project using Nvidia's CUDA with an 8800 series video card to handle non-graphical algorithms. I'm curious if anyone ...

Re: [GENERAL] Using the GPU

2007-06-08 Thread Dawid Kuroczko
On 6/8/07, Billings, John [EMAIL PROTECTED] wrote: Does anyone think that PostgreSQL could benefit from using the video card as a parallel computing device? I'm working on a project using Nvidia's CUDA with an 8800 series video card to handle non-graphical algorithms. I'm curious if ...