Hi Victor,

Sorry for the delay in responding. As I mentioned the other night on
gtalk, I am unable to get a log on Windows even using g_print, as you
suggested, instead of g_message. On the Mac, I simply get a message that
OpenCL is not being used, since there is no shared object to load.
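For completeness, this is roughly how I've been trying to capture a log:
just redirecting both streams to a file when launching from a console (the
executable name here is from my build and may differ on your setup):

```shell
# Send stdout and stderr to a log file; the same "> file 2>&1" redirection
# syntax also works in cmd.exe on Windows. "gimp" is my launcher's name.
gimp --verbose > gimp.log 2>&1
```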

I think a lot of users will be running a single display (as I do on my
laptop), so there would have to be some viable solution to the problem of
the nvidia driver killing the GPU job, don't you think?
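From what I've read, on Windows that timeout is the TDR (Timeout Detection
and Recovery) mechanism, which can be adjusted through the registry. A sketch
of a possible workaround, based on Microsoft's documented TdrDelay value
(raising the timeout from the default 2 seconds to 10), at the user's own
risk, would be:

```
Windows Registry Editor Version 5.00

; Raise the GPU timeout-detection delay from the default 2 s to 10 s (0x0a)
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\GraphicsDrivers]
"TdrDelay"=dword:0000000a
```

Hardly something we can expect ordinary users to do, though.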

I didn't know about the GEGL_SWAP=RAM option and will add that to the
environment while running GIMP.
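If I understand correctly, that's just a matter of prefixing the variable on
launch, e.g. on Linux (on Windows I'd run `set GEGL_SWAP=RAM` in the console
first; the executable name depends on the build):

```shell
# Keep GEGL's tile swap in RAM rather than in on-disk swap files
GEGL_SWAP=RAM gimp
```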

Thanks!
Partha



On Tue, Feb 19, 2013 at 1:53 PM, Victor Oliveira <victormath...@gmail.com> wrote:

> Hi Partha,
>
> I couldn't really reproduce your error, so I assume it is because you're
> trying to use the display GPU for processing, and to avoid blocking the
> screen the nvidia driver just kills whatever is running on the GPU.
> Also, I'm not sure if you're already using the GEGL_SWAP=RAM option to
> run GIMP; it should improve performance, and you might not get this error.
>
>
> On Sat, Feb 16, 2013 at 12:29 AM, Partha Bagchi <parth...@gmail.com> wrote:
>
>> Hi Victor,
>>
>> I have an NVIDIA GPU with 1 GB of dedicated VRAM, an i7 Core CPU, and 16 GB
>> of RAM. In any case, a 150x150 radius should be doable, and I am sure that
>> when GIMP goes to production, this issue is bound to come up.
>>
>> Any further report on the Mac? While I am finding that it's quite fast on a
>> Mac with 8 GB of RAM, I am not sure whether the GPU is being used.
>>
>> Note that everything I have seen/read about OpenCL says that you don't
>> need to load the library, at least on a Mac.
>>
>> Thanks,
>> Partha
>>
>>
>>
>>
>>
>> On Fri, Feb 15, 2013 at 7:22 PM, Victor Oliveira <victormath...@gmail.com>
>> wrote:
>>
>>> Hi Partha,
>>>
>>> Thanks for the bug report; I'll take a look when I can.
>>> Note that a 150x150 radius is very big, and maybe your GPU doesn't have
>>> enough memory for that, but it was supposed to fall back to the CPU in
>>> that case.
>>>
>>> Thanks
>>> Victor
>>>
>>>
>>> On Fri, Feb 15, 2013 at 8:47 AM, Tobias Ellinghaus <h...@gmx.de> wrote:
>>>
>>>> On Friday, February 15, 2013, at 04:52:49, Partha Bagchi wrote:
>>>> > Hi Victor,
>>>> >
>>>> > Latest git. I have an NVIDIA GeForce GT 230M card with 1 GB VRAM. The
>>>> > OpenCL version is 1.1, CUDA 4.2.1, etc. Windows 7, 64-bit.
>>>> >
>>>> > I think OpenCL is taking down my video every time. Here is a simple
>>>> > repeatable test for me.
>>>> >
>>>> > 1. Open 16 bit tiff.
>>>> > 2. Duplicate layer.
>>>> > 3. Layer - desaturate -> invert
>>>> > 4. GEGL gaussian blur x = y = 150.
>>>> >
>>>> > Takes the screen down (it goes black and recovers) and leaves a dark
>>>> > tile on the layer.
>>>> >
>>>> > Any ideas?
>>>>
>>>> That is normal when running OpenCL (or CUDA) kernels on a GPU that has a
>>>> monitor connected. On Windows it will time out quite quickly; on Linux,
>>>> AFAIK, it does not. However, that can be configured. Google is your
>>>> friend; just search for "nvidia watchdog".
>>>>
>>>> [...]
>>>>
>>>> > Thanks,
>>>> > Partha
>>>>
>>>> Tobias
>>>>
>>>> _______________________________________________
>>>> gimp-developer-list mailing list
>>>> gimp-developer-list@gnome.org
>>>> https://mail.gnome.org/mailman/listinfo/gimp-developer-list
>>>>
>>>>
>>>
>>>
>>>
>>
>
