> On 17 Mar 2021, at 22:02, Richard Hainsworth <rnhainswo...@gmail.com> wrote:
> 
> After working at this, I finally found where it was happening, and a 
> work-around.
> 
> I was checking all 517 http/s links in the documentation to see whether they 
> are all live (no 404 response, and the host is found). For this I was using 
> LibCurl::Easy.
> 
> The relevant bit of code was something like
> 
> for @links -> $link {                    # @links.elems == 517
>     my $http = LibCurl::HTTP.new;
>     my $rv;
>     try { $rv = $http.HEAD($link).perform.response-code }
>     if $! { $rv = $http.error }
> }
> 
> I had assumed that as soon as the $http went out of scope, it would be 
> collected.
> 
> When I rewrote it as
> 
> my $http = LibCurl::HTTP.new;
> 
> for @links -> $link { ... }
> 
> then the program ran to completion without failing with 'too many file 
> handles'.

Well, that appears to be a good thing to do anyway: not redoing things that 
don't need redoing.
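
Something like this minimal sketch, assuming LibCurl::HTTP happily reuses one 
handle for successive requests (the say line is just illustrative, not from 
your original code):

    my $http = LibCurl::HTTP.new;        # one handle, reused for every request

    for @links -> $link {
        my $rv = (try $http.HEAD($link).perform.response-code)
                 // $http.error;         # fall back to the error text on failure
        say "$link: $rv";
    }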


> Now my question is this: Is this a problem with the LibCurl::Easy module, 
> where I can raise an issue?
> 
> Or is this a more general Raku problem with objects not being garbage 
> collected?

There is no reference counting in Raku.  Objects get garbage collected when 
they are garbage collected.  When exactly that happens is indeterminate, so 
you cannot rely on an object going out of scope to promptly release scarce 
resources such as file handles.
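
If you do want a fresh handle per iteration, the robust approach is to release 
the resource deterministically instead of waiting for the collector, for 
instance with a LEAVE phaser.  This sketch assumes LibCurl::Easy's cleanup 
method frees the underlying curl handle:

    for @links -> $link {
        my $http = LibCurl::HTTP.new;
        LEAVE $http.cleanup;             # assumed API: frees the handle when the block is left
        my $rv = (try $http.HEAD($link).perform.response-code)
                 // $http.error;
        say "$link: $rv";
    }

The LEAVE phaser runs whenever the block is left, including on exceptions, so 
the file handle never has to wait for garbage collection.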

I wrote a blog post about that a few years ago; it is now in the CCR repo: 
https://github.com/Raku/CCR/blob/main/Remaster/Elizabeth%20Mattijsen/Garbage-Collection-in-Raku.md


Liz
