Ok, did a little more checking and found the exact cause: it seems a null
(empty) User-agent field in a robots.txt causes the segfault:
New server: <a server with a webmaster who doesn't know how to write a robots.txt>, 80
- Persistent connections: enabled
- HEAD before GET: disabled
- Timeout: 30
- Connection space: 0
- Max Documents: -1
- TCP retries: 1
- TCP wait time: 5
Trying to retrieve robots.txt file
Parsing robots.txt file using myname = htdig
Found 'user-agent' line:
Pattern:
Program received signal SIGSEGV, Segmentation fault.
0x0 in ?? ()
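
In case anyone wants to poke at this in isolation, here's a rough standalone
sketch (not the actual htdig parser code, just an illustration of the failure
mode, with a hypothetical parse_user_agent helper) of how an empty User-agent
value can leave you holding a NULL pattern pointer that then has to be checked
before use:

/* Hypothetical illustration only -- not htdig source.  An empty
 * "User-agent:" field gives the tokenizer nothing to return, so any
 * later use of the pattern must guard against NULL. */
#include <cstdio>
#include <cstring>

static void parse_user_agent(char *line)        /* hypothetical helper */
{
    /* skip past the "user-agent:" keyword, then pull out the pattern */
    char *pattern = strtok(line + strlen("user-agent:"), " \t\r\n");

    /* an empty field leaves pattern == NULL; print and compare safely */
    printf("Pattern: %s\n", pattern ? pattern : "(empty)");
    if (pattern && strcmp(pattern, "htdig") == 0)
        printf("rules apply to us\n");
}

int main()
{
    char good[] = "user-agent: htdig";
    char bad[]  = "user-agent: ";   /* the empty field from the trace above */

    parse_user_agent(good);
    parse_user_agent(bad);          /* must not dereference NULL here */
    return 0;
}

Handing that NULL straight to a string compare or copy is exactly the kind of
thing that lands you at address 0x0 like the backtrace above shows.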
--- Alex
Alexander Cohen wrote:
> Hi, has anyone managed to successfully compile any of the htdig-3.2.0b3 betas under
> OSF1 v4.0
> and have it run ok without invariably doing this in htdig:
>
> bash# ./rundig
> ./rundig: 8745 Memory fault - core dumped
>
> ---- Alex
>
--
-----------------------------------
Alexander Cohen
La Trobe University - ITS
[EMAIL PROTECTED]
(03) 9479-5580
-----------------------------------
_______________________________________________
htdig-general mailing list
[EMAIL PROTECTED]
http://lists.sourceforge.net/lists/listinfo/htdig-general