On 2 Feb 2005 at 16:04, Marcelo Via Giglio wrote:

> I am new to ht://Dig. When I try to index a URL containing PHP files,
> such as a calendar, htdig goes into an infinite loop, calling file.php
> with different parameters. How can I solve this? Because there are
> many PHP files in the URLs of the domain, I can't restrict individual
> files. Is there a general solution to avoid the recursion?

You can restrict access with robots.txt (a file in the top-level 
directory of your website) or with various ht://Dig configuration attributes:

  exclude_urls, limit_urls_to

As of 3.2.0 beta, you can use regular expressions in these attributes.
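
As a rough sketch, assuming the looping script lives at /calendar/file.php 
(a made-up path; substitute your own), a robots.txt in your document root 
could block that area for all crawlers:

  User-agent: *
  Disallow: /calendar/

Or, in your htdig.conf, you could extend exclude_urls so that any URL 
containing one of these substrings is skipped (the "?" entry would skip 
every URL with a query string, which is a blunt but effective way to stop 
the calendar recursion):

  exclude_urls:  /cgi-bin/ .cgi /calendar/file.php ?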

-- 
Dan Langille : http://www.langille.org/
BSDCan - The Technical BSD Conference - http://www.bsdcan.org/



