Does crawl-urlfilter.txt accept only single characters rather than strings?
If strings are accepted, how should I write the rule?
# Skip URLs containing certain characters as probable queries, etc..
-[...@=]
Could it be something like this?
# Skip URLs containing certain characters as probable queries, etc..
- [ "menu"]
Thanks
Que
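As far as I understand the standard Nutch url-filter syntax, each non-comment line in crawl-urlfilter.txt is a `+` or `-` sign followed by a full regular expression, not just a character class, so a plain string works as a pattern. A hedged sketch (the `menu` string is taken from the question; the exact rule form is an assumption about the regex url-filter plugin):

```
# skip URLs containing certain characters as probable queries, etc.
-[?*!@=]

# skip any URL containing the string "menu"
# (a plain regex matched against the URL, not a character class)
-menu
```

Note the `- [ "menu"]` form from the question would be read as a character class matching the individual characters inside the brackets, which is not what is intended.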
Hello, I have a problem.
Can you configure how Nutch (crawl or search) creates the summaries?
--
View this message in context:
http://old.nabble.com/Summary-tp27731301p27731301.html
Sent from the Nutch - User mailing list archive at Nabble.com.
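If I remember the old Nutch configuration correctly, summary generation is tuned through properties in conf/nutch-site.xml (overriding nutch-default.xml); the property names below are from memory and should be checked against your version's nutch-default.xml:

```xml
<!-- Hedged sketch: summary-related properties believed to exist in
     older Nutch releases; verify names/defaults in nutch-default.xml. -->
<property>
  <name>searcher.summary.context</name>
  <!-- number of context terms shown around each query-term hit -->
  <value>5</value>
</property>
<property>
  <name>searcher.summary.length</name>
  <!-- maximum number of terms in the generated summary -->
  <value>20</value>
</property>
```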
Could someone please tell me how to keep the crawl from fetching URLs that
contain the word "menu"?
Thanks
--
View this message in context:
http://old.nabble.com/String-%22menu%22-tp27693743p27693743.html
Sent from the Nutch - User mailing list archive at Nabble.com.
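Assuming the regex url-filter applies each pattern as a substring search against the whole URL (my understanding of how the filter plugin behaves), a rule such as `-menu` in crawl-urlfilter.txt would drop any URL containing that string. The effect can be illustrated in Python (the URLs and helper function here are hypothetical, just to show the matching behavior):

```python
import re

# Illustration only: mimics how a "-menu" rule in crawl-urlfilter.txt
# would reject URLs, assuming the pattern is applied as a substring search.
def rejected(url, pattern=r"menu"):
    """Return True if the URL would be dropped by a '-menu' rule."""
    return re.search(pattern, url) is not None

print(rejected("http://example.com/menu/left.html"))  # True: contains "menu"
print(rejected("http://example.com/page.html"))       # False: URL is kept
```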
Looking for a solution to the following problem:
My search will be available to both internal and external audiences, but the
external public must not be able to see the contents of a particular
directory. Should I use two indices? Otherwise (using the same index), how do
I prevent the public from seeing or ac
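One common approach, sketched here as an assumption rather than a confirmed recipe, is to run two separate crawls with different url-filter files and serve each audience its own index. The public crawl's filter would simply exclude the restricted directory (the host and directory names below are hypothetical placeholders):

```
# regex url-filter for the PUBLIC (external) crawl only.
# Exclude the restricted directory (hypothetical path):
-^http://www\.example\.com/private/

# Accept everything else on the site:
+^http://www\.example\.com/
```

The internal crawl would use a filter without the `-` rule. Filtering at query time within a single shared index is harder to make safe, since anything stored in the index can potentially leak through summaries or cached content.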