Author: qboosh                       Date: Thu Jul 20 21:03:12 2006 GMT
Module: SPECS                         Tag: HEAD
---- Log message:
- cleanups, missing dir
---- Files affected:
SPECS:
   perl-WWW-RobotRules-Parser.spec (1.1 -> 1.2)

---- Diffs:

================================================================
Index: SPECS/perl-WWW-RobotRules-Parser.spec
diff -u SPECS/perl-WWW-RobotRules-Parser.spec:1.1 SPECS/perl-WWW-RobotRules-Parser.spec:1.2
--- SPECS/perl-WWW-RobotRules-Parser.spec:1.1	Wed Jul 12 22:51:08 2006
+++ SPECS/perl-WWW-RobotRules-Parser.spec	Thu Jul 20 23:03:07 2006
@@ -2,22 +2,23 @@
 #
 # Conditional build:
 %bcond_without	autodeps	# don't BR packages needed only for resolving deps
-%bcond_without	tests	# do not perform "make test"
+%bcond_without	tests		# do not perform "make test"
 #
 %include	/usr/lib/rpm/macros.perl
 %define	pdir	WWW
 %define	pnam	RobotRules-Parser
-Summary:	perl(WWW::RobotRules-Parser)
+Summary:	WWW::RobotRules::Parser - parse robots.txt
+Summary(pl):	WWW::RobotRules::Parser - analiza robots.txt
 Name:		perl-WWW-RobotRules-Parser
 Version:	0.02
 Release:	0.2
-# "same as perl" (as in META.yml
+# "same as perl" (as in META.yml)
 License:	GPLv1 or Artistic
 Group:		Development/Languages/Perl
 Source0:	http://www.cpan.org/modules/by-module/%{pdir}/%{pdir}-%{pnam}-%{version}.tar.gz
 # Source0-md5:	a7b77fdd5084de92522bacb2049ef254
 Patch0:		%{name}-Makefile.PL.patch
-URL:		http://search.cpan.org/dist/WWW-RobotRules-Parser
+URL:		http://search.cpan.org/dist/WWW-RobotRules-Parser/
 BuildRequires:	perl-devel >= 1:5.8.0
 BuildRequires:	rpm-perlprov >= 4.1-13
 %if %{with autodeps} || %{with tests}
@@ -26,21 +27,21 @@
 BuildArch:	noarch
 BuildRoot:	%{tmpdir}/%{name}-%{version}-root-%(id -u -n)
 
-#%define	_noautoreq	'perl(anything_fake_or_conditional)'
-
 %description
 WWW::RobotRules::Parser allows you to simply parse robots.txt files as
-described in http://www.robotstxt.org/wc/norobots.html. Unlike WWW::RobotRules
-(which is very cool), this module does not take into consideration your user
-agent name when parsing. It just parses the structure and returns a hash
-containing the whole set of rules. You can then use this to do whatever you
-like with it.
+described in <http://www.robotstxt.org/wc/norobots.html>. Unlike
+WWW::RobotRules (which is very cool), this module does not take into
+consideration your user agent name when parsing. It just parses the
+structure and returns a hash containing the whole set of rules. You
+can then use this to do whatever you like with it.
 
 %description -l pl
-WWW::RobotRules::Parser pozwala na proste parsowanie pliku robot.txt jak opisano w
-http://www.robotstxt.org/wc/norobots.html, ponadto w przeciwienstwie do WWW::RobotRules
-ten modul nie analizuje podanej nazwy przegladrki, a jedynnie parsuje strukture i zwraca
-tablice asocjacyjna z pełnym zestawem regułek.
+WWW::RobotRules::Parser pozwala na proste analizowanie pliku
+robots.txt zgodnie z opisem w
+<http://www.robotstxt.org/wc/norobots.html>, a ponadto, w
+przeciwieństwie do WWW::RobotRules, ten moduł nie analizuje podanej
+nazwy przeglądarki, a jedynie strukturę i zwraca tablicę asocjacyjną z
+pełnym zestawem regułek.
 
 %prep
 %setup -q -n %{pdir}-%{pnam}-%{version}
@@ -51,7 +52,6 @@
 mv -f MANIFEST.mod MANIFEST
 
 %build
-# Don't use pipes here: they generally don't work. Apply a patch.
 %{__perl} Makefile.PL \
 	INSTALLDIRS=vendor
 
@@ -71,6 +71,7 @@
 %files
 %defattr(644,root,root,755)
 %doc Changes README
+%dir %{perl_vendorlib}/WWW/RobotRules
 %{perl_vendorlib}/WWW/RobotRules/Parser.pm
 %{_mandir}/man3/*
 
@@ -80,6 +81,8 @@
 All persons listed below can be reached at <cvs_login>@pld-linux.org
 
 $Log$
+Revision 1.2  2006/07/20 21:03:07  qboosh
+- cleanups, missing dir
+
 Revision 1.1  2006/07/12 20:51:08  agaran
 - initial, rel 0.2, builds here
-
================================================================

---- CVS-web:
    http://cvs.pld-linux.org/SPECS/perl-WWW-RobotRules-Parser.spec?r1=1.1&r2=1.2&f=u

_______________________________________________
pld-cvs-commit mailing list
[email protected]
http://lists.pld-linux.org/mailman/listinfo/pld-cvs-commit
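
As a usage note for the module being packaged: the behaviour stated in
%description can be exercised roughly as below. This is a minimal sketch,
assuming the parse($uri, $text) method documented on the module's CPAN page
and assuming it returns the rules as a hash reference keyed by User-agent
name, each value listing the Disallow path prefixes; the robots.txt content
and URL are made up for illustration.

    use strict;
    use warnings;
    use WWW::RobotRules::Parser;

    # Made-up robots.txt content, purely for illustration.
    my $text = <<'EOT';
    User-agent: *
    Disallow: /private

    User-agent: ExampleBot
    Disallow: /
    EOT

    my $p = WWW::RobotRules::Parser->new;
    # Unlike WWW::RobotRules, no user-agent matching happens here:
    # parse() just returns the whole rule set from the given robots.txt.
    my $rules = $p->parse('http://www.example.com/robots.txt', $text);

    # Assumed result shape: { 'user agent name' => [ disallowed prefixes ] }
    for my $agent (sort keys %$rules) {
        print "$agent: @{ $rules->{$agent} }\n";
    }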
