[us...@httpd] [Travel Assistance] Applications for ApacheCon EU 2009 - Now Open

2009-01-24 Thread Sander Temme



The Travel Assistance Committee is now accepting applications for those
wanting to attend ApacheCon EU 2009 between the 23rd and 27th March 2009
in Amsterdam.

The Travel Assistance Committee is looking for people who would like to
attend ApacheCon EU 2009 but need some financial support in order to get
there. There are very few places available and the criteria are high.
That aside, applications are open to all open source developers who feel
that their attendance would benefit themselves, their project(s), the
ASF or open source in general.

Financial assistance is available for travel, accommodation and entrance
fees, either in full or in part, depending on circumstances. It is
intended that all our ApacheCon events be covered, so it may be prudent
for those in the United States or Asia to wait until an event closer to
them comes up. You are all welcome to apply for ApacheCon EU, of course,
but there must be compelling reasons for you to attend an event further
from your home location for your application to be considered above
those closer to the event location.

More information can be found on the main Apache website at
http://www.apache.org/travel/index.html - where you will also find a
link to the online application form.

Time is very tight for this event, so applications are open now and will
close on 4th February 2009, to give enough time for travel arrangements
to be made.

Good luck to all those that apply.


Regards,
The Travel Assistance Committee

-
The official User-To-User support forum of the Apache HTTP Server Project.
See http://httpd.apache.org/userslist.html for more info.
To unsubscribe, e-mail: users-unsubscr...@httpd.apache.org
  "   from the digest: users-digest-unsubscr...@httpd.apache.org
For additional commands, e-mail: users-h...@httpd.apache.org



Re: [us...@httpd] robots.txt and rewrite rule

2009-01-24 Thread Eric Covener
On Sat, Jan 24, 2009 at 6:31 PM, André Warnier  wrote:

> Sorry to butt in, but is it not just the RewriteCond that is badly written ?

> So should
> RewriteCond $1 !=robots.txt
> not be
> RewriteCond %1 !=robots.txt

No, the logic in the RewriteCond referring back to the RewriteRule
backreference is just fine.

You might argue that referencing %{REQUEST_URI} is easier for humans
to grok in simpler cases (capturing the entire URI). However, when
there is a relation between your condition and your original capture,
it's nice to express it that way directly.
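[Editorial note: the two styles contrasted here might be sketched as follows; the backend URL is a hypothetical placeholder.]

```apache
# Style 1: re-test the RewriteRule's own capture ($1)
RewriteCond $1 !=robots.txt
RewriteRule ^/(.*) http://backend.example:12080/$1 [P]

# Style 2: test the full URI directly, independent of the capture
RewriteCond %{REQUEST_URI} !=/robots.txt
RewriteRule ^/(.*) http://backend.example:12080/$1 [P]
```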

-- 
Eric Covener
cove...@gmail.com




Re: [us...@httpd] robots.txt and rewrite rule

2009-01-24 Thread André Warnier

Norman Khine wrote:
[...]
Hi.
Sorry to butt in, but is it not just the RewriteCond that is badly written?

From the Apache documentation :
(http://httpd.apache.org/docs/2.2/mod/mod_rewrite.html)

# RewriteRule backreferences: These are backreferences of the form $N
(0 <= N <= 9), which provide access to the grouped parts (in
parentheses) of the pattern, from the RewriteRule which is subject to
the current set of RewriteCond conditions.

# RewriteCond backreferences: These are backreferences of the form %N
(1 <= N <= 9), which provide access to the grouped parts (again, in
parentheses) of the pattern, from the last matched RewriteCond in the
current set of conditions.


So should
RewriteCond $1 !=robots.txt
not be
RewriteCond %1 !=robots.txt

or, maybe better because it is independent of the previous RewriteCond directives:

RewriteCond %{REQUEST_URI} !/robots\.txt$

?
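[Editorial note: the $N / %N distinction quoted above might be illustrated with a small hypothetical sketch, where %1 comes from the RewriteCond capture and $1 from the RewriteRule capture:]

```apache
# Redirect www.example.com/foo to example.com/foo:
# %1 is the host captured by the RewriteCond, $1 the path from the RewriteRule
RewriteCond %{HTTP_HOST} ^www\.(.+)$
RewriteRule ^/(.*)$ http://%1/$1 [R=301,L]
```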





[us...@httpd] Re: UserDir + SetHandler cgi-script broken? (Apache 2.0.52)

2009-01-24 Thread Buck Golemon
Matt McCutchen  mattmccutchen.net> writes:

> 
> On Sat, 2009-01-24 at 09:17 -0500, Brian Mearns wrote:
> > On Fri, Jan 23, 2009 at 6:44 PM, Buck Golemon  amd.com> wrote:
> > > Thanks for the reply, but if I remove the SetHandler directive above, it
> > > displays the file in plaintext just fine. It means both that the UserDir
> > > functions ok by itself, and that the <Directory> section above is being
> > > applied to my home dir.
> > 
> > Hm. About the only other thing I can think of is a permissions issue.
> > I assumed it would always execute scripts as the same user as apache,
> > but maybe for userdirs, it's switching users?
> 
> According to the docs, all CGI scripts in userdirs switch users using
> suexec:
> 
> http://httpd.apache.org/docs/2.0/suexec.html#usage
> 
> Suexec is pretty picky about permissions, so check the suexec_log file
> for any relevant errors.
> 

Thanks! This could well be the issue. I saw all the caveats about suexec,
but skimmed them because I thought I wasn't using it. I'll check it when I
get back to work.
--Buck






Re: [us...@httpd] Firewall causing ProxyPass to fail

2009-01-24 Thread Raj Jay
Thanks Eric! This was helpful.

ProxyRemote works fine when using http with the remote server.

For https requests I get a 502. The ProxyRemote documentation states that
"only http is supported by this module". Is there any workaround for https?

Regards,
-Raj.
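[Editorial note: for reference, ProxyRemote usage might look like the sketch below; the proxy host and port are placeholders, and whether https origin requests can be tunnelled depends on the remote proxy's CONNECT support and on mod_proxy_connect being loaded.]

```apache
# Route all outbound proxy traffic through a corporate forward proxy
# (hostname and port are hypothetical placeholders)
ProxyRemote * http://corporate-proxy.example.com:8080

# The link to the remote proxy itself is plain http; tunnelling https
# additionally requires mod_proxy_connect on this server.
```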

On Fri, Jan 23, 2009 at 8:59 PM, Eric Covener  wrote:

> On Fri, Jan 23, 2009 at 6:07 PM, Raj Jay  wrote:
> > Hi,
> >
> > I am trying to use the ProxyPass directive in my apache config file to
> work
> > around the ajax cross-domain policy.
> >
> > My apache server is hosted behind a corporate firewall. The ProxyPass
> > directive works fine when the remote server resides within the intranet;
> > however, when it points to something outside the intranet, I get an HTTP
> 503.
> >
> > It seems like the apache server is not using the http_proxy environment
> > variable.  I also tried to set it explicitly in my config file as
> follows:
> > SetEnv http_proxy ...
> > SetEnv https_proxy ...
>
> See the ProxyRemote directive
>
> --
> Eric Covener
> cove...@gmail.com
>
>


Re: [us...@httpd] Re: UserDir + SetHandler cgi-script broken? (Apache 2.0.52)

2009-01-24 Thread Matt McCutchen
On Sat, 2009-01-24 at 09:17 -0500, Brian Mearns wrote:
> On Fri, Jan 23, 2009 at 6:44 PM, Buck Golemon  wrote:
> > Thanks for the reply, but if I remove the SetHandler directive above, it
> > displays the file in plaintext just fine. It means both that the UserDir
> > functions ok by itself, and that the <Directory> section above is being
> > applied to my home dir.
> 
> Hm. About the only other thing I can think of is a permissions issue.
> I assumed it would always execute scripts as the same user as apache,
> but maybe for userdirs, it's switching users?

According to the docs, all CGI scripts in userdirs switch users using
suexec:

http://httpd.apache.org/docs/2.0/suexec.html#usage

Suexec is pretty picky about permissions, so check the suexec_log file
for any relevant errors.
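[Editorial note: a rough illustration of the permission rules suexec typically enforces; the paths here are throwaway examples, not the poster's real layout. The script and its directory should be owned by the user and not group- or world-writable.]

```shell
# Create a demo userdir-style CGI script with suexec-friendly permissions.
# Paths are hypothetical; a real setup would use ~user/public_html.
mkdir -p /tmp/demo_public_html
cat > /tmp/demo_public_html/foo.sh <<'EOF'
#!/bin/sh
echo "Content-Type: text/plain"
echo
echo "hello"
EOF
# suexec rejects scripts or directories that are group/world-writable
chmod 755 /tmp/demo_public_html /tmp/demo_public_html/foo.sh
stat -c '%a' /tmp/demo_public_html/foo.sh   # prints: 755
```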

-- 
Matt





Re: [us...@httpd] robots.txt and rewrite rule

2009-01-24 Thread Norman Khine

Hi,


Eric Covener wrote:

On Sat, Jan 24, 2009 at 10:20 AM, Norman Khine  wrote:

[Sat Jan 24 18:46:57 2009] [error] [client 86.219.32.244] client denied by
server configuration: /usr/htdocs


You don't have a <Directory> block that allows you to serve
static files out of the filesystem.  If this is a new DocumentRoot you
added, copy the <Directory> stanza from your original DocumentRoot, or
see the default conf.

minimally:

order allow,deny
allow from all




I have this in my vhost entry:

<Directory "/var/www/localhost/htdocs">
Options Indexes FollowSymLinks
AllowOverride None
Order allow,deny
Allow from all
</Directory>

where "/var/www/localhost/htdocs" is the root of the apache install
files, i.e.


http://my.ip/robots.txt - works

http://mysite-before-rewrite/robots.txt - gets a 403 error

Should I add the /usr/htdocs folder?

I just tried to sym link it, and got this error:

[Sat Jan 24 19:22:04 2009] [error] [client 86.219.32.244] client denied 
by server configuration: /usr/htdocs/robots.txt
[Sat Jan 24 19:22:07 2009] [error] [client 86.219.32.244] client denied 
by server configuration: /usr/htdocs/robots.txt


# ls -al /usr
lrwxrwxrwx   1 rootroot   25 Jan 24 19:21 htdocs -> 
/var/www/localhost/htdocs





Re: [us...@httpd] robots.txt and rewrite rule

2009-01-24 Thread Eric Covener
On Sat, Jan 24, 2009 at 10:20 AM, Norman Khine  wrote:
> [Sat Jan 24 18:46:57 2009] [error] [client 86.219.32.244] client denied by
> server configuration: /usr/htdocs

You don't have a <Directory> block that allows you to serve
static files out of the filesystem.  If this is a new DocumentRoot you
added, copy the <Directory> stanza from your original DocumentRoot, or
see the default conf.

minimally:

order allow,deny
allow from all


-- 
Eric Covener
cove...@gmail.com




Re: [us...@httpd] robots.txt and rewrite rule

2009-01-24 Thread Norman Khine



Bob Ionescu wrote:

2009/1/23 Norman Khine :

RewriteEngine On
#DenyHosts Rules
RewriteMap  hosts-deny txt:/home/user/txt/hosts.deny
RewriteCond ${hosts-deny:%{REMOTE_HOST}|NOT-FOUND} !=NOT-FOUND [OR]
RewriteCond ${hosts-deny:%{REMOTE_ADDR}|NOT-FOUND} !=NOT-FOUND [OR]
RewriteCond ${hosts-deny:%{HTTP:true-client-ip}|NOT-FOUND} !=NOT-FOUND
RewriteCond $1 !=robots.txt
RewriteRule ^/.*  -  [F]


That's the wrong rule; it should be placed above the rule which proxies, i.e.

RewriteCond $1 !=robots.txt
RewriteRule ^/(.*) http://localhost:12080/companies/$1 [P]
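[Editorial note: putting this correction together with the deny-map conditions from the original config gives roughly the sketch below; the map path and backend URL are as posted in the thread, and the ordering is the point of the fix.]

```apache
RewriteEngine On
RewriteMap  hosts-deny txt:/home/user/txt/hosts.deny

# Block requests from hosts/IPs listed in the deny map
RewriteCond ${hosts-deny:%{REMOTE_HOST}|NOT-FOUND} !=NOT-FOUND [OR]
RewriteCond ${hosts-deny:%{REMOTE_ADDR}|NOT-FOUND} !=NOT-FOUND
RewriteRule ^/ - [F]

# Proxy everything except robots.txt, which is served locally
RewriteCond $1 !=robots.txt
RewriteRule ^/(.*) http://localhost:12080/companies/$1 [P]
```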


Thanks, I corrected this, but now I get a 403 Forbidden page.

If I access http://IP/robots.txt it works fine.

$cat rewrite.log

86.219.32.244 - - [24/Jan/2009:18:46:57 +0100] 
[domain.com/sid#81ce720][rid#8240688/initial] (4) RewriteCond: 
input='NOT-FOUND' pattern='!=NOT-FOUND' => not-matched


86.219.32.244 - - [24/Jan/2009:18:46:57 +0100] 
[domain.com/sid#81ce720][rid#8240688/initial] (3) applying pattern 
'^/(.*)' to uri '/robots.txt'


86.219.32.244 - - [24/Jan/2009:18:46:57 +0100] 
[domain.com/sid#81ce720][rid#8240688/initial] (4) RewriteCond: 
input='robots.txt' pattern='!=robots.txt' => not-matched


86.219.32.244 - - [24/Jan/2009:18:46:57 +0100] 
[domain.com/sid#81ce720][rid#8240688/initial] (1) pass through /robots.txt



$cat error.log
[Thu Jan 22 18:23:51 2009] [error] (111)Connection refused: proxy: HTTP: 
attempt to connect to [::1]:12081 (*) failed



[Fri Jan 23 20:45:26 2009] [error] (111)Connection refused: proxy: HTTP: 
attempt to connect to [::1]:12080 (*) failed


[Sat Jan 24 18:42:53 2009] [error] [client 86.219.32.244] client denied 
by server configuration: /usr/htdocs


[Sat Jan 24 18:43:34 2009] [error] [client 86.219.32.244] client denied 
by server configuration: /usr/htdocs

[Sat Jan 24 18:44:57 2009] [error] [client 86.219.32.244] client denied 
by server configuration: /usr/htdocs


[Sat Jan 24 18:46:57 2009] [error] [client 86.219.32.244] client denied 
by server configuration: /usr/htdocs



Cheers

Norman




Re: [us...@httpd] Re: UserDir + SetHandler cgi-script broken? (Apache 2.0.52)

2009-01-24 Thread Brian Mearns
On Fri, Jan 23, 2009 at 6:44 PM, Buck Golemon  wrote:
> Brian Mearns  gmail.com> writes:
>
>>
>> On Thu, Jan 22, 2009 at 8:16 PM, Buck Golemon  amd.com> wrote:
>> > works just fine:
>> > http://pdweb.ca.atitech.com/beg/foo.sh
>> >
>> > doesn't work:
>> > http://pdweb.ca.atitech.com/~bgolemon/foo.sh
>> >
>> >
>> > Here's the relevant configuration.
>> > 
>> >UserDir public_html
>> >UserDir disabled root
>> > 
>> > Alias /beg/ "/user/bgolemon/public_html/"
>> > <Directory "/user/bgolemon/public_html/">
>> >Options ExecCGI
>> >SetHandler cgi-script
>> >AllowOverride None
>> >Allow from all
>> >Order allow,deny
>> > </Directory>
>> >
> ...
>> >
>> >
>> > Are there any known issues with this? How can I get this to work? This
>> > makes me feel like either the cgi-script handler or the UserDir module
>> > is broken.
>> >
>> > Thanks in advance,
>> > --Buck
>>
>> Are you able to access anything in your userdir?
>>
>> You might need another Directory tag for "~bgolemon".
>> -Brian
>>
>
> Thanks for the reply, but if I remove the SetHandler directive above, it
> displays the file in plaintext just fine. It means both that the UserDir
> functions ok by itself, and that the <Directory> section above is being
> applied to my home dir.
>
> --Buck

Hm. About the only other thing I can think of is a permissions issue.
I assumed it would always execute scripts as the same user as apache,
but maybe for userdirs, it's switching users? Sorry, can't think of
anything else that might cause this.

-Brian

-- 
Feel free to contact me using PGP Encryption:
Key Id: 0x3AA70848
Available from: http://pgp.mit.edu/




Re: [us...@httpd] content handler question

2009-01-24 Thread André Warnier

anson ho wrote:

This is one of the possible solutions that I was thinking of, but it
seems it would make the environment more complex, and even worse in a
cluster environment. I am wondering whether it is possible to do
something like mod_headers. But first I need to ensure that I can read
the existing headers in my own module. If the mod_headers-like approach
doesn't work, I will try the proxy solution. Or does anyone have another
good alternative?

mod_perl will allow you to do just about anything you want to HTTP
headers.  Would that be an option? (http://perl.apache.org)
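[Editorial note: a minimal sketch of what that could look like under mod_perl2; the package name and header choices are hypothetical and the code is untested.]

```perl
package My::HeaderPeek;   # hypothetical package name
# Configure with, e.g.:  PerlFixupHandler My::HeaderPeek
use strict;
use warnings;
use Apache2::RequestRec ();
use APR::Table ();
use Apache2::Const -compile => qw(DECLINED);

sub handler {
    my $r = shift;
    # read an existing request header...
    my $ua = $r->headers_in->{'User-Agent'} || '';
    # ...and add/modify a response header based on it
    $r->headers_out->set('X-Seen-UA' => $ua);
    return Apache2::Const::DECLINED;   # let the request continue normally
}
1;
```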

