Massimo has a script that blocks the IPs of anything it interprets as a
denial-of-service attack, including crawlers that fetch content without
obeying robots.txt. I am building an experimental search engine at work
and accidentally bypassed robots.txt on one of my debug runs, and I've
been blocked ever since. The site works fine from home, and DNS resolves
correctly from work, so the block is specific to my work IP.
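
For what it's worth, here is a minimal sketch of the check I should have
kept enabled in the crawler, using Python's standard urllib.robotparser
(the user agent string and target URL below are just placeholders, not
the actual values from my crawler):

from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("http://web2py.com/robots.txt")
rp.read()  # fetch and parse the site's robots.txt

agent = "ExperimentalSearchBot"  # placeholder user agent
url = "http://web2py.com/examples/default/index"  # placeholder URL

if rp.can_fetch(agent, url):
    print("OK to crawl:", url)
else:
    print("Disallowed by robots.txt, skipping:", url)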

On Mar 9, 2:42 am, Yarko Tymciurak <resultsinsoftw...@gmail.com>
wrote:
> I see web2py.com responding nicely... maybe something is blocking it
> for you?
>
> have a try at http://68.169.39.35/
>
> Or try nslookup on web2py.com to see if your DNS server returns
> something valid.
>
> On Mar 9, 2:18 am, Sky <hmonfa...@gmail.com> wrote:
>
> > As far as I can see, web2py.com has been down for two days now.
> > Is there anybody who can inform the site administrator?
