The reason I don't use the crawler as an example is that people then start
to rationalize "well, it's an internal app only..."
But proxies and prefetchers at corporate firewalls, airport networks, mobile
carriers, etc. are fairly common, and they can sit in front of even internal
applications. It's not about "making the web happy"; it's about following
the same rules that the rest of the web's infrastructure relies on.
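
To make it concrete, here is a minimal Rack-style sketch (hypothetical route
and handler names, not code from any real app): the destructive work is gated
on the HTTP method, so anything that assumes GET is safe (a crawler, a
prefetching proxy, a cache) can't delete data just by following a link.

  # Hypothetical config.ru sketch, assuming plain Rack (not Merb's router).
  require 'rack'

  class Items
    def call(env)
      req = Rack::Request.new(env)

      if req.path =~ %r{\A/items/(\d+)\z}
        id = Regexp.last_match(1)
        case req.request_method
        when 'GET'
          # Safe and idempotent: reading never changes state.
          [200, { 'Content-Type' => 'text/plain' }, ["item #{id}\n"]]
        when 'POST', 'DELETE'
          # Only non-GET methods are allowed to destroy anything.
          # A real app would call something like delete_item(id) here.
          [200, { 'Content-Type' => 'text/plain' }, ["deleted item #{id}\n"]]
        else
          [405, { 'Allow' => 'GET, POST, DELETE' }, ["method not allowed\n"]]
        end
      else
        [404, { 'Content-Type' => 'text/plain' }, ["not found\n"]]
      end
    end
  end

  run Items.new

A spider that GETs /items/42 just reads the item back; the delete only happens
on an explicit POST or DELETE, which a plain link can never send.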

-- Yehuda

On Thu, Jan 8, 2009 at 8:17 AM, Michael D. Ivey <[email protected]> wrote:

>
> On Jan 8, 2009, at 10:04 AM, Jim Freeze wrote:
> > I think the specific problem here (and the real reason this should not
> > be done) is that a crawler (using GET requests) can actually delete an
> > object.
> > To me, that is the real danger, the real issue, and the reason for
> > avoiding this type of web coding, not because "it's evil", or "it will
> > make the web unhappy", or "Matt won't be my friend". ;-)
>
> That's why it's evil and that's why it makes the web unhappy. They're
> the same thing.

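And Jim's crawler scenario is easy to see in code. A naive spider really is
this simple (made-up host and paths, purely for illustration):

  # naive_spider.rb: hypothetical sketch, not a real bot.
  require 'net/http'
  require 'uri'

  start = URI('http://internal-app.example/items')   # made-up internal host
  html  = Net::HTTP.get(start)

  # Scrape every href on the page and fetch it, the way a spider or a
  # link-prefetching proxy would.
  html.scan(/href="([^"]+)"/).flatten.each do |href|
    link = URI.join(start, href)
    puts "GET #{link}"
    Net::HTTP.get(link)   # if this is /items/42/delete, item 42 is gone
  end

If the app hangs its deletes off plain GET links, that loop wipes data without
the spider ever meaning to; if deletes only answer to POST or DELETE, the same
loop is harmless.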

-- 
Yehuda Katz
Developer | Engine Yard
(ph) 718.877.1325
