On Wed, Aug 13, 2003 at 08:17:14AM -0700, Chris Shiflett wrote:
> why do sites such as Google, Amazon, TicketMaster, Yahoo!, and eBay use
> them?
The difficulty with cookies is when a site doesn't work without them.
Fortunately, Google works fine when cookies are disabled. Amazon is a
great
--- Analysis Solutions [EMAIL PROTECTED] wrote:
> That's a good guess! Yet further proof that cookies suck, except
> the ones made with flour, shortening and sugar, of course.
That's a pretty harsh description of one of Netscape's greatest contributions
to the Web (the other being SSL). With a
Hey Mike:
On Tue, Aug 12, 2003 at 07:33:31PM -0700, Mike Migurski wrote:
> > That's a good guess! Yet further proof that cookies suck, except the
> > ones made with flour, shortening and sugar, of course.
> Huh? seems like further proof that cookies are working as intended:
> serving up individual,
I have noticed that sometimes I cannot fopen($web_address,'r') or use
any similar files if the web address contains a form get in it. (i.e.
ends in a ?var1=xxx&var2=xxx...). I was wondering if there is any way
to spoof a get request to a web address? I looked at fsockopen and
think I need to
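Dan's symptom is usually the query string itself, not fopen(): if the `&` between pairs goes missing or a value carries unencoded characters, the request breaks. A minimal sketch of reading such a URL with the stream wrappers, assuming allow_url_fopen is enabled; example.com, search.php, and the parameter names are placeholders, not anything from the thread:

```php
<?php
// Build the GET query safely: http_build_query() URL-encodes each
// value and joins the pairs with '&', so characters like '&' in a
// value cannot break the URL.
$params = array('var1' => 'xxx', 'var2' => 'xx&x');
$url = 'http://example.com/search.php?' . http_build_query($params);

// fopen() treats the whole string, query included, as the target.
// A stream context caps the wait so a dead host doesn't hang us.
$ctx = stream_context_create(array('http' => array('timeout' => 5)));
$fp = @fopen($url, 'r', false, $ctx);
if ($fp !== false) {
    $body = stream_get_contents($fp);
    fclose($fp);
}
```

Note the '&' inside the second value comes out as %26, so the server sees two parameters, not three.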
Hey Chris:
On Wed, Aug 13, 2003 at 08:44:40AM -0700, Chris Shiflett wrote:
> --- Analysis Solutions [EMAIL PROTECTED] wrote:
> > The difficulty with cookies is when a site doesn't work without
> > them.
> Right, which is an example of developers that suck, not the technology.
You're right. When I
Analysis Solutions wrote:
> By that logic, you would think that PHP sucks, because there are plenty of
> sucky PHP applications. :-)
I've had folks say things along these lines. Considering the number of
PHP related vulnerabilities showing up in Bugtraq/Security Focus
newsletter, PHP must
> > But some web pages when I cut and paste the URLs don't work. Like when
> > I search for something on eBay. Could this be because of cookies?
> That's a good guess! Yet further proof that cookies suck, except the
> ones made with flour, shortening and sugar, of course.
Huh? seems like further proof
Take a look at cURL: http://www.php.net/manual/en/ref.curl.php
On Tue, 2003-08-12 at 20:13, Dan Anderson wrote:
> I have noticed that sometimes I cannot fopen($web_address,'r') or use
> any similar files if the web address contains a form get in it. (i.e.
> ends in a ?var1=xxx&var2=xxx...). I was
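For the cURL route suggested above, a minimal GET sketch. It is guarded with function_exists() so it degrades gracefully when the extension isn't compiled in (the compile question comes up later in the thread); the URL is a placeholder:

```php
<?php
// Sketch: fetching a GET URL with the cURL extension instead of fopen().
if (function_exists('curl_init')) {
    $ch = curl_init('http://example.com/search.php?var1=xxx&var2=xxx');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body as a string
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);           // don't hang on a dead host
    $body = curl_exec($ch);                         // false on failure
    curl_close($ch);
}
```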
Hey Dan:
On Tue, Aug 12, 2003 at 08:13:32PM -0400, Dan Anderson wrote:
> I have noticed that sometimes I cannot fopen($web_address,'r') or use
> any similar files if the web address contains a form get in it. (i.e.
> ends in a ?var1=xxx&var2=xxx...).
It should work. You said sometimes. What are
Doesn't this have to be precompiled in?
-Dan
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
Yeah, I believe on Apache/Linux, it needs to be compiled in, and on
Windows there is just some configuration and file moving to do.
cURL will do what you need it to; there is even code on the PHP site for
making the POST happen.
Matt
On Tue, 2003-08-12 at 20:27, Dan Anderson wrote:
> Doesn't
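Along the lines of the manual examples Matt points to, a hedged sketch of a cURL POST; the URL and field names are illustrative placeholders, not anything from the thread:

```php
<?php
// Sketch: sending form fields as a POST with cURL.
if (function_exists('curl_init')) {
    $fields = array('var1' => 'xxx', 'var2' => 'xxx');
    $ch = curl_init('http://example.com/process.php');
    curl_setopt($ch, CURLOPT_POST, true);
    // Passing a urlencoded string sends application/x-www-form-urlencoded;
    // passing the array itself would send multipart/form-data instead.
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($fields));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);
    $response = curl_exec($ch);
    curl_close($ch);
}
```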
> --- Analysis Solutions [EMAIL PROTECTED] wrote:
> > The difficulty with cookies is when a site doesn't work without
> > them.
> Right, which is an example of developers that suck, not the technology.
By that logic, you would think that PHP sucks, because there are plenty of
sucky PHP applications. :-)
Hey Dan:
On Tue, Aug 12, 2003 at 09:12:45PM -0400, Dan Anderson wrote:
> But some web pages when I cut and paste the URLs don't work. Like when
> I search for something on eBay. Could this be because of cookies?
That's a good guess! Yet further proof that cookies suck, except the ones
made
> It should work. You said sometimes. What are the times it doesn't work?
I have a script to grab various info from web pages. Insert a web page
address and it outputs the results. It's useful for different things.
But some web pages when I cut and paste the URLs don't work. Like when
I
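The eBay symptom fits the cookie guess made earlier in the thread: if a site only honors its search URLs inside a cookie-backed session, a pasted URL fails cold. cURL can store and replay cookies across requests. A sketch under that assumption; the URL and cookie-jar location are illustrative:

```php
<?php
// Sketch: persisting session cookies between cURL requests.
if (function_exists('curl_init')) {
    $jar = tempnam(sys_get_temp_dir(), 'cookies'); // writable scratch file
    $ch = curl_init('http://example.com/search?q=widgets');
    curl_setopt($ch, CURLOPT_COOKIEJAR, $jar);   // write cookies here on close
    curl_setopt($ch, CURLOPT_COOKIEFILE, $jar);  // send them back on later requests
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);
    $page = curl_exec($ch);
    curl_close($ch);
}
```

Reusing the same handle (or the same jar file across runs) keeps the session alive, so URLs that depend on it start working.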
14 matches