Re: [PHP] high traffic websites

2013-09-19 Thread Negin Nickparsa
It may be helpful for someone:
I found GTmetrix kind of helpful, almost magic. http://gtmetrix.com/#!


Sincerely
Negin Nickparsa




[PHP] high traffic websites

2013-09-18 Thread Negin Nickparsa
In general, what are the best ways to handle high traffic websites?

VPS(clouds)?
web analyzers?
dedicated servers?
distributed memory cache?


Sincerely
Negin Nickparsa


Re: [PHP] high traffic websites

2013-09-18 Thread Sebastian Krebs
2013/9/18 Negin Nickparsa nickpa...@gmail.com

 In general, what are the best ways to handle high traffic websites?

 VPS(clouds)?
 web analyzers?
 dedicated servers?
 distributed memory cache?


Yes :)

But seriously: that is a topic most of us have spent much time getting into.
You can explain it with a bunch of buzzwords. Additionally, how do you define
"high traffic websites"? Do you already _have_ such a site? Or do you
_want_ one? It's important, because I've seen it far too often that
projects spent too much effort on their "high traffic" infrastructure and
in the end it wasn't that high traffic ;) I won't say that you cannot be
successful, but you should start with an effort you can handle.

Regards,
Sebastian




 Sincerely
 Negin Nickparsa




-- 
github.com/KingCrunch


Re: [PHP] high traffic websites

2013-09-18 Thread Negin Nickparsa
Thank you Sebastian.. actually, I will already have one if I qualify for the
job. Yes, and I may fail to handle it; that's why I asked for guidance.
I wanted some tidbits to start with. I have searched through YSlow,
HTTrack and others.
I have also searched through the PHP list in my email before asking this
question. It is the kind of question that benefits everybody and has not been
asked directly.


Sincerely
Negin Nickparsa





Re: [PHP] high traffic websites

2013-09-18 Thread Camilo Sperberg

Your question is way too vague to be answered properly... My best guess would
be that it depends severely on the type of website you have and on how the
current implementation is being, well... implemented.

Simply said: what works for Facebook may/will not work for LinkedIn, Twitter or
Google, mainly because the type of search differs A LOT: Facebook is about
relations between people, Twitter is about small pieces of data not heavily
interconnected with each other, while Google is all about links and every type
of content: from little pieces of information through the whole of Wikipedia.

You could start by studying how Varnish and Redis/memcached work, you could
study how proxies work (nginx et al.), CDNs and that kind of stuff, but if
you want more specific answers, you had better ask a specific question.
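
As a concrete starting point, here is a minimal sketch of the kind of fragment
caching memcached is used for, written against the pecl/memcached extension;
the server address, cache key, lifetime and the render_teaser_list() helper
are invented for illustration:

<?php
// Minimal fragment-caching sketch with the pecl/memcached extension.
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

$key  = 'homepage:teaser-list';          // illustrative cache key
$html = $mc->get($key);

if ($html === false) {
    // Cache miss: build the expensive fragment once...
    $html = render_teaser_list();        // hypothetical helper doing the heavy DB work
    // ...and keep it for 60 seconds so following requests skip the work.
    $mc->set($key, $html, 60);
}

echo $html;
?>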

In the PHP area, an opcode cache does the job very well and can accelerate
page load by several orders of magnitude; I recommend OPcache, which is already
included in PHP 5.5.

Greetings.


--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] high traffic websites

2013-09-18 Thread Negin Nickparsa
Thank you Camilo

To be more detailed: suppose the website has 80,000 users, each page
takes 200 ms to render, and you get a thousand hits per second, so we
want to reduce the rendering time. Is there any way to do that?

Another thing: suppose they upload files simultaneously, and the
videos are hosted on the website itself, not on another server like YouTube,
so the streams are really consuming the bandwidth.

Also, it is troublesome to take backups: when backing up a large bulk of
data you run into locking problems.



Sincerely
Negin Nickparsa





Re: [PHP] high traffic websites

2013-09-18 Thread Sebastian Krebs
2013/9/18 Negin Nickparsa nickpa...@gmail.com

 Thank you Camilo

 to be more in details,suppose the website has 80,000 users and each page
 takes 200 ms to be rendered and you have thousand hits in a second so we
 want to reduce the time of rendering. is there any way to reduce the
 rendering time?


Read about frontend-/proxy-caching (Nginx, Varnish) and ESI/SSI includes
(also Nginx and Varnish ;)). The idea is simply: "If you don't have to
process something on every request in the backend, don't process it in the
backend on every request."
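
For illustration, a rough sketch of what that can look like as an Nginx
fastcgi cache in front of PHP-FPM; the paths, zone name and cache lifetime
are examples only, not a tested configuration:

# Nginx fastcgi caching sketch; paths, zone name and lifetimes are illustrative.
fastcgi_cache_path /var/cache/nginx levels=1:2 keys_zone=phpcache:10m max_size=256m;

server {
    listen 80;

    location ~ \.php$ {
        fastcgi_pass unix:/var/run/php-fpm.sock;
        include fastcgi_params;

        fastcgi_cache phpcache;
        fastcgi_cache_key "$scheme$request_method$host$request_uri";
        fastcgi_cache_valid 200 1m;   # serve cached pages for one minute
    }
}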

But maybe you mixed up some words, because the "rendering time" is the time
consumed by the renderer within the browser (HTML and CSS). You can improve
this by improving your HTML/CSS :)


I am a little bit curious: Do you _really_ have 1000 requests/second, or do
you just throw some numbers in? ;)



 other thing is suppose they want to upload files simultaneously and the
 videos are in the website not on another server like YouTube and so streams
 are really consuming the bandwidth.


Well, if there are streams, there are streams. I cannot imagine another way
someone could stream a video without downloading it.



 Also,It is troublesome to get backups,when getting backups you have
 problem of lock backing up with bulk of data.


Even at times when there is not that much traffic? An automatic backup at
3:00 in the morning, for example?
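
For illustration, a crontab entry along those lines; the database name, user,
password and paths are made up, and --single-transaction only avoids long
table locks for InnoDB tables:

# Illustrative crontab entry: dump the database at 03:00 (% escaped for cron).
0 3 * * * mysqldump --single-transaction -u backup -p'secret' mydb | gzip > /var/backups/mydb-$(date +\%F).sql.gz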










-- 
github.com/KingCrunch


Re: [PHP] high traffic websites

2013-09-18 Thread Stuart Dallas
On 18 Sep 2013, at 12:50, Negin Nickparsa nickpa...@gmail.com wrote:

 to be more in details,suppose the website has 80,000 users and each page
 takes 200 ms to be rendered and you have thousand hits in a second so we
 want to reduce the time of rendering. is there any way to reduce the
 rendering time?
 
 other thing is suppose they want to upload files simultaneously and the
 videos are in the website not on another server like YouTube and so streams
 are really consuming the bandwidth.
 
 Also,It is troublesome to get backups,when getting backups you have problem
 of lock backing up with bulk of data.

Your question is impossible to answer efficiently without profiling. You need 
to know what PHP is doing in those 200ms before you can target your 
optimisations for maximum effect.

I use Xdebug to produce trace files. From there I can see exactly what is
taking the most time, and then I can look into how to make that thing
faster. When I'm certain there is no faster way to do what it's doing, I
move on to the next biggest thing.
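
For reference, a sketch of an Xdebug 2-era tracing setup in php.ini (the
setting names changed in later Xdebug versions, and the paths are examples):

; Xdebug 2-style function tracing; adjust to your Xdebug version.
zend_extension=xdebug.so
xdebug.auto_trace=1                ; start a trace on every request
xdebug.trace_output_dir=/tmp/traces
xdebug.trace_format=1              ; machine-readable, easier to post-process
xdebug.collect_params=3            ; include argument values in the trace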

Of course there are generic things you should do such as adding an opcode cache 
and looking at your server setup, but targeted optimisation is far better than 
trying generic stuff.

-Stuart

-- 
Stuart Dallas
3ft9 Ltd
http://3ft9.com/

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] high traffic websites

2013-09-18 Thread Negin Nickparsa
I am a little bit curious: Do you _really_ have 1000 requests/second, or do
you just throw some numbers in? ;)

Sebastian, supposedly_asking_to_get_some_pre_evaluation :)

Even at times when there is not that much traffic? An automatic backup at
3:00 in the morning, for example?

3:00 in the morning in one country is 9 AM in another country and 3 PM in
yet another.

By the way, thank you so much guys; I wanted tidbits and you gave me more.

Stuart, I recall your replies in other situations and you have always helped
me improve. The list is happy to have you.

Sincerely
Negin Nickparsa





Re: [PHP] high traffic websites

2013-09-18 Thread Camilo Sperberg

On Sep 18, 2013, at 14:26, Haluk Karamete halukkaram...@gmail.com wrote:

 I recommend OPCache, which is already included in PHP 5.5.
 
 Camilo,
 I'm just curious about the disadvantageous aspects of OPcache. 
 
 My logic says there must be some issues with it; otherwise it would have come
 already enabled.
 
 Sent from iPhone 
 
 


The original RFC states: 

https://wiki.php.net/rfc/optimizerplus
The integration proposed for PHP 5.5.0 is mostly 'soft' integration. That means 
that there'll be no tight coupling between Optimizer+ and PHP; Those who wish 
to use another opcode cache will be able to do so, by not loading Optimizer+ 
and loading another opcode cache instead. As per the Suggested Roadmap above, 
we might want to review this decision in the future; There might be room for 
further performance or functionality gains from tighter integration; None are 
known at this point, and they're beyond the scope of this RFC.

So that's why OPCache isn't enabled by default in PHP 5.5
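
For completeness, enabling the bundled extension is just a few php.ini lines
on a Unix-style PHP 5.5 build; the numbers below are common starting points
rather than tuned values:

; Enable the bundled OPcache (Unix-style build).
zend_extension=opcache.so
opcache.enable=1
opcache.memory_consumption=128     ; MB of shared memory for compiled scripts
opcache.max_accelerated_files=4000
opcache.revalidate_freq=60         ; only re-check file timestamps once a minute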

Greetings.


--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] high traffic websites

2013-09-18 Thread Sebastian Krebs
2013/9/18 Camilo Sperberg unrea...@gmail.com




Also worth mentioning: it is the first release with an opcode cache
integrated. Giving the others a few releases to get used to it sounds useful
:)







-- 
github.com/KingCrunch


[PHP] no traffic

2012-03-06 Thread Lawrence Decker
I've been playing with PHP for about 6 years and I have no idea why this is
happening... I've been writing a script to auth to AD.  When I run the
script on my dev box, nothing.  I have wireshark running in the background
on the dev box, I can see the script's traffic go out and hit the DNS
server but no other traffic. Command line, no problem talking to other
hosts with whatever port I'm trying to hit.  On my box, all the scripts
work fine.  LDAP is enabled, but I can't hit ANY port other than DNS and if
I use the IP in the script, I see no traffic.  Both are FC16-64 patched as
of last week. I matched line-by-line in the phpinfo() on my box and the dev
box - no difference.  Used this script to try any port open on other hosts
but no traffic shows up in wireshark!! Any ideas?


Lawrence



<?php
 function ping($host,$post=25,$timeout=6)

 {
  $fsock = fsockopen($host, $port, $errno, $errstr, $timeout);
  if ( ! $fsock )
  {
   return FALSE;
  }
  else
  {
   return TRUE;
  }
 }

/* check if the host is up $host can also be an ip address */
$host = 'mail.bac.com';
$up = ping($host);

/* optionally display either a red or green image to signify the server
status */
echo '<img src="'.($up ? 'on' : 'off').'.jpg" alt="'.($up ? 'up' :
'down').'" />';

?>


or this one



<?php

//using ldap bind anonymously

// connect to ldap server
$ldapconn = ldap_connect("10.13.3.10")
or die("Could not connect to LDAP server.");

if ($ldapconn) {

// binding anonymously
$ldapbind = ldap_bind($ldapconn);

if ($ldapbind) {
echo "LDAP bind anonymous successful...";
} else {
echo "LDAP bind anonymous failed...";
}

}

?>



phpinfo()

LDAP Support => enabled
RCS Version => $Id: ldap.c 321634 2012-01-01 13:15:04Z felipe $
Total Links => 0/unlimited
API Version => 3001
Vendor Name => OpenLDAP
Vendor Version => 20426
SASL Support => Enabled


Re: [PHP] no traffic

2012-03-06 Thread Charles
On Tue, Mar 6, 2012 at 8:55 PM, Lawrence Decker lld0...@gmail.com wrote:

Have you checked that it's not a firewall problem? e.g. by running

# telnet server-ip ldap

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] no traffic

2012-03-06 Thread Mike Mackintosh
On Mar 6, 2012, at 8:55, Lawrence Decker lld0...@gmail.com wrote:


How many interfaces are on your box? From the cli can you telnet 10.13.3.10 389

Also do a netstat -na | grep 389

What returns, any open outgoing sockets?

Mike Mackintosh
ZCE PHP5.3
www.highonphp.com
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] no traffic

2012-03-06 Thread Frank Arensmeier

On 6 Mar 2012, at 15.29, Mike Mackintosh wrote:

 On Mar 6, 2012, at 8:55, Lawrence Decker lld0...@gmail.com wrote:
 
 <?php
 function ping($host,$post=25,$timeout=6)
 
 {
 $fsock = fsockopen($host, $port, $errno, $errstr, $timeout);
 if ( ! $fsock )
 {
  return FALSE;
 }
 else
 {
  return TRUE;
 }
 }

Have you noticed that you have a typo in your function? '$post' should be 
'$port'...

/frank


--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] no traffic

2012-03-06 Thread Lawrence Decker
I can cli to any host/port that's open, firewall's wide open

fc-lawrence:~# telnet ad1.bac.com 389
Trying 10.13.3.10...
Connected to ad1.bac.com.
Escape character is '^]'.
^CConnection closed by foreign host.


# iptables -nL
Chain INPUT (policy ACCEPT)
target prot opt source   destination

Chain FORWARD (policy ACCEPT)
target prot opt source   destination

Chain OUTPUT (policy ACCEPT)
target prot opt source   destination




On Tue, Mar 6, 2012 at 9:29 AM, Mike Mackintosh 
mike.mackint...@angrystatic.com wrote:


 How many interfaces are on your box? From the cli can you telnet
 10.13.3.10 389

 Also do a netstat -na | grep 389

 What returns, any open outgoing sockets?

 Mike Mackintosh
 ZCE PHP5.3
 www.highonphp.com


Re: [PHP] no traffic

2012-03-06 Thread Lawrence Decker
Thanks Frank, corrected but still the same problem...

On Tue, Mar 6, 2012 at 9:33 AM, Frank Arensmeier farensme...@gmail.com wrote:


 Have you noticed that you have a typo in your function? '$post' should be
 '$port'...

 /frank




Re: [PHP] no traffic

2012-03-06 Thread Charles
On Tue, Mar 6, 2012 at 8:55 PM, Lawrence Decker lld0...@gmail.com wrote:

Do you have selinux enabled on your dev box?

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] no traffic

2012-03-06 Thread Lawrence Decker
YEA, that was it!!! Yes, SELinux is enabled. I checked the SELinux log
and saw all the connection failures from httpd... Excellent, thanks; it's
been driving me nuts!!!
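
A typical way to confirm and relax this on an SELinux box, assuming the stock
targeted policy (boolean names vary between policy versions, so treat it as a
sketch):

# run as root
getenforce                                   # Enforcing / Permissive / Disabled
grep denied /var/log/audit/audit.log         # look at the AVC denials for httpd
setsebool -P httpd_can_network_connect on    # allow httpd/PHP to open outbound sockets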

On Tue, Mar 6, 2012 at 10:13 AM, Charles peac...@gmail.com wrote:

 Do you have selinux enabled on your dev box?



[PHP] Re: Traffic throttling

2009-07-21 Thread Michelle Konzack
Hello Stuart,

On 2009-07-21 16:39:30, Stuart wrote:
 http://php.net/usleep

Thank you, that it was.
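
For reference, a minimal sketch of that approach: send the file in fixed-size
chunks and usleep() between them. The file name, chunk size and target rate
below are invented:

<?php
// Throttled download sketch (headers such as Content-Type omitted for brevity).
$fp = fopen('/path/to/download.zip', 'rb');
if (!$fp) { die('cannot open file'); }

$chunk = 8192;          // bytes per chunk
$rate  = 50 * 1024;     // target roughly 50 KB/s

while (!feof($fp)) {
    echo fread($fp, $chunk);
    flush();
    usleep((int) ($chunk / $rate * 1000000));   // pause long enough to hit the rate
}
fclose($fp);
?>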

Greetings and nice Day/Evening
Michelle Konzack
Systemadministrator
Tamay Dogan Network
Debian GNU/Linux Consultant

-- 
Linux-User #280138 with the Linux Counter, http://counter.li.org/
# Debian GNU/Linux Consultant #
http://www.tamay-dogan.net/ Michelle Konzack
http://www.can4linux.org/   c/o Vertriebsp. KabelBW
http://www.flexray4linux.org/   Blumenstrasse 2
Jabber linux4miche...@jabber.ccc.de   77694 Kehl/Germany
IRC #Debian (irc.icq.com) Tel. DE: +49 177 9351947
ICQ #328449886Tel. FR: +33  6  61925193


signature.pgp
Description: Digital signature


[PHP] heavy traffic portal site

2003-06-03 Thread Adrian Teasdale
Hi there

We have been contacted about creating a portal site which will have some
heavy usage.  They are talking about having 100,000 subscribed users to the
system which will have the following:

1. Web based email

2. Calendar (for the person's own use, not shared)

3. File store (and sharing) and image store (and sharing)

and a few other things.  The above are the areas that we think would create
the most load on the system.  Firstly, does anyone have an Open Source
application that they would recommend for the above and are there any
examples of other people using this in the real world? If there is no
application that would handle all of the above, what would people suggest?

Thanks

Ade


-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] heavy traffic portal site

2003-06-03 Thread Mark
I see you posted this to the Horde list, so you're aware of that
project. Horde has been used in a few very large installations, but
more important than the number of users is the load on the app. Will
the 100k users be on it very frequently? Rarely? What kind of hit
level are we talking about? Much of that could also be dependent on
the hardware you have available.

Also, are you looking for a single, (semi)-integrated app such as
Horde, or would you be interested in individual solutions that could
be merged together?

--- Adrian Teasdale [EMAIL PROTECTED] wrote:


=
Mark Weinstock
[EMAIL PROTECTED]
***
You can't demand something as a right unless you are willing to fight to death to 
defend everyone else's right to the same thing.
***

__
Do you Yahoo!?
Yahoo! Calendar - Free online calendar with sync to Outlook(TM).
http://calendar.yahoo.com

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



RE: [PHP] heavy traffic portal site

2003-06-03 Thread Adrian Teasdale
Mark

Thanks for the quick reply. Basically, I think that the users will be using
it quite frequently - well, that's my client's idea of how it will work! In
terms of hit level, we really don't know at this stage.  In terms of
hardware, they have a good budget and we would be looking to work with
someone like Rackspace to cover all the hardware issues.

Regarding your last comment, we aren't looking necessarily for a single app
to cover everything, but this is why Horde was looked at first. However, we
will probably have to do some fairly heavy tweaking with whatever we go with
so that everything fits together (as there are a few other PHP apps that
need to be built and integrated into the whole). One aspect is that they
want to skin the whole site so that it can be altered. On top of that there
is a crazy deadline too, just to make things real fun!
crazy deadline too, just to make things real fun!

Any comments, laughter, suggestions - all appreciated

Ade

 -Original Message-
 From: Mark [mailto:[EMAIL PROTECTED]
 Sent: 03 June 2003 11:27
 To: [EMAIL PROTECTED]; [EMAIL PROTECTED]
 Subject: Re: [PHP] heavy traffic portal site








-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



[PHP] outbound traffic, sessions reg expressions

2002-01-25 Thread Justin French

Hi,

I want to create a file (out.php???) which measures and keeps stats on
the sites we link OUT to, including affiliates... I've got a few ideas
how to do this, so that's not the problem... the problem is that a lot
of the content on the site is contributed through writers, not
programmers, and since we know and trust these contributors, we're
comfortable with them being allowed to add links in the text, etc. etc.,
and don't strip those tags.

However, it's one thing to teach a whole heap of writers to link in this way:
<A HREF="http://somewhere" TARGET="_new">click</a>

but it's a lot harder (or perhaps more prone to errors, etc.) to ask
them to do something like:
<A HREF="out.php?url=http://somewhere.com">click</a>

... and then if it's wise to take the special characters out, replacing
them with ASCII (e.g. %20), it's virtually impossible.


THEN I thought about the fact that I'll be looking to add URL-based
sessions to the site soon, which will require all links to carry a
session id...  again, too much to ask of them, and too unreliable.


So, one solution would be to use regular expressions to analyse text for
links, determine if they are internal or external, check for
TARGET="_new", encode sessions, etc. etc.  A big job for me, maybe not for
some of the regexp pros :)

OR, another option would be to establish a custom tag <LINK> (for
internal) and <ELINK> for external pages, e.g. <ELINK
"http://somewhere.com">click</a>, and let PHP convert it to a suitable
link, with sessions, TARGET="_new", etc. etc... and better still, take
that link and pipe it through something like out.php as discussed
above, without anyone having to worry about fussy syntax, sessions, etc.


Has anyone tried anything like this?  Has anyone got code snippets, or
perhaps the ability to write the reg exps?
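
As a rough sketch of the custom-tag idea (the <ELINK> syntax, the out.php
target and the attribute handling below are assumptions, not working site
code):

<?php
// Rewrite <ELINK "url"> tags into tracked outbound links via out.php.
function elink_to_anchor($m)
{
    $target = 'out.php?url=' . urlencode($m[1]);
    return '<a href="' . htmlspecialchars($target) . '" target="_new">';
}

function expand_elinks($html)
{
    return preg_replace_callback('/<ELINK\s+"([^"]+)">/i', 'elink_to_anchor', $html);
}

echo expand_elinks('Visit <ELINK "http://somewhere.com">this site</a> today.');
?>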


Many thanks,

Justin French

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
To contact the list administrators, e-mail: [EMAIL PROTECTED]




[PHP] Monitoring traffic

2001-09-14 Thread SED

Hi,

This is kind of off topic but I need to monitor the traffic in
kilobytes/bits on my net card adapter (in/out) or server (PHP, HTML,
etc.), do you know of any software? I'm using Win2000.

Thanks,
SED


-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
To contact the list administrators, e-mail: [EMAIL PROTECTED]




Re: [PHP] Monitoring traffic

2001-09-14 Thread Jason Bell

check out www.mrtg.org for a pre-packaged solution, otherwise check out the
SNMP functions of PHP.

http://www.php.net/manual/en/ref.snmp.php

in either case, you'll need to install the snmp support from your win2000
installation disk.
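
As a small sketch of the PHP side, reading the interface byte counters over
SNMP; the host, community string and interface index (.2) are examples:

<?php
// Poll ifInOctets/ifOutOctets for one interface; diff two polls for bytes/sec.
$host      = '127.0.0.1';
$community = 'public';

$in  = snmpget($host, $community, '.1.3.6.1.2.1.2.2.1.10.2');  // ifInOctets.2
$out = snmpget($host, $community, '.1.3.6.1.2.1.2.2.1.16.2');  // ifOutOctets.2

echo "in: $in\nout: $out\n";
?>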

/* SED asked: */

 This is kind of off topic but I need to monitor the traffic in
 kilobytes/bits on my net card adapter (in/out) or server (PHP, HTML,
 etc.), do you know of any software? I'm using Win2000.

 Thanks,
 SED



-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
To contact the list administrators, e-mail: [EMAIL PROTECTED]




Re: [PHP] web traffic report

2001-08-03 Thread Andreas D. Landmark

At 03.08.2001 04:44, mike cullerton wrote:
another vote for analog.

Dunno what this has got to do with PHP, but my vote is for
Webalizer... fast and easy to use...


-- 
Andreas D Landmark / noXtension
Real Time, adj.:
 Here and now, as opposed to fake time, which only occurs there
and then.


-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
To contact the list administrators, e-mail: [EMAIL PROTECTED]




Re: [PHP] web traffic report

2001-08-03 Thread Corey Chapman

Can't you just run a cron job (set it to run automatically every so
often) to delete the file from your web account's tmp folder?

 Has anyone figured out a way to purge access-log files after webalizer is
done using the data, only so far back?
 
 -eric
 - Original Message -
  I'm rather fond of Webalizer (www.mrunix.net/webalizer).
 
 
 

Corey Chapman
Xnull CEO
(Chat with us: http://forum.xnull.com)

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
To contact the list administrators, e-mail: [EMAIL PROTECTED]




Re: [PHP] web traffic report

2001-08-03 Thread mike cullerton

another vote for analog.

on 8/2/01 9:46 PM, Chris Fry at [EMAIL PROTECTED] wrote:

 analog seems to be the industry standard - use it with the extended log
 format.
 
 http://www.statslab.cam.ac.uk/~sret1/analog/
 


 -- mike cullerton



-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
To contact the list administrators, e-mail: [EMAIL PROTECTED]




Re: [PHP] web traffic report

2001-08-03 Thread Richard Lynch

Your process should be:
logrotate access_log
Use Webalizer on old log.
Purge old log.
Webalizer does not need the raw data after it has processed it once.
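
An illustrative logrotate entry following that process; the paths, the
webalizer invocation and the HUP line are examples rather than a tested
config:

/var/log/httpd/access_log {
    daily
    rotate 1                 # keep only the freshly rotated copy, then purge
    missingok
    postrotate
        /usr/bin/webalizer -q /var/log/httpd/access_log.1
        # reopen the log however your Apache expects, e.g.:
        /bin/kill -HUP `cat /var/run/httpd.pid 2>/dev/null` 2>/dev/null || true
    endscript
}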

--
WARNING [EMAIL PROTECTED] address is an endangered species -- Use
[EMAIL PROTECTED]
Wanna help me out?  Like Music?  Buy a CD: http://l-i-e.com/artists.htm
Volunteer a little time: http://chatmusic.com/volunteer.htm
- Original Message -
From: Corey Chapman [EMAIL PROTECTED]
Newsgroups: php.general
To: Eric Wood [EMAIL PROTECTED]; [EMAIL PROTECTED]
Sent: Friday, August 03, 2001 1:38 PM
Subject: Re: [PHP] web traffic report




-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
To contact the list administrators, e-mail: [EMAIL PROTECTED]