Re: [PHP] high traffic websites

2013-09-19 Thread Negin Nickparsa
It may be helpful for someone: I liked GTmetrix, kinda helpful and magic. http://gtmetrix.com/#!


Sincerely
Negin Nickparsa


On Wed, Sep 18, 2013 at 4:42 PM, Sebastian Krebs krebs@gmail.com wrote:

 [snip]



Re: [PHP] high traffic websites

2013-09-18 Thread Sebastian Krebs
2013/9/18 Negin Nickparsa nickpa...@gmail.com

 In general, what are the best ways to handle high traffic websites?

 VPS(clouds)?
 web analyzers?
 dedicated servers?
 distributed memory cache?


Yes :)

But seriously: that is a topic most of us have spent much time getting into.
You can explain it with a bunch of buzzwords. Additionally, how do you define
"high traffic websites"? Do you already _have_ such a site? Or do you
_want_ one? It's important, because I've seen it far too often that
projects spent too much effort on their "high traffic" infrastructure and
in the end it wasn't that high traffic ;) I won't say that you cannot be
successful, but you should start with an effort you can handle.

Regards,
Sebastian




 Sincerely
 Negin Nickparsa




-- 
github.com/KingCrunch


Re: [PHP] high traffic websites

2013-09-18 Thread Negin Nickparsa
Thank you Sebastian. Actually, I will already have one if I qualify for the
job. Yes, and I may fail to handle it; that's why I asked for guidance.
I wanted some tidbits to start with. I have searched through YSlow,
HTTrack and others.
I also searched through the PHP list in my email before asking this
question. It is kind of beneficial for all people and has not been asked
directly.


Sincerely
Negin Nickparsa


On Wed, Sep 18, 2013 at 10:45 AM, Sebastian Krebs krebs@gmail.com wrote:

 [snip]



Re: [PHP] high traffic websites

2013-09-18 Thread Camilo Sperberg

On Sep 18, 2013, at 09:38, Negin Nickparsa nickpa...@gmail.com wrote:

 [snip]

Your question is way too vague to be answered properly... My best guess would 
be that it depends severely on the type of website you have and how the 
current implementation is being, well... implemented.

Simply said: what works for Facebook may/will not work for LinkedIn, Twitter or 
Google, mainly because the type of search differs A LOT: Facebook is about 
relations between people, Twitter is about small pieces of data not heavily 
interconnected with each other, while Google is all about links and all types 
of content: from little pieces of information through the whole of Wikipedia.

You could start by studying how Varnish and Redis/memcached work, you could 
study how proxies (nginx et al.), CDNs and that kind of stuff work, but if you 
want more specific answers, you'd better ask a specific question.
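To give a concrete flavour of the distributed-memory-cache idea, here is a minimal sketch using the pecl/memcached extension. The server address, the cache key and the render_homepage() helper are made up for illustration; the point is simply that a cache hit skips the expensive work entirely.

```php
<?php
// Sketch: serve a rendered page fragment from memcached when possible.
// Assumes the pecl/memcached extension and a memcached server on
// 127.0.0.1:11211 -- both assumptions, not from this thread.
$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);

$key  = 'homepage:rendered';          // hypothetical cache key
$html = $cache->get($key);

if ($html === false) {                // cache miss (or server error)
    $html = render_homepage();        // hypothetical expensive renderer
    $cache->set($key, $html, 60);     // keep the result for 60 seconds
}
echo $html;
```

The same pattern works with Redis (phpredis) or APCu; either way the expensive rendering runs once per minute instead of once per request.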

In the PHP arena, an opcode cache does the job very well and can accelerate 
page load by several orders of magnitude; I recommend OPcache, which is already 
included in PHP 5.5.

Greetings.


--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] high traffic websites

2013-09-18 Thread Negin Nickparsa
Thank you Camilo.

To be more detailed: suppose the website has 80,000 users, each page takes
200 ms to be rendered, and you have a thousand hits a second, so we want to
reduce the time of rendering. Is there any way to reduce the rendering time?

Another thing: suppose they want to upload files simultaneously, and the
videos are on the website itself, not on another server like YouTube, so the
streams are really consuming the bandwidth.

Also, it is troublesome to get backups; when getting backups you have the
problem of locking while backing up bulk data.



Sincerely
Negin Nickparsa


On Wed, Sep 18, 2013 at 12:50 PM, Camilo Sperberg unrea...@gmail.com wrote:

 [snip]




Re: [PHP] high traffic websites

2013-09-18 Thread Sebastian Krebs
2013/9/18 Negin Nickparsa nickpa...@gmail.com

 Thank you Camilo

 to be more in details,suppose the website has 80,000 users and each page
 takes 200 ms to be rendered and you have thousand hits in a second so we
 want to reduce the time of rendering. is there any way to reduce the
 rendering time?


Read about frontend-/proxy-caching (Nginx, Varnish) and ESI/SSI includes
(also Nginx and Varnish ;)). The idea is simply: if you don't have to
process something in the backend on every request, then don't process it in
the backend on every request.
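A minimal sketch of such a frontend cache in nginx (the cache path, zone name and backend address are assumptions for illustration, not from this thread):

```nginx
# Cache successful responses from the PHP backend for 60s,
# so repeated requests never reach PHP at all.
proxy_cache_path /var/cache/nginx keys_zone=pagecache:10m;

server {
    listen 80;

    location / {
        proxy_cache       pagecache;
        proxy_cache_valid 200 60s;
        proxy_pass        http://127.0.0.1:8080;  # the PHP app server
    }
}
```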

But maybe you mixed up some words, because the rendering time is the time
consumed by the renderer within the browser (HTML and CSS). That you can
improve by improving your HTML/CSS :)


I am a little bit curious: do you _really_ have 1000 requests/second, or did
you just throw some numbers in? ;)



 other thing is suppose they want to upload files simultaneously and the
 videos are in the website not on another server like YouTube and so streams
 are really consuming the bandwidth.


Well, if there are streams, there are streams. I cannot imagine any way
someone could stream a video without downloading it.



 Also,It is troublesome to get backups,when getting backups you have
 problem of lock backing up with bulk of data.


Even at times when there is not that much traffic? An automatic backup at
3:00 in the morning, for example?
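As a sketch, such a nightly dump via cron could look like this (the database name and paths are assumptions, and it assumes MySQL with InnoDB tables):

```
# crontab entry: run at 03:00 every day
0 3 * * * mysqldump --single-transaction mydb | gzip > /backup/mydb-$(date +\%F).sql.gz
```

With InnoDB, --single-transaction takes a consistent snapshot without locking the tables, which addresses the locking problem mentioned above.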





 [snip]



-- 
github.com/KingCrunch


Re: [PHP] high traffic websites

2013-09-18 Thread Stuart Dallas
On 18 Sep 2013, at 12:50, Negin Nickparsa nickpa...@gmail.com wrote:

 to be more in details,suppose the website has 80,000 users and each page
 takes 200 ms to be rendered and you have thousand hits in a second so we
 want to reduce the time of rendering. is there any way to reduce the
 rendering time?
 
 [snip]

Your question is impossible to answer efficiently without profiling. You need 
to know what PHP is doing in those 200ms before you can target your 
optimisations for maximum effect.

I use Xdebug to produce trace files. From there I can see exactly what is 
taking the most time, and then I can look into how to make that thing faster. 
When I'm certain there is no faster way to do what it's doing, I move on to 
the next biggest thing.
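For example, with Xdebug 2.x such a trace can be switched on from php.ini (the output directory is an assumption):

```ini
; php.ini -- write a function trace file for every request
xdebug.auto_trace       = 1
xdebug.trace_output_dir = /tmp/traces
xdebug.collect_params   = 1   ; record arguments passed to each call
xdebug.show_mem_delta   = 1   ; record memory change per call
```

Each request then leaves a .xt file listing every function call with its timing, so those 200ms can be broken down call by call.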

Of course there are generic things you should do, such as adding an opcode 
cache and looking at your server setup, but targeted optimisation is far 
better than trying generic stuff.

-Stuart

-- 
Stuart Dallas
3ft9 Ltd
http://3ft9.com/




Re: [PHP] high traffic websites

2013-09-18 Thread Negin Nickparsa
I am a little bit curious: Do you _really_ have 1000 requests/second, or do
you just throw some numbers in? ;)

Sebastian, supposedly_asking_to_get_some_pre_evaluation :)

Even in times, where there is not that much traffix? Automatic backup at
3:00 in the morning for example?

3:00 in the morning in one country is 9 AM in another country and 3 PM in yet
another.

By the way, thank you so much guys, I wanted tidbits and you gave me more.

Stuart, I recall your replies in other situations and you have always helped
me to improve. The list is happy to have you.

Sincerely
Negin Nickparsa


On Wed, Sep 18, 2013 at 3:39 PM, Sebastian Krebs krebs@gmail.com wrote:

 [snip]



Re: [PHP] high traffic websites

2013-09-18 Thread Camilo Sperberg

On Sep 18, 2013, at 14:26, Haluk Karamete halukkaram...@gmail.com wrote:

 I recommend OPCache, which is already included in PHP 5.5.
 
 Camilo,
 I'm just curious about the disadvantageous aspects of OPcache. 
 
 My logic says there must be some issues with it; otherwise it would have come 
 already enabled.
 
 Sent from iPhone 
 
 
 On Sep 18, 2013, at 2:20 AM, Camilo Sperberg unrea...@gmail.com wrote:

 [snip]


The original RFC states: 

https://wiki.php.net/rfc/optimizerplus
The integration proposed for PHP 5.5.0 is mostly 'soft' integration. That means 
that there'll be no tight coupling between Optimizer+ and PHP; Those who wish 
to use another opcode cache will be able to do so, by not loading Optimizer+ 
and loading another opcode cache instead. As per the Suggested Roadmap above, 
we might want to review this decision in the future; There might be room for 
further performance or functionality gains from tighter integration; None are 
known at this point, and they're beyond the scope of this RFC.

So that's why OPcache isn't enabled by default in PHP 5.5.
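For the record, turning it on is only a few ini lines (the values shown are common starting points, not taken from the RFC):

```ini
; php.ini -- load and enable the bundled OPcache extension
zend_extension = opcache.so
opcache.enable = 1
opcache.memory_consumption = 128  ; MB of shared memory for compiled scripts
opcache.validate_timestamps = 1   ; keep re-checking source files for changes
```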

Greetings.


--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] high traffic websites

2013-09-18 Thread Sebastian Krebs
2013/9/18 Camilo Sperberg unrea...@gmail.com

 [snip]

 So that's why OPCache isn't enabled by default in PHP 5.5



Also worth mentioning: it is the first release with an opcode cache
integrated. Giving the others a release or two to get used to it sounds
useful :)







-- 
github.com/KingCrunch