On 02/03/2019 14:04, Tamer Higazi wrote:
> 2. And with your comment "So please help to scale the application for
> concurrent users."
>
> is very unpolite.

I think it was just badly written English. I don't think they meant to
be impolite.

On 02/03/2019 14:04, Tamer Higazi wrote:
> 1. Get yourself propper hardware, that would solve perhaps by 80% your
> problems.
>
> Here is a good starting point:
>
> http://nginx.org/en/docs/http/load_balancing.html
> https://uwsgi-docs.readthedocs.io/en/latest/Broodlord.html
> https://uwsgi-docs.readthedocs.io/en/latest/Fastrouter.html
>
> On 01.03.19 07:27, Ashraf Mohamed wrote:
>>
>> I have a flask application which is running in nginx server and i am
>> unbale to serve the application for more then 20 users (concurrently)
>> as its gets break.
>>
>> *_APP Architecture:_*
>> I have 2 application running on different servers app 1(using for
>> frontend ) and app 2( using for REST API Calls) both are flask
>> applications

I strongly suspect the application is deadlocking on itself. I don't
recommend building a web application that keeps the HTTP connection open
while doing a long-running job. You should hand the job off to a backend
worker and answer the client right away. The client should then poll
another endpoint regularly to check the job's status.
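A minimal sketch of that pattern, framework-agnostic so it stays short: `submit_job` starts the work in a background thread and returns an id immediately (this is what the submit endpoint would send back), and `get_status` is what the polling endpoint would return. All names here are hypothetical, and a real deployment would use a proper broker (Celery, Redis, or the uWSGI spooler) instead of an in-process dict:

```python
import threading
import uuid

# Hypothetical in-memory job registry; replace with a real broker
# (Celery, Redis, uWSGI spooler) in production.
jobs = {}
jobs_lock = threading.Lock()

def submit_job(task, *args):
    """Start `task` in a background worker; return a job id immediately."""
    job_id = str(uuid.uuid4())
    with jobs_lock:
        jobs[job_id] = {"status": "pending", "result": None}

    def run():
        try:
            result = task(*args)
            with jobs_lock:
                jobs[job_id] = {"status": "done", "result": result}
        except Exception as exc:
            with jobs_lock:
                jobs[job_id] = {"status": "failed", "result": str(exc)}

    threading.Thread(target=run, daemon=True).start()
    return job_id

def get_status(job_id):
    """What the polling endpoint would hand back to the client."""
    with jobs_lock:
        return jobs.get(job_id, {"status": "unknown", "result": None})
```

In the Flask apps this would be two routes: one that calls `submit_job` and returns the id with a 202, and one like `/status/<job_id>` that just returns `get_status(job_id)`. The HTTP worker is freed immediately, so 20 concurrent users no longer pin 20 workers for the full duration of the job.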

-- 
Cordially,
Léo
_______________________________________________
uWSGI mailing list
uWSGI@lists.unbit.it
http://lists.unbit.it/cgi-bin/mailman/listinfo/uwsgi