Re: [Openstack-operators] How are people dealing with API rate limiting?

2016-06-14 Thread John Dickinson
Cluster protection, mainly. Swift's rate limiting is based on write requests (e.g. PUT, POST, DELETE) per second per container. Since a large number of object writes in a single container could cause some background processes to back up and not service other requests, limiting the ops/sec to a
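If it helps, a minimal sketch of what that looks like in proxy-server.conf (the option names are the ones from the ratelimit middleware docs; the numbers here are invented purely for illustration, and "ratelimit" also has to be added to the proxy pipeline):

  [filter:ratelimit]
  use = egg:swift#ratelimit
  # how long a request may be delayed before a 498 is returned instead
  max_sleep_time_seconds = 60
  # per-container write (PUT/POST/DELETE) rate, keyed on container size:
  # containers holding >= 100 objects get ~100 ops/sec, >= 100000 get ~10,
  # and Swift interpolates between the sizes you list
  container_ratelimit_100 = 100
  container_ratelimit_100000 = 10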

Re: [Openstack-operators] How are people dealing with API rate limiting?

2016-06-14 Thread Michael Richardson
On Tue, 14 Jun 2016 16:10:04 +0000, "Kingshott, Daniel" wrote:
> We use Haproxy to load balance API requests and applied rate limiting
> there.

+1. We've also had success with rate limiting via haproxy (largely related to Horizon and StackTask) with stick counters
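A rough sketch of the stick-counter approach (frontend/backend names, addresses, window and threshold below are all placeholders, not a real config):

  frontend nova_api_front
      bind *:8774
      mode http
      default_backend nova_api_back
      # one stick-table entry per client IP, tracking request rate over 10s
      stick-table type ip size 100k expire 30s store http_req_rate(10s)
      http-request track-sc0 src
      # refuse clients above ~100 requests per 10s window (HAProxy answers 403 by default)
      http-request deny if { sc_http_req_rate(0) gt 100 }

  backend nova_api_back
      mode http
      server api1 10.0.0.11:8774 check
      server api2 10.0.0.12:8774 check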

Re: [Openstack-operators] How are people dealing with API rate limiting?

2016-06-14 Thread John Dickinson
Swift does rate limiting across the proxy servers ("api servers" in nova parlance) as described at http://docs.openstack.org/developer/swift/ratelimit.html. It uses a memcache pool to coordinate the rate limiting across proxy processes (local or across machines). Code's at
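The general idea of the memcache coordination, boiled down (illustration only, not Swift's actual code - the real middleware computes per-request sleep times rather than a fixed window):

  # Shared fixed-window counter: every proxy process increments the same
  # memcache key, so the limit applies cluster-wide instead of per process.
  import time
  import memcache  # python-memcached

  MC = memcache.Client(['10.0.0.1:11211', '10.0.0.2:11211'])

  def allowed(key, max_per_sec):
      window = '%s/%d' % (key, int(time.time()))
      MC.add(window, 0, time=2)      # create this second's bucket if missing
      count = MC.incr(window)        # atomic across all proxy processes
      return count is not None and count <= max_per_sec

  # e.g. checked once per container write in each proxy worker; a rejected
  # request gets a 498 with a Retry-After header:
  # if not allowed('PUT/AUTH_test/images', 20): ...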

Re: [Openstack-operators] How are people dealing with API rate limiting?

2016-06-14 Thread Silence Dogood
+1 also SSL

On Tue, Jun 14, 2016 at 4:58 PM, Russell Bryant wrote:
> This is the most common approach I've heard of (doing rate limiting in
> your load balancer).
>
> On Tue, Jun 14, 2016 at 12:10 PM, Kingshott, Daniel <daniel.kingsh...@bestbuy.com> wrote:
>> We use

Re: [Openstack-operators] How are people dealing with API rate limiting?

2016-06-14 Thread Tim Bell
On 14/06/16 18:28, "Edgar Magana" wrote:
> Second that one! Feels like one of the best options, we are moving towards
> that direction.
>
> Edgar

For completeness, Rackspace had a project called Repose which did rate limiting. Core is at

Re: [Openstack-operators] How are people dealing with API rate limiting?

2016-06-14 Thread Edgar Magana
Second that one! Feels like one of the best options, we are moving towards that direction.

Edgar

On 6/14/16, 9:10 AM, "Kingshott, Daniel" wrote:
> We use Haproxy to load balance API requests and applied rate limiting
> there.
>
> On Tue, Jun 14, 2016 at 9:02 AM,

Re: [Openstack-operators] How are people dealing with API rate limiting?

2016-06-14 Thread Kingshott, Daniel
We use Haproxy to load balance API requests and applied rate limiting there.

On Tue, Jun 14, 2016 at 9:02 AM, Matt Riedemann wrote:
> A question came up in the nova IRC channel this morning about the
> api_rate_limit config option in nova which was only for the v2 API.

Re: [Openstack-operators] How are people dealing with API rate limiting?

2016-06-14 Thread Kevin Bringard (kevinbri)
On 6/14/16, 9:44 AM, "Matt Fischer" wrote:
> On Tue, Jun 14, 2016 at 9:37 AM, Sean Dague wrote:
>
>> On 06/14/2016 11:02 AM, Matt Riedemann wrote:
>>> A question came up in the nova IRC channel this morning about the
>>> api_rate_limit config option in nova

Re: [Openstack-operators] How are people dealing with API rate limiting?

2016-06-14 Thread Sean Dague
On 06/14/2016 11:02 AM, Matt Riedemann wrote:
> A question came up in the nova IRC channel this morning about the
> api_rate_limit config option in nova which was only for the v2 API.
>
> Sean Dague explained that it never really worked because it was per API
> server so if you had more than one
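(To make that concrete: with, say, a 10 requests/min limit kept in each process's own memory and 8 nova-api workers behind a load balancer, a client spread round-robin across them could still push roughly 80 requests/min, so the option only ever meant anything for a single-process deployment.)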

Re: [Openstack-operators] How are people dealing with API rate limiting?

2016-06-14 Thread Alex Schultz
On Tue, Jun 14, 2016 at 9:02 AM, Matt Riedemann wrote:
> A question came up in the nova IRC channel this morning about the
> api_rate_limit config option in nova which was only for the v2 API.
>
> Sean Dague explained that it never really worked because it was per API