Hi Tim

On Wed, Jun 2, 2021 at 12:38 PM Tim Düsterhus <[email protected]> wrote:

> Elias,
>
> On 6/2/21 12:16 PM, Elias Abacioglu wrote:
> > I'm planning on placing an API behind HAProxy and want a way to modify
> the
> > times/intervals in the POST data to increase cacheability to the API.
>
> While POST requests may in fact be cached under certain circumstances as
> I learned 2 minutes ago, it is certainly very unusual and also not
> supported by HAProxy's cache.
>

Oh, that's bad news that HAProxy cannot cache POST requests.
I guess I can use nuster (https://github.com/jiangwenyuan/nuster), an
HAProxy fork that adds more caching support (it seems to be able to cache
POST requests). Otherwise I would have to use Nginx or OpenResty, but I do
prefer HAProxy.

> I would advise against caching POST requests and instead move the
> parameters into the URL or a request header (make sure to set 'Vary' on
> the response) and instead use a GET request.
>

It makes perfect sense to cache POSTs in certain scenarios. For example,
Grafana talks to OpenTSDB with POST requests, Druid uses POST requests for
querying, and so do a number of other time series databases. I can't really
control that some third-party tools have opted for POST instead of URL
parameters or headers.
A repeated POST request, for instance a user pressing F5 in their browser a
bunch of times, can cause the underlying API to redo some heavy processing
each time. Some of these applications have a query cache of their own, but
it is not always optimal. In a situation like this I would rather avoid
talking to the API at all and just serve a response from a cache.
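
For the tools where I do control the client, I assume Tim's GET-plus-Vary
suggestion would look roughly like this in HAProxy (a rough, untested
sketch; the names are made up, and process-vary assumes HAProxy 2.4+):

    cache api_cache
        total-max-size 64        # cache size in MB
        max-object-size 100000   # largest cacheable object in bytes
        max-age 60               # seconds before an entry expires
        process-vary on          # honour the Vary response header (2.4+)

    frontend api_fe
        bind :80
        default_backend api_be

    backend api_be
        http-request cache-use api_cache
        http-response cache-store api_cache
        server api1 127.0.0.1:8080   # placeholder API address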

However, I still want to know whether it is possible to transform a POST's
JSON data as I mentioned in my original post (even if I cannot cache the
response).
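
Reading the buffered body from Lua looks doable; here is a minimal,
untested sketch (assuming "option http-buffer-request" is set, and with a
made-up JSON field name "end" and a 60-second rounding window):

    -- round_times.lua: read the buffered POST body and expose a rounded
    -- timestamp as a transaction variable. This only inspects the body,
    -- it does not change what gets forwarded to the backend.
    core.register_action("round_post_times", { "http-req" }, function(txn)
        -- req.body needs "option http-buffer-request" in the proxy section
        local body = txn.f:req_body()
        if body == nil then
            return
        end
        -- naive pattern match instead of a real JSON parser
        local ts = string.match(body, '"end"%s*:%s*(%d+)')
        if ts ~= nil then
            local rounded = math.floor(tonumber(ts) / 60) * 60
            txn:set_var("txn.rounded_end", tostring(rounded))
        end
    end)

loaded and invoked with something like:

    global
        lua-load /etc/haproxy/round_times.lua

    backend api_be
        option http-buffer-request
        http-request lua.round_post_times

As far as I can tell this only gives read access to the body; rewriting it
before it is forwarded to the backend is the part I am unsure about, hence
the question.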

Thanks
Elias
