Fabian Reiber created NIFI-14462:
------------------------------------

             Summary: Regular getting HTTP 502 Error behind Apache Reverse Proxy
                 Key: NIFI-14462
                 URL: https://issues.apache.org/jira/browse/NIFI-14462
             Project: Apache NiFi
          Issue Type: Bug
          Components: Core UI
    Affects Versions: 2.3.0
         Environment: Debian 11/12
Podman 5.3.0/5.4.2
Apache2: 2.4.62
            Reporter: Fabian Reiber


We are using NiFi in a secured cluster with three nodes behind an Apache 
reverse proxy. After updating from 1.28.1 to 2.3.0, we regularly get 502 
proxy errors like this in the browser:
{code:none}
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN"> <html><head> <title>502 
Proxy Error</title> </head><body> <h1>Proxy Error</h1> <p>The proxy server 
received an invalid response from an upstream server.<br /> The proxy server 
could not handle the request<p>Reason: <strong>Error reading from remote 
server</strong></p></p> <hr> <address>Apache/2.4.62 (Debian) Server at 
nifi.local.example.com Port 443</address> </body></html> {code}
This did not happen with NiFi 1.x. The NiFi UI polls various API endpoints 
while running, and any of those endpoints can produce this error. Most of the 
time the requests succeed and NiFi runs as expected, but sometimes they do 
not. Here is an example from the Apache error log after raising the log level 
to trace5:
{code:none}
[Thu Apr 10 14:58:01.694121 2025] [proxy:trace2] [pid 14770:tid 14793] 
mod_proxy.c(848): [client 192.168.58.1:38278] AH03461: attempting to match URI 
path '/nifi-api/flow/status' against prefix '/' for proxying, referer: 
https://nifi.local.example.com/nifi/
[Thu Apr 10 14:58:01.694125 2025] [proxy:debug] [pid 14770:tid 14784] 
mod_proxy.c(1465): [client 192.168.58.1:38262] AH01143: Running scheme https 
handler (attempt 0), referer: https://nifi.local.example.com/nifi/
[Thu Apr 10 14:58:01.694139 2025] [proxy:trace1] [pid 14770:tid 14793] 
mod_proxy.c(969): [client 192.168.58.1:38278] AH03464: URI path 
'/nifi-api/flow/status' matches proxy handler 
'proxy:https://nifi.local.example.com:8443/nifi-api/flow/status', referer: 
https://nifi.local.example.com/nifi/
[Thu Apr 10 14:58:01.694143 2025] [proxy:debug] [pid 14770:tid 14784] 
proxy_util.c(2797): AH00942: https: has acquired connection for 
(nifi.local.example.com:8443)
[Thu Apr 10 14:58:01.694149 2025] [proxy:debug] [pid 14770:tid 14784] 
proxy_util.c(3242): [client 192.168.58.1:38262] AH00944: connecting 
https://nifi.local.example.com:8443/nifi-api/flow/process-groups/1fb57701-0196-1000-705b-739fe62c8f28?uiOnly=true
 to nifi.local.example.com:8443, referer: https://nifi.local.example.com/nifi/
[Thu Apr 10 14:58:01.694161 2025] [proxy:debug] [pid 14770:tid 14784] 
proxy_util.c(3450): [client 192.168.58.1:38262] AH00947: connecting 
/nifi-api/flow/process-groups/1fb57701-0196-1000-705b-739fe62c8f28?uiOnly=true 
to 127.0.0.1:8443 (nifi.local.example.com:8443), referer: 
https://nifi.local.example.com/nifi/
[Thu Apr 10 14:58:01.694187 2025] [proxy:trace2] [pid 14770:tid 14784] 
proxy_util.c(3734): https: reusing backend connection 
127.0.0.1:42282<>127.0.0.1:8443
[Thu Apr 10 14:58:01.694279 2025] [proxy:trace2] [pid 14770:tid 14793] 
proxy_util.c(2604): [client 192.168.58.1:38278] https: found worker 
https://nifi.local.example.com:8443/ for 
https://nifi.local.example.com:8443/nifi-api/flow/status, referer: 
https://nifi.local.example.com/nifi/
[Thu Apr 10 14:58:01.694300 2025] [proxy:debug] [pid 14770:tid 14793] 
mod_proxy.c(1465): [client 192.168.58.1:38278] AH01143: Running scheme https 
handler (attempt 0), referer: https://nifi.local.example.com/nifi/
[Thu Apr 10 14:58:01.694328 2025] [proxy:debug] [pid 14770:tid 14793] 
proxy_util.c(2797): AH00942: https: has acquired connection for 
(nifi.local.example.com:8443)
[Thu Apr 10 14:58:01.694334 2025] [proxy:debug] [pid 14770:tid 14793] 
proxy_util.c(3242): [client 192.168.58.1:38278] AH00944: connecting 
https://nifi.local.example.com:8443/nifi-api/flow/status to 
nifi.local.example.com:8443, referer: https://nifi.local.example.com/nifi/
[Thu Apr 10 14:58:01.694340 2025] [proxy:debug] [pid 14770:tid 14793] 
proxy_util.c(3450): [client 192.168.58.1:38278] AH00947: connecting 
/nifi-api/flow/status to 127.0.0.1:8443 (nifi.local.example.com:8443), referer: 
https://nifi.local.example.com/nifi/
[Thu Apr 10 14:58:01.694745 2025] [proxy:trace2] [pid 14770:tid 14793] 
proxy_util.c(3734): https: reusing backend connection 
127.0.0.1:42272<>127.0.0.1:8443
[Thu Apr 10 14:58:01.694887 2025] [proxy_http:error] [pid 14770:tid 14793] 
(70014)End of file found: [client 192.168.58.1:38278] AH01102: error reading 
status line from remote server nifi.local.example.com:8443, referer: 
https://nifi.local.example.com/nifi/
[Thu Apr 10 14:58:01.694905 2025] [proxy:error] [pid 14770:tid 14793] [client 
192.168.58.1:38278] AH00898: Error reading from remote server returned by 
/nifi-api/flow/status, referer: https://nifi.local.example.com/nifi/{code}
The last two lines are the most relevant, but they do not provide much 
detail.
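
For reference, the trace output above can be captured by raising the log 
level for only the proxy modules in the Apache config. A minimal sketch using 
the per-module {{LogLevel}} syntax of Apache 2.4 (module names as in stock 
httpd; adjust as needed):
{code:none}
# Keep the global level at warn, but raise the proxy modules to trace5
LogLevel warn proxy:trace5 proxy_http:trace5
{code}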

We can reproduce this behavior on two different clusters running on different 
virtual machines with different Debian and Podman versions, as well as on a 
freshly set up secured single-node instance. We run NiFi in a Podman 
container/pod, and we use our own certificate authority to provision the 
proxy certificates, but I do not think that is relevant here. The Location 
directive of the Apache config for the single-node instance, which is 
probably the more interesting one, looks like this:
{code:none}
<Location "/">
        ProxyPass        https://nifi.local.example.com:8443/
        ProxyPassReverse https://nifi.local.example.com:8443/
        RequestHeader add X-ProxyScheme "https"
        RequestHeader add X-ProxyHost "nifi.local.example.com"
        RequestHeader add X-ProxyPort "443"
        RequestHeader add X-ProxyContextPath "/"
</Location>{code}
This part does not differ much from the web server config in our cluster.

We did, however, find two workarounds. If we set "SetEnv 
proxy-initial-not-pooled 1" in the Location directive of the Apache config, 
the proxy errors no longer occur and NiFi works as expected. A second option 
is to set "disablereuse=On" on the "ProxyPass" directive.
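
Applied to the Location block above, the two workarounds would look roughly 
like this (a sketch for illustration only; either one alone is sufficient):
{code:none}
<Location "/">
        # Workaround 2: never reuse backend connections at all
        ProxyPass        https://nifi.local.example.com:8443/ disablereuse=On
        ProxyPassReverse https://nifi.local.example.com:8443/
        # Workaround 1: don't send the initial request of a client
        # connection over a pooled backend connection
        SetEnv proxy-initial-not-pooled 1
        RequestHeader add X-ProxyScheme "https"
        RequestHeader add X-ProxyHost "nifi.local.example.com"
        RequestHeader add X-ProxyPort "443"
        RequestHeader add X-ProxyContextPath "/"
</Location>
{code}
Note that "disablereuse=On" trades the error away for the overhead of a new 
TLS handshake to the backend on every request, so "proxy-initial-not-pooled" 
is the lighter-weight option of the two.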

The mod_proxy_http documentation says about this variable:

??This avoids the "proxy: error reading status line from remote server" error 
message caused by the race condition that the backend server closed the pooled 
connection after the connection check by the proxy and before data sent by the 
proxy reached the backend.??

So, if no pooled connection is reused for the initial request, or no 
connection is reused at all (with "disablereuse=On"), the proxy never hands a 
request to a connection that NiFi has already closed.

For more information, see the Apache documentation on [mod_proxy_http 
environment 
variables|https://httpd.apache.org/docs/2.4/mod/mod_proxy_http.html#env] and 
[ProxyPass|https://httpd.apache.org/docs/2.4/mod/mod_proxy.html#ProxyPass].

If you need more information, please let me know. :)



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
