Hello,

Since I have a similar problem, I'm adding to this thread; maybe it will also 
help the original poster.

Starting haproxy-portal fails with the following error:

-- Unit packetfence-haproxy-portal.service has begun starting up.
Sep 07 07:14:12 pir-nac03 packetfence[1016]: -e(1016) WARN: requesting member ips for an undefined interface... (pf::cluster::members_ips)
Sep 07 07:14:12 pir-nac03 packetfence[1016]: -e(1016) WARN: requesting member ips for an undefined interface... (pf::cluster::members_ips)
Sep 07 07:14:12 pir-nac03 pfhttpd[31992]: api-frontend-access 127.0.0.1 - - [07/Sep/2021:07:14:12 +0200] "GET /api/v1/queues/stats HTTP/1.1" 200 1123 "https://192.168.9.183:1443/admin?";
Sep 07 07:14:12 pir-nac03 haproxy[32595]: 192.168.8.15:62612 [07/Sep/2021:07:14:12.725] admin-https-192.168.8.2~ api/127.0.0.1 0/0/0/6/7 200 1295 - - ---- 1/1/0/0/0 0/0 {192.168.9.183:1
Sep 07 07:14:12 pir-nac03 haproxy[1019]: [ALERT] 249/071412 (1019) : Parsing [/usr/local/pf/var/conf/haproxy-portal.conf:122]: frontend 'portal-http-192.168.203.1' has the same name as
Sep 07 07:14:12 pir-nac03 haproxy[1019]: [ALERT] 249/071412 (1019) : parsing [/usr/local/pf/var/conf/haproxy-portal.conf:125] : stick-table name 'portal-http-192.168.203.1' conflicts wi
Sep 07 07:14:12 pir-nac03 haproxy[1019]: [ALERT] 249/071412 (1019) : Parsing [/usr/local/pf/var/conf/haproxy-portal.conf:140]: frontend 'portal-https-192.168.203.1' has the same name as
Sep 07 07:14:12 pir-nac03 haproxy[1019]: [ALERT] 249/071412 (1019) : parsing [/usr/local/pf/var/conf/haproxy-portal.conf:143] : stick-table name 'portal-https-192.168.203.1' conflicts w
Sep 07 07:14:12 pir-nac03 haproxy[1019]: [ALERT] 249/071412 (1019) : Error(s) found in configuration file : /usr/local/pf/var/conf/haproxy-portal.conf
Sep 07 07:14:12 pir-nac03 haproxy[1019]: [ALERT] 249/071412 (1019) : Fatal errors found in configuration.
Sep 07 07:14:12 pir-nac03 systemd[1]: packetfence-haproxy-portal.service: Main process exited, code=exited, status=1/FAILURE
Sep 07 07:14:12 pir-nac03 systemd[1]: Failed to start PacketFence HAProxy Load Balancer for the captive portal.
-- Subject: Unit packetfence-haproxy-portal.service has failed

Checking the configuration produces the following:

haproxy -c -V -f /usr/local/pf/var/conf/haproxy-portal.conf
[ALERT] 250/074126 (24684) : Parsing [/usr/local/pf/var/conf/haproxy-portal.conf:122]: frontend 'portal-http-192.168.203.1' has the same name as frontend 'portal-http-192.168.203.1' declared at /usr/local/pf/var/conf/haproxy-portal.conf:70.
[ALERT] 250/074126 (24684) : parsing [/usr/local/pf/var/conf/haproxy-portal.conf:125] : stick-table name 'portal-http-192.168.203.1' conflicts with table declared in frontend 'portal-http-192.168.203.1' at /usr/local/pf/var/conf/haproxy-portal.conf:70.
[ALERT] 250/074126 (24684) : Parsing [/usr/local/pf/var/conf/haproxy-portal.conf:140]: frontend 'portal-https-192.168.203.1' has the same name as frontend 'portal-https-192.168.203.1' declared at /usr/local/pf/var/conf/haproxy-portal.conf:88.
[ALERT] 250/074126 (24684) : parsing [/usr/local/pf/var/conf/haproxy-portal.conf:143] : stick-table name 'portal-https-192.168.203.1' conflicts with table declared in frontend 'portal-https-192.168.203.1' at /usr/local/pf/var/conf/haproxy-portal.conf:88.
[ALERT] 250/074126 (24684) : Error(s) found in configuration file : /usr/local/pf/var/conf/haproxy-portal.conf
[ALERT] 250/074126 (24684) : Fatal errors found in configuration.

It is strange that there are duplicate frontend entries in haproxy-portal.conf, 
and I don't know where they come from or how to get rid of them: every change I 
make to the file is overwritten after a restart (see the quick check below, and 
the full file after it).
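
To see where the duplicates sit, plain grep is enough (a minimal sketch; the 
second command looks at the template that the generated file's header below 
points to — matches there may look different, since it is a template):

grep -n '^frontend' /usr/local/pf/var/conf/haproxy-portal.conf   # generated file
grep -n 'frontend' /usr/local/pf/conf/haproxy-portal.conf        # source template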
>>>>
cat /usr/local/pf/var/conf/haproxy-portal.conf
# This file is generated from a template at /usr/local/pf/conf/haproxy-portal.conf
# Any changes made to this file will be lost on restart

# Copyright (C) Inverse inc.
global
  external-check
  user haproxy
        group haproxy
        daemon
        pidfile /usr/local/pf/var/run/haproxy-portal.pid
        log /dev/log local0
        stats socket /usr/local/pf/var/run/haproxy-portal.stats level admin process 1
        maxconn 4000
        #Followup of https://github.com/inverse-inc/packetfence/pull/893
        #haproxy 1.6.11 | intermediate profile | OpenSSL 1.0.1e | SRC: https://mozilla.github.io/server-side-tls/ssl-config-generator/?server=haproxy-1.6.11&openssl=1.0.1e&hsts=yes&profile=intermediate
        #Oldest compatible clients: Firefox 1, Chrome 1, IE 7, Opera 5, Safari 1, Windows XP IE8, Android 2.3, Java 7
        tune.ssl.default-dh-param 2048
        ssl-default-bind-ciphers ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA256:ECDHE-ECDSA-AES128-SHA:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES128-SHA:ECDHE-ECDSA-AES256-SHA384:ECDHE-ECDSA-AES256-SHA:ECDHE-RSA-AES256-SHA:DHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA:DHE-RSA-AES256-SHA256:DHE-RSA-AES256-SHA:ECDHE-ECDSA-DES-CBC3-SHA:ECDHE-RSA-DES-CBC3-SHA:EDH-RSA-DES-CBC3-SHA:AES128-GCM-SHA256:AES256-GCM-SHA384:AES128-SHA256:AES256-SHA256:AES128-SHA:AES256-SHA:DES-CBC3-SHA:!DSS
        ssl-default-bind-options no-sslv3 no-tls-tickets
        ssl-default-server-ciphers ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA256:ECDHE-ECDSA-AES128-SHA:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES128-SHA:ECDHE-ECDSA-AES256-SHA384:ECDHE-ECDSA-AES256-SHA:ECDHE-RSA-AES256-SHA:DHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA:DHE-RSA-AES256-SHA256:DHE-RSA-AES256-SHA:ECDHE-ECDSA-DES-CBC3-SHA:ECDHE-RSA-DES-CBC3-SHA:EDH-RSA-DES-CBC3-SHA:AES128-GCM-SHA256:AES256-GCM-SHA384:AES128-SHA256:AES256-SHA256:AES128-SHA:AES256-SHA:DES-CBC3-SHA:!DSS
        ssl-default-server-options no-sslv3 no-tls-tickets
        #OLD SSL CONFIGURATION. IF RC4 is required or if you must support clients older then the precendent list, comment all the block between this comment and the precedent and uncomment the following line
        #ssl-default-bind-ciphers ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:DHE-DSS-AES128-GCM-SHA256:kEDH+AESGCM:ECDHE-RSA-AES128-SHA256:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA:ECDHE-ECDSA-AES128-SHA:ECDHE-RSA-AES256-SHA384:ECDHE-ECDSA-AES256-SHA384:ECDHE-RSA-AES256-SHA:ECDHE-ECDSA-AES256-SHA:DHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA:DHE-DSS-AES128-SHA256:DHE-RSA-AES256-SHA256:DHE-DSS-AES256-SHA:DHE-RSA-AES256-SHA:ECDHE-RSA-DES-CBC3-SHA:ECDHE-ECDSA-DES-CBC3-SHA:AES128-GCM-SHA256:AES256-GCM-SHA384:AES128-SHA256:AES256-SHA256:AES128-SHA:AES256-SHA:AES:CAMELLIA:DES-CBC3-SHA:!aNULL:!eNULL:!EXPORT:!DES:!RC4:!MD5:!PSK:!aECDH:!EDH-DSS-DES-CBC3-SHA:!EDH-RSA-DES-CBC3-SHA:!KRB5-DES-CBC3-SHA
        lua-load /usr/local/pf/var/conf/passthrough.lua

listen stats
  bind  192.168.8.2:1025
  mode http
  timeout connect 10s
  timeout client 1m
  timeout server 1m
  stats enable
  stats uri /stats
  stats realm HAProxy\ Statistics
  stats auth admin:packetfence


defaults
        log     global
        mode    http
        option  httplog
        option  dontlognull
        timeout connect 5000
        timeout client 50000
        timeout server 50000
        errorfile 403 /usr/local/pf/html/captive-portal/templates/rate-limiting.http

backend proxy
    option httpclose
    option http_proxy
    option forwardfor
    # Need to have a proxy listening on localhost port 8888
    acl paramsquery query -m found
    http-request set-uri http://127.0.0.1:8888%[path]?%[query] if paramsquery
    http-request set-uri http://127.0.0.1:8888%[path] unless paramsquery

backend static
    option httpclose
    option http_proxy
    option forwardfor
    http-request set-uri http://127.0.0.1:8889%[path]?%[query]

backend scep
    option httpclose
    option http_proxy
    option forwardfor
    http-request set-uri http://127.0.0.1:22225/api/v1%[path]?%[query]


frontend portal-http-192.168.203.1
        bind 192.168.203.1:80
        capture request header Host len 40
        stick-table type ip size 1m expire 10s store gpc0,http_req_rate(10s)
        tcp-request connection track-sc1 src
        http-request lua.change_host
        acl host_exist var(req.host) -m found
        http-request set-header Host %[var(req.host)] if host_exist
        http-request lua.select
        acl action var(req.action) -m found
        acl unflag_abuser src_clr_gpc0 --
        http-request allow if action unflag_abuser
        http-request deny if { src_get_gpc0 gt 0 }
        reqadd X-Forwarded-Proto:\ http
        use_backend %[var(req.action)]
        default_backend 192.168.203.1-backend


frontend portal-https-192.168.203.1
        bind 192.168.203.1:443 ssl no-sslv3 crt /usr/local/pf/conf/ssl/server.pem
        capture request header Host len 40
        stick-table type ip size 1m expire 10s store gpc0,http_req_rate(10s)
        tcp-request connection track-sc1 src
        http-request lua.change_host
        acl host_exist var(req.host) -m found
        http-request set-header Host %[var(req.host)] if host_exist
        http-request lua.select
        acl action var(req.action) -m found
        acl unflag_abuser src_clr_gpc0 --
        http-request allow if action unflag_abuser
        http-request deny if { src_get_gpc0 gt 0 }
        reqadd X-Forwarded-Proto:\ https
        use_backend %[var(req.action)]
        default_backend 192.168.203.1-backend



backend 192.168.203.1-backend
        balance source
        option httpchk GET /captive-portal HTTP/1.0\r\nUser-agent:\ HAPROXY-load-balancing-check
        default-server inter 5s fall 3 rise 2
        option httpclose
        option forwardfor
        acl status_501 status 501
        acl abuse  src_http_req_rate(portal-http-192.168.203.1) ge 48
        acl flag_abuser src_inc_gpc0(portal-http-192.168.203.1) --
        acl abuse  src_http_req_rate(portal-https-192.168.203.1) ge 48
        acl flag_abuser src_inc_gpc0(portal-https-192.168.203.1) --
        http-response deny if abuse status_501 flag_abuser
        server 127.0.0.1 127.0.0.1:80 check inter 30s


frontend portal-http-192.168.203.1
        bind 192.168.203.1:80
        capture request header Host len 40
        stick-table type ip size 1m expire 10s store gpc0,http_req_rate(10s)
        tcp-request connection track-sc1 src
        http-request lua.change_host
        acl host_exist var(req.host) -m found
        http-request set-header Host %[var(req.host)] if host_exist
        http-request lua.select
        acl action var(req.action) -m found
        acl unflag_abuser src_clr_gpc0 --
        http-request allow if action unflag_abuser
        http-request deny if { src_get_gpc0 gt 0 }
        reqadd X-Forwarded-Proto:\ http
        use_backend %[var(req.action)]
        default_backend 192.168.203.1-backend


frontend portal-https-192.168.203.1
        bind 192.168.203.1:443 ssl no-sslv3 crt /usr/local/pf/conf/ssl/server.pem
        capture request header Host len 40
        stick-table type ip size 1m expire 10s store gpc0,http_req_rate(10s)
        tcp-request connection track-sc1 src
        http-request lua.change_host
        acl host_exist var(req.host) -m found
        http-request set-header Host %[var(req.host)] if host_exist
        http-request lua.select
        acl action var(req.action) -m found
        acl unflag_abuser src_clr_gpc0 --
        http-request allow if action unflag_abuser
        http-request deny if { src_get_gpc0 gt 0 }
        reqadd X-Forwarded-Proto:\ https
        use_backend %[var(req.action)]
        default_backend 192.168.203.1-backend
<<<<

Here is my pf.conf:
>>>>
[general]
domain=XXXXXXX.de
hostname=portal
timezone=Europe/Berlin
[network]
dhcp_process_ipv6=disabled
interfaceSNAT=ens34
[fencing]
passthrough=disabled
interception_proxy=enabled
[database]
pass=XXXXXXX
[services]
radiusd_acct=enabled
httpd_collector=enabled
[inline]
ports_redirect=80/tcp,443/tcp,8080/tcp
interfaceSNAT=ens34
[captive_portal]
ip_address=192.168.203.1
network_detection_ip=192.168.203.1
network_redirect_delay=20s
[advanced]
language=de_DE
update_iplog_with_accounting=enabled
multihost=enabled
configurator=disabled
netflow_on_all_networks=enabled
[radius_configuration]
record_accounting_in_sql=disabled
[dns_configuration]
record_dns_in_sql=enabled
[interface ens32]
type=management
ip=192.168.8.2
mask=255.255.255.224
[interface ens33]
type=internal
enforcement=inlinel2
mask=255.255.255.0
ip=192.168.203.1
[interface ens34]
type=other,dns
ipv6_address=2003:00d4:1f49:6600:020c:29ff:fe31:e3b7
mask=255.255.255.224
ip=192.168.8.34
ipv6_prefix=64
<<<<

How can I fix the error?

Best regards,

Ronald Zestermann
SB System/Netzwerk


-----Original Message-----
From: Zammit, Ludovic via PacketFence-users 
<packetfence-users@lists.sourceforge.net>
Sent: Tuesday, September 7, 2021 14:34
To: packetfence-users@lists.sourceforge.net
Cc: Zammit, Ludovic <luza...@akamai.com>
Subject: Re: [PacketFence-users] haproxy portal

Hello Christopher,

If you did not touch the HAProxy config files, there is a good chance it's a 
misconfiguration of the IPs on the interfaces.

Check your server's IP configuration and then compare it to conf/pf.conf.

If you make any modifications in conf/pf.conf, don't forget to force-reload 
them with /usr/local/pf/bin/pfcmd configreload hard
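
For example, something along these lines (a sketch using only standard tools 
plus the pfcmd call above):

ip -br addr                                 # IPs actually configured on the interfaces
grep -A5 '^\[interface' /usr/local/pf/conf/pf.conf   # IPs PacketFence expects per interface
/usr/local/pf/bin/pfcmd configreload hard   # regenerate and reload after correcting pf.conf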

Thanks!


Ludovic Zammit
Product Support Engineer Principal
Akamai Technologies - Inverse


        On Aug 30, 2021, at 7:58 PM, Christopher Gilimete via PacketFence-users 
<packetfence-users@lists.sourceforge.net> wrote:

        Please help me with my setup: I cannot start the haproxy-portal 
service; it is stuck in a restart loop. Here is the message:

        Aug 31 10:56:04 covid systemd: Unit packetfence-haproxy-portal.service entered failed state.
        Aug 31 10:56:04 covid systemd: packetfence-haproxy-portal.service failed.
        Aug 31 10:56:04 covid systemd: packetfence-haproxy-portal.service holdoff time over, scheduling restart.
        Aug 31 10:56:04 covid systemd: Stopped PacketFence HAProxy Load Balancer for the captive portal.
        Aug 31 10:56:04 covid systemd: Starting PacketFence HAProxy Load Balancer for the captive portal...
        Aug 31 10:56:08 covid haproxy: [ALERT] 242/105608 (6806) : Error(s) found in configuration file : /usr/local/pf/var/conf/haproxy-portal.conf
        Aug 31 10:56:08 covid haproxy: [ALERT] 242/105608 (6806) : Fatal errors found in configuration.
        Aug 31 10:56:08 covid systemd: packetfence-haproxy-portal.service: main process exited, code=exited, status=1/FAILURE
        Aug 31 10:56:08 covid systemd: Failed to start PacketFence HAProxy Load Balancer for the captive portal.
        Aug 31 10:56:08 covid systemd: Unit packetfence-haproxy-portal.service entered failed state.
        Aug 31 10:56:08 covid systemd: packetfence-haproxy-portal.service failed.
        Aug 31 10:56:08 covid systemd: packetfence-haproxy-portal.service holdoff time over, scheduling restart.
        Aug 31 10:56:08 covid systemd: Stopped PacketFence HAProxy Load Balancer for the captive portal.
        Aug 31 10:56:08 covid systemd: Starting PacketFence HAProxy Load Balancer for the captive portal...
        Aug 31 10:56:11 covid haproxy: [ALERT] 242/105611 (6818) : Error(s) found in configuration file : /usr/local/pf/var/conf/haproxy-portal.conf
        Aug 31 10:56:11 covid haproxy: [ALERT] 242/105611 (6818) : Fatal errors found in configuration.
        Aug 31 10:56:11 covid systemd: packetfence-haproxy-portal.service: main process exited, code=exited, status=1/FAILURE
        Aug 31 10:56:11 covid systemd: Failed to start PacketFence HAProxy Load Balancer for the captive portal.
        Aug 31 10:56:11 covid systemd: Unit packetfence-haproxy-portal.service entered failed state.
        Aug 31 10:56:11 covid systemd: packetfence-haproxy-portal.service failed.
        Aug 31 10:56:12 covid systemd: packetfence-haproxy-portal.service holdoff time over, scheduling restart.
        Aug 31 10:56:12 covid systemd: Stopped PacketFence HAProxy Load Balancer for the captive portal.
        Aug 31 10:56:12 covid systemd: Starting PacketFence HAProxy Load Balancer for the captive portal...
        


_______________________________________________
PacketFence-users mailing list
PacketFence-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/packetfence-users
