GNU/Linux - An Introduction to UFW in Ubuntu

UFW - Uncomplicated Firewall
The default firewall configuration tool for Ubuntu is ufw. Developed to ease iptables firewall configuration, ufw provides a user-friendly way to create an IPv4 or IPv6 host-based firewall. By default, UFW is disabled.
gufw is a graphical frontend for ufw.
1. Basic Syntax and Examples
Default rules are fine for the average home user
When you turn UFW on, it uses a default set of rules (a profile) that should be fine for the average home user. That is at least the goal of the Ubuntu developers. In short, all incoming traffic is denied, with some exceptions to make things easier for home users.
Enable and Disable
To turn UFW on with the default set of rules:
sudo ufw enable
To check the status of UFW:
sudo ufw status verbose
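On a default configuration, the verbose status typically looks like the following (the exact wording may vary slightly between Ubuntu releases):

```shell
sudo ufw status verbose
# Typical output after enabling with the defaults:
# Status: active
# Logging: on (low)
# Default: deny (incoming), allow (outgoing), disabled (routed)
# New profiles: skip
```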
Note that by default, deny is applied to incoming traffic. There are exceptions, which can be found in the output of this command:
sudo ufw show raw
You can also read the rules files in /etc/ufw (the files whose names end with .rules).
Disable UFW
To disable ufw use:
sudo ufw disable
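If you want to go further than disabling and also discard every rule you have added, ufw ships a reset command that disables the firewall and restores the default rule files (it keeps timestamped backups of the old ones in /etc/ufw):

```shell
# Disable ufw and restore the installation defaults;
# existing rule files are backed up before being replaced.
sudo ufw reset
```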
Allow and Deny (specific rules)
Allow
sudo ufw allow <port>/<optional: protocol>
example: To allow incoming tcp and udp packets on port 53
sudo ufw allow 53
example: To allow incoming tcp packets on port 53
sudo ufw allow 53/tcp
example: To allow incoming udp packets on port 53
sudo ufw allow 53/udp
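Rules are not limited to bare ports: ufw also accepts a richer from/to syntax, so a port can be restricted to a trusted source. Two illustrative examples (the addresses below are placeholders, not values from this document):

```shell
# Allow ssh (22/tcp) only from one trusted host
sudo ufw allow from 203.0.113.10 to any port 22 proto tcp

# Allow a whole subnet to reach port 3306 (e.g. LAN database clients)
sudo ufw allow from 192.168.0.0/24 to any port 3306
```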
Deny
sudo ufw deny <port>/<optional: protocol>
example: To deny incoming tcp and udp packets on port 53
sudo ufw deny 53
example: To deny incoming tcp packets on port 53
sudo ufw deny 53/tcp
example: To deny incoming udp packets on port 53
sudo ufw deny 53/udp
Delete Existing Rule
To delete a rule, simply prefix the original rule with delete. For example, if the original rule was:
sudo ufw deny 80/tcp
Use this to delete it:
sudo ufw delete deny 80/tcp
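When a rule is hard to retype exactly, it is often easier to delete it by number: list the rules with an index first, then delete by that index (the number below is illustrative, always check your own listing first):

```shell
# Show the current rules with an index column
sudo ufw status numbered

# Delete rule number 2 from that listing
sudo ufw delete 2
```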
Services
You can also allow or deny by service name, since ufw reads /etc/services to get a list of services:
less /etc/services
Allow by Service Name
sudo ufw allow <service name>
example: to allow ssh by name
sudo ufw allow ssh
Deny by Service Name
sudo ufw deny <service name>
example: to deny ssh by name
sudo ufw deny ssh
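Besides the names in /etc/services, many packages also install ufw application profiles (under /etc/ufw/applications.d), which can be used the same way:

```shell
# List the application profiles available on this system
sudo ufw app list

# Allow by profile name, e.g. the profile shipped by openssh-server
sudo ufw allow OpenSSH
```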
Status
Checking the status of ufw will tell you if ufw is enabled or disabled and also list the current ufw rules that are applied to your iptables.
To check the status of ufw:
sudo ufw status
If ufw is not enabled, the output will be:
sudo ufw status
Status: inactive
Logging
To enable logging use:
sudo ufw logging on
To disable logging use:
sudo ufw logging off
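The plain "on" form enables logging at the default low level; ufw also accepts an explicit level argument. From low to full, each level logs progressively more (full can be very noisy):

```shell
# Valid levels: off, low, medium, high, full
sudo ufw logging medium
```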
Check UFW Logs:
sudo less /var/log/ufw.log
UFW messages are also written to the kernel log, so you can, for example, filter them out of dmesg:
sudo dmesg | grep '\[UFW'
[slurm-gb200-218-151:110182] [ 9] /opt/hpcx/ompi/lib/libmpi.so.40(ompi_mpi_init+0x1190)[0xf7d776fd5750] [slurm-gb200-218-151:110182] [10] /opt/hpcx/ompi/lib/libmpi.so.40(MPI_Init+0x78)[0xf7d776f7eae8] [slurm-gb200-218-151:110182] [11] /opt/nccl_tests/build/all_reduce_perf(+0x3104)[0xbd73c6e53104] [slurm-gb200-218-151:110182] [12] /usr/lib/aarch64-linux-gnu/libc.so.6(+0x273fc)[0xf7d7678b73fc] [slurm-gb200-218-151:110182] [13] /usr/lib/aarch64-linux-gnu/libc.so.6(__libc_start_main+0x98)[0xf7d7678b74cc] [slurm-gb200-218-151:110182] [14] /opt/nccl_tests/build/all_reduce_perf(+0x5b70)[0xbd73c6e55b70] [slurm-gb200-218-151:110182] *** End of error message *** srun: error: slurm-gb200-219-003: task 31: Segmentation fault srun: error: slurm-gb200-218-151: task 15: Segmentation fault
32 min_spare_servers = 3 max_spare_servers = 10 max_requests_per_server = 0 cleanup_delay = 5 max_queue_size = 65536 auto_limit_acct = no } Thread spawned new child 1. Total threads in pool: 1 Thread spawned new child 2. Total threads in pool: 2 Thread spawned new child 3. Total threads in pool: 3 Thread spawned new child 4. Total threads in pool: 4 Thread spawned new child 5. Total threads in pool: 5 Thread pool initialized radiusd: #### Opening IP addresses and Ports #### listen { type = "auth+acct" virtual_server = "default" ipaddr = 192.168.0.105 port = 2083 proto = "tcp" tls { verify_depth = 0 ca_path = "/etc/raddb/certs" pem_file_type = yes private_key_file = "/etc/raddb/certs/server.pem" certificate_file = "/etc/raddb/certs/server.pem" ca_file = "/etc/raddb/certs/ca.pem" private_key_password = <<< secret >>> dh_file = "/etc/raddb/certs/dh" fragment_size = 8192 include_length = yes auto_chain = yes check_crl = no check_all_crl = no cipher_list = "DEFAULT" cipher_server_preference = no require_client_cert = yes ecdh_curve = "prime256v1" cache { enable = no lifetime = 24 max_entries = 255 } verify { skip_if_ocsp_ok = no } ocsp { enable = no override_cert_url = no use_nonce = yes timeout = 0 softfail = no } } limit { max_connections = 16 lifetime = 0 idle_timeout = 30 } Thread 4 waiting to be assigned a request Thread 5 waiting to be assigned a request Thread 1 waiting to be assigned a request Thread 2 waiting to be assigned a request Thread 3 waiting to be assigned a request Failed binding to auth+acct address 192.168.0.105 port 2083 (TLS) bound to server default: Cannot assign requested address /etc/raddb/sites-enabled/tls[7]: Error binding to port for 192.168.0.105 port 2083 [root@localhost sites-enabled]# 为什么会这样