Fri 2023-01-27 16:05:27 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-logind[1959]: The system will reboot now!
Fri 2023-01-27 16:05:27 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-logind[1959]: System is rebooting.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping Session 2 of User root...
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Removed slice Slice /system/modprobe.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Removed slice Slice /system/sshd-keygen.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com sshd[4148]: pam_unix(sshd:session): session closed for user root
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Removed slice Slice /system/systemd-hibernate-resume.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Multi-User System.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Login Prompts.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target rpc_pipefs.target.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target RPC Port Mapper.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Timer Units.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: dnf-makecache.timer: Deactivated successfully.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped dnf makecache --timer.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: logrotate.timer: Deactivated successfully.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Daily rotation of log files.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-tmpfiles-clean.timer: Deactivated successfully.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Daily Cleanup of Temporary Directories.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: lvm2-lvmpolld.socket: Deactivated successfully.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Closed LVM2 poll daemon socket.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-coredump.socket: Deactivated successfully.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Closed Process Core Dump Socket.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-rfkill.socket: Deactivated successfully.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Closed Load/Save RF Kill Switch Status /dev/rfkill Watch.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Unmounting RPC Pipe File System...
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com chronyd[227203]: chronyd exiting
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping NTP client/server...
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping Command Scheduler...
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com crond[3028]: (CRON) INFO (Shutting down)
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping Restore /run/initramfs on shutdown...
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping Getty on tty1...
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Grub2 systemctl reboot --boot-loader-menu=... support was skipped because of an unmet condition check (ConditionPathExists=/run/systemd/reboot-to-boot-loader-menu).
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping irqbalance daemon...
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping Authorization Manager...
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping The restraint harness....
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com restraintd[3031]: restraintd quit on received signal: Terminated(15)
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping System Logging Service...
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com restraintd[3031]: [*] Stopping mainloop
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping Serial Getty on ttyS1...
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping OpenSSH server daemon...
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com sshd[2002]: Received signal 15; terminating.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping Hostname Service...
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com rsyslogd[1953]: [origin software="rsyslogd" swVersion="8.2102.0-109.el9" x-pid="1953" x-info="https://www.rsyslog.com"] exiting on signal 15.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: sda1: Can't mount, would change RO state
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping Load/Save Random Seed...
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping Dynamic System Tuning Daemon...
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: irqbalance.service: Deactivated successfully.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped irqbalance daemon.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: irqbalance.service: Consumed 20.360s CPU time.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: polkit.service: Deactivated successfully.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Authorization Manager.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: rsyslog.service: Deactivated successfully.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped System Logging Service.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: rsyslog.service: Consumed 15.768s CPU time.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: sshd.service: Deactivated successfully.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped OpenSSH server daemon.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: getty@tty1.service: Deactivated successfully.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Getty on tty1.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: serial-getty@ttyS1.service: Deactivated successfully.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Serial Getty on ttyS1.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: crond.service: Deactivated successfully.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Command Scheduler.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: crond.service: Consumed 1.837s CPU time.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: restraintd.service: Deactivated successfully.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: restraintd.service: Unit process 3037 (10_bash_login) remains running after unit stopped.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: restraintd.service: Unit process 3091 (runtest.sh) remains running after unit stopped.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: restraintd.service: Unit process 300086 (sleep) remains running after unit stopped.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped The restraint harness..
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: restraintd.service: Consumed 1min 12.183s CPU time.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: chronyd.service: Deactivated successfully.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped NTP client/server.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Hostname Service.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Unmounted RPC Pipe File System.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-random-seed.service: Deactivated successfully.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Load/Save Random Seed.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: session-2.scope: Deactivated successfully.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Session 2 of User root.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: session-2.scope: Consumed 49min 31.626s CPU time.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-logind[1959]: Session 2 logged out. Waiting for processes to exit.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Removed slice Slice /system/getty.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Removed slice Slice /system/serial-getty.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target sshd-keygen.target.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target System Time Synchronized.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target System Time Set.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping User Login Management...
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping Permit User Sessions...
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping User Manager for UID 0...
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2040]: Activating special unit Exit the Session...
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2040]: Removed slice User Background Tasks Slice.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2040]: Stopped target Main User Target.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2040]: Stopped target Basic System.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2040]: Stopped target Paths.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2040]: Stopped target Sockets.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2040]: Stopped target Timers.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2040]: Stopped Daily Cleanup of User's Temporary Directories.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-logind[1959]: Removed session 2.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2040]: Closed D-Bus User Message Bus Socket.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2040]: Closed PipeWire Multimedia System Socket.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2040]: Stopped Create User's Volatile Files and Directories.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2040]: Removed slice User Application Slice.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2040]: Reached target Shutdown.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2040]: Finished Exit the Session.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2040]: Reached target Exit the Session.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: user@0.service: Deactivated successfully.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped User Manager for UID 0.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: user@0.service: Consumed 22.576s CPU time.
Fri 2023-01-27 16:05:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-logind.service: Deactivated successfully.
Fri 2023-01-27 16:05:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped User Login Management.
Fri 2023-01-27 16:05:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-logind.service: Consumed 2min 20.305s CPU time.
Fri 2023-01-27 16:05:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-user-sessions.service: Deactivated successfully.
Fri 2023-01-27 16:05:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Permit User Sessions.
Fri 2023-01-27 16:05:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target User and Group Name Lookups.
Fri 2023-01-27 16:05:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Remote File Systems.
Fri 2023-01-27 16:05:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Unmounting /var/crash...
Fri 2023-01-27 16:05:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping User Runtime Directory /run/user/0...
Fri 2023-01-27 16:05:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-user-0.mount: Deactivated successfully.
Fri 2023-01-27 16:05:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Unmounted /run/user/0.
Fri 2023-01-27 16:05:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Fri 2023-01-27 16:05:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped User Runtime Directory /run/user/0.
Fri 2023-01-27 16:05:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: var-crash.mount: Deactivated successfully.
Fri 2023-01-27 16:05:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Unmounted /var/crash.
Fri 2023-01-27 16:05:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Removed slice User Slice of UID 0.
Fri 2023-01-27 16:05:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: user-0.slice: Consumed 49min 54.429s CPU time.
Fri 2023-01-27 16:05:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Network is Online.
Fri 2023-01-27 16:05:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Preparation for Remote File Systems.
Fri 2023-01-27 16:05:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target NFS client services.
Fri 2023-01-27 16:05:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Fri 2023-01-27 16:05:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Network Manager Wait Online.
Fri 2023-01-27 16:05:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping GSSAPI Proxy Daemon...
Fri 2023-01-27 16:05:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: gssproxy.service: Deactivated successfully.
Fri 2023-01-27 16:05:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped GSSAPI Proxy Daemon.
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: dracut-shutdown.service: Deactivated successfully.
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Restore /run/initramfs on shutdown.
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: dracut-shutdown.service: Consumed 9.990s CPU time.
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: tuned.service: Deactivated successfully.
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Dynamic System Tuning Daemon.
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: tuned.service: Consumed 31.623s CPU time.
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Network.
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping Network Manager...
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1945]: [1674853535.7425] caught SIGTERM, shutting down normally.
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1945]: [1674853535.7684] dhcp4 (eno1): canceled DHCP transaction
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1945]: [1674853535.7687] dhcp4 (eno1): activation: beginning transaction (timeout in 45 seconds)
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1945]: [1674853535.7688] dhcp4 (eno1): state changed no lease
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1945]: [1674853535.7699] manager: NetworkManager state is now CONNECTED_SITE
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Requested transaction contradicts existing jobs: Transaction for NetworkManager-dispatcher.service/start is destructive (dbus.socket has 'stop' job queued, but 'start' is included in transaction).
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1945]: [1674853535.8407] exiting (success)
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: NetworkManager (1945) used greatest stack depth: 10160 bytes left
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: NetworkManager.service: Deactivated successfully.
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Network Manager.
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: NetworkManager.service: Consumed 11.064s CPU time.
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Basic System.
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Preparation for Network.
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Path Units.
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Slice Units.
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Removed slice User and Session Slice.
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: user.slice: Consumed 49min 54.429s CPU time.
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Socket Units.
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: sssd-kcm.socket: Deactivated successfully.
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Closed SSSD Kerberos Cache Manager responder socket.
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping D-Bus System Message Bus...
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com dbus-broker[1944]: Dispatched 35626 messages @ 13(±35)μs / message.
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-network-generator.service: Deactivated successfully.
Fri 2023-01-27 16:05:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Generate network units from Kernel command line.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: dbus-broker.service: Deactivated successfully.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped D-Bus System Message Bus.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: dbus-broker.service: Consumed 6.788s CPU time.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: dbus.socket: Deactivated successfully.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Closed D-Bus System Message Bus Socket.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target System Initialization.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: proc-sys-fs-binfmt_misc.automount: Deactivated successfully.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Unset automount Arbitrary Executable File Formats File System Automount Point.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Local Encrypted Volumes.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-ask-password-wall.path: Deactivated successfully.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Forward Password Requests to Wall Directory Watch.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Local Integrity Protected Volumes.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Swaps.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Local Verity Protected Volumes.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Deactivating swap /dev/cs_hpe-ml350gen9-01/swap...
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: nis-domainname.service: Deactivated successfully.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Read and set NIS domainname from /etc/sysconfig/network.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-boot-update.service: Deactivated successfully.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Automatic Boot Loader Update.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-sysctl.service: Deactivated successfully.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Apply Kernel Variables.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping Record System Boot/Shutdown in UTMP...
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: proc-sys-fs-binfmt_misc.mount: Deactivated successfully.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Unmounted /run/credentials/systemd-sysctl.service.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: dev-disk-by\x2did-dm\x2duuid\x2dLVM\x2dDjnXOGCG8jOT2tk2kE67kWgo09lsLqGw5s0zpUlRVeZ3VfyfvFdT2GzTOLW9H3Go.swap: Deactivated successfully.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Deactivated swap /dev/disk/by-id/dm-uuid-LVM-DjnXOGCG8jOT2tk2kE67kWgo09lsLqGw5s0zpUlRVeZ3VfyfvFdT2GzTOLW9H3Go.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: dev-disk-by\x2did-dm\x2dname\x2dcs_hpe\x2d\x2dml350gen9\x2d\x2d01\x2dswap.swap: Deactivated successfully.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Deactivated swap /dev/disk/by-id/dm-name-cs_hpe--ml350gen9--01-swap.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: dev-cs_hpe\x2dml350gen9\x2d01-swap.swap: Deactivated successfully.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Deactivated swap /dev/cs_hpe-ml350gen9-01/swap.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: dev-disk-by\x2duuid-5e10a62a\x2dd27f\x2d4e50\x2d9ad9\x2ded8b9f04e2c1.swap: Deactivated successfully.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Deactivated swap /dev/disk/by-uuid/5e10a62a-d27f-4e50-9ad9-ed8b9f04e2c1.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: dev-dm\x2d1.swap: Deactivated successfully.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Deactivated swap /dev/dm-1.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: dev-mapper-cs_hpe\x2d\x2dml350gen9\x2d\x2d01\x2dswap.swap: Deactivated successfully.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Deactivated swap /dev/mapper/cs_hpe--ml350gen9--01-swap.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-update-utmp.service: Deactivated successfully.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Record System Boot/Shutdown in UTMP.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping Security Auditing Service...
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com auditd[1906]: The audit daemon is exiting.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: audit: type=1305 audit(1674853536.457:849): op=set audit_pid=0 old=1906 auid=4294967295 ses=4294967295 subj=system_u:system_r:auditd_t:s0 res=1
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: auditd.service: Deactivated successfully.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Security Auditing Service.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: auditd.service: Consumed 1.032s CPU time.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: audit: type=1131 audit(1674853536.506:850): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:init_t:s0 msg='unit=auditd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Create Volatile Files and Directories.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: audit: type=1131 audit(1674853536.554:851): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:init_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Local File Systems.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Unmounting /boot...
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Unmounting /home...
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Unmounting /run/credentials/systemd-tmpfiles-setup.service...
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: XFS (dm-2): Unmounting Filesystem
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Unmounting /run/credentials/systemd-tmpfiles-setup-dev.service...
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com umount[300216]: umount: /run/credentials/systemd-tmpfiles-setup.service: no mount point specified.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Mount process exited, code=exited, status=32/n/a
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Failed with result 'exit-code'.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Unmounted /run/credentials/systemd-tmpfiles-setup.service.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: XFS (sda1): Unmounting Filesystem
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Fri 2023-01-27 16:05:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Unmounted /run/credentials/systemd-tmpfiles-setup-dev.service.
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: home.mount: Deactivated successfully.
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Unmounted /home.
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: boot.mount: Deactivated successfully.
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Unmounted /boot.
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Preparation for Local File Systems.
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Unmount All Filesystems.
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-remount-fs.service: Deactivated successfully.
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Remount Root and Kernel File Systems.
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: audit: type=1131 audit(1674853537.104:852): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:init_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Create Static Device Nodes in /dev.
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: audit: type=1131 audit(1674853537.108:853): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:init_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com lvm[300220]: 3 logical volume(s) in volume group "cs_hpe-ml350gen9-01" unmonitored
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: lvm2-monitor.service: Deactivated successfully.
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: audit: type=1131 audit(1674853537.468:854): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:init_t:s0 msg='unit=lvm2-monitor comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target System Shutdown.
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Late Shutdown Services.
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-reboot.service: Deactivated successfully.
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished System Reboot.
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: audit: type=1130 audit(1674853537.488:855): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:init_t:s0 msg='unit=systemd-reboot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: audit: type=1131 audit(1674853537.488:856): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:init_t:s0 msg='unit=systemd-reboot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target System Reboot.
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Shutting down.
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: audit: type=1334 audit(1674853537.506:857): prog-id=0 op=UNLOAD
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: audit: type=1334 audit(1674853537.506:858): prog-id=0 op=UNLOAD
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Using hardware watchdog 'HPE iLO2+ HW Watchdog Timer', version 0, device /dev/watchdog0
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: watchdog: watchdog0: watchdog did not stop!
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Watchdog running with a timeout of 10min.
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-shutdown[1]: Using hardware watchdog 'HPE iLO2+ HW Watchdog Timer', version 0, device /dev/watchdog0
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-shutdown[1]: Watchdog running with a timeout of 10min.
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-shutdown[1]: Syncing filesystems and block devices.
Fri 2023-01-27 16:05:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-shutdown[1]: Sending SIGTERM to remaining processes...
Fri 2023-01-27 16:05:38 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-journald[1629]: Received SIGTERM from PID 1 (systemd-shutdow).
Fri 2023-01-27 16:05:38 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com rngd[61081]: [hwrng ]: Shutting down
Fri 2023-01-27 16:05:38 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com rngd[61081]: [rdrand]: Shutting down
Fri 2023-01-27 16:05:38 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com rngd[61081]: [jitter]: Shutting down
Fri 2023-01-27 16:05:38 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-journald[1629]: Journal stopped
-- Boot 5a6cfe1bd77e452283065caca85a1cf6 --
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: microcode: microcode updated early to revision 0x49, date = 2021-08-11
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Linux version 5.14.0-246.rt14.245.1955_759844798.el9.x86_64+debug (root@runner-ia7yd-k9-project-18194050-concurrent-0) (gcc (GCC) 11.3.1 20221121 (Red Hat 11.3.1-4), GNU ld version 2.35.2-33.el9) #1 SMP PREEMPT_RT Fri Jan 27 16:35:10 UTC 2023
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/vmlinuz-5.14.0-246.rt14.245.1955_759844798.el9.x86_64+debug root=/dev/mapper/cs_hpe--ml350gen9--01-root ro efi=runtime resume=/dev/mapper/cs_hpe--ml350gen9--01-swap rd.lvm.lv=cs_hpe-ml350gen9-01/root rd.lvm.lv=cs_hpe-ml350gen9-01/swap console=ttyS1,115200n81 crashkernel=1G-2G:384M,2G-3G:512M,3G-4G:768M,4G-16G:1G,16G-64G:2G,64G-128G:2G,128G-:4G
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: signal: max sigframe size: 1776
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: BIOS-provided physical RAM map:
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: BIOS-e820: [mem 0x0000000000000000-0x0000000000093fff] usable
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: BIOS-e820: [mem 0x0000000000094000-0x000000000009ffff] reserved
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: BIOS-e820: [mem 0x00000000000e0000-0x00000000000fffff] reserved
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: BIOS-e820: [mem 0x0000000000100000-0x000000005a650fff] usable
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: BIOS-e820: [mem 0x000000005a651000-0x000000005b4c0fff] reserved
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: BIOS-e820: [mem 0x000000005b4c1000-0x00000000790fefff] usable
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: BIOS-e820: [mem 0x00000000790ff000-0x00000000791fefff] reserved
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: BIOS-e820: [mem 0x00000000791ff000-0x000000007b5fefff] ACPI NVS
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: BIOS-e820: [mem 0x000000007b5ff000-0x000000007b7fefff] ACPI data
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: BIOS-e820: [mem 0x000000007b7ff000-0x000000007b7fffff] usable
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: BIOS-e820: [mem 0x000000007b800000-0x000000008fffffff] reserved
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: BIOS-e820: [mem 0x00000000ff800000-0x00000000ffffffff] reserved
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: BIOS-e820: [mem 0x0000000100000000-0x0000000c7fffffff] usable
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: NX (Execute Disable) protection: active
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SMBIOS 2.8 present.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMI: HP ProLiant ML350 Gen9/ProLiant ML350 Gen9, BIOS P92 10/21/2019
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tsc: Fast TSC calibration failed
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: last_pfn = 0xc80000 max_arch_pfn = 0x400000000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: last_pfn = 0x7b800 max_arch_pfn = 0x400000000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Using GB pages for direct mapping
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: RAMDISK: [mem 0x33e69000-0x35f2cfff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Early table checksum verification disabled
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: RSDP 0x00000000000F4F00 000024 (v02 HP )
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: XSDT 0x000000007B7E8188 0000EC (v01 HP ProLiant 00000001 01000013)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: FACP 0x000000007B7F6000 00010C (v05 HP ProLiant 00000001 HP 00000001)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: DSDT 0x000000007B7D8000 00F6ED (v02 HP DSDT 00000002 HPAG 00020000)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: FACS 0x000000007B4A1000 000040
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: UEFI 0x000000007B4D9000 000042 (v01 HP ProLiant 00000002 01000013)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: MCEJ 0x000000007B7FC000 000130 (v01 HP ProLiant 00000001 INTL 0100000D)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: SSDT 0x000000007B7FB000 000064 (v02 HP SpsNvs 00000002 INTL 20130328)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: HEST 0x000000007B7FA000 0000A8 (v01 HP ProLiant 00000001 INTL 00000001)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: BERT 0x000000007B7F9000 000030 (v01 HP ProLiant 00000001 INTL 00000001)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: ERST 0x000000007B7F8000 000230 (v01 HP ProLiant 00000001 INTL 00000001)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: EINJ 0x000000007B7F7000 000150 (v01 HP ProLiant 00000001 INTL 00000001)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: HPET 0x000000007B7F5000 000038 (v01 HP ProLiant 00000001 HP 00000001)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: PMCT 0x000000007B7F4000 000064 (v01 HP ProLiant 00000001 HP 00000001)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: WDDT 0x000000007B7F3000 000040 (v01 HP ProLiant 00000001 HP 00000001)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: APIC 0x000000007B7F2000 00084A (v03 HP ProLiant 00000001 HP 00000001)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: MCFG 0x000000007B7F1000 00003C (v01 HP ProLiant 00000001 HP 00000001)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: SLIT 0x000000007B7F0000 000030 (v01 HP ProLiant 00000001 HP 00000001)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: SRAT 0x000000007B7EF000 000790 (v03 HP ProLiant 00000001 HP 00000001)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: SPMI 0x000000007B7EE000 000041 (v05 HP ProLiant 00000001 HP 00000001)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: RASF 0x000000007B7ED000 000030 (v01 HP ProLiant 00000001 HP 00000001)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: SPCR 0x000000007B7EC000 000050 (v01 HP ProLiant 00000001 HP 00000001)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: MSCT 0x000000007B7EB000 000064 (v01 HP ProLiant 00000001 HP 00000001)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: BDAT 0x000000007B7EA000 000030 (v01 HP ProLiant 00000001 HP 00000001)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: PCCT 0x000000007B7E9000 00006E (v01 HP ProLiant 00000001 HP 00000001)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: SSDT 0x000000007B7D0000 007116 (v02 HP PCISSDT 00000002 HPAG 00020000)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: SSDT 0x000000007B7CF000 0001CB (v02 HP TIMESSDT 00000002 HPAG 00020000)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: SSDT 0x000000007B7CE000 0002F2 (v01 HP pmab 00000001 INTL 20130328)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: DMAR 0x000000007B7FD000 0002D8 (v01 INTEL INTEL ID 00000001 ? 00000001)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving FACP table memory at [mem 0x7b7f6000-0x7b7f610b]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving DSDT table memory at [mem 0x7b7d8000-0x7b7e76ec]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving FACS table memory at [mem 0x7b4a1000-0x7b4a103f]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving UEFI table memory at [mem 0x7b4d9000-0x7b4d9041]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving MCEJ table memory at [mem 0x7b7fc000-0x7b7fc12f]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving SSDT table memory at [mem 0x7b7fb000-0x7b7fb063]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving HEST table memory at [mem 0x7b7fa000-0x7b7fa0a7]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving BERT table memory at [mem 0x7b7f9000-0x7b7f902f]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving ERST table memory at [mem 0x7b7f8000-0x7b7f822f]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving EINJ table memory at [mem 0x7b7f7000-0x7b7f714f]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving HPET table memory at [mem 0x7b7f5000-0x7b7f5037]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving PMCT table memory at [mem 0x7b7f4000-0x7b7f4063]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving WDDT table memory at [mem 0x7b7f3000-0x7b7f303f]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving APIC table memory at [mem 0x7b7f2000-0x7b7f2849]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving MCFG table memory at [mem 0x7b7f1000-0x7b7f103b]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving SLIT table memory at [mem 0x7b7f0000-0x7b7f002f]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving SRAT table memory at [mem 0x7b7ef000-0x7b7ef78f]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving SPMI table memory at [mem 0x7b7ee000-0x7b7ee040]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving RASF table memory at [mem 0x7b7ed000-0x7b7ed02f]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving SPCR table memory at [mem 0x7b7ec000-0x7b7ec04f]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving MSCT table memory at [mem 0x7b7eb000-0x7b7eb063]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving BDAT table memory at [mem 0x7b7ea000-0x7b7ea02f]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving PCCT table memory at [mem 0x7b7e9000-0x7b7e906d]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving SSDT table memory at [mem 0x7b7d0000-0x7b7d7115]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving SSDT table memory at [mem 0x7b7cf000-0x7b7cf1ca]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving SSDT table memory at [mem 0x7b7ce000-0x7b7ce2f1]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Reserving DMAR table memory at [mem 0x7b7fd000-0x7b7fd2d7]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0000 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0002 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0004 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0006 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0008 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0010 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0012 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0014 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0016 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0020 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0022 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0024 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0026 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0028 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0030 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0032 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0034 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0036 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0040 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0042 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0044 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0046 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0048 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0050 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0052 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0054 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0056 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0060 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0062 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0064 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0066 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0068 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0070 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0072 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0074 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0076 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0001 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0003 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0005 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0007 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0009 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0011 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0013 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0015 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0017 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0021 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0023 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0025 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0027 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0029 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0031 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0033 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0035 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 0 -> APIC 0x0037 -> Node 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0041 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0043 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0045 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0047 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0049 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0051 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0053 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0055 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0057 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0061 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0063 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0065 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0067 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0069 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0071 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0073 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0075 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SRAT: PXM 1 -> APIC 0x0077 -> Node 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x47fffffff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x480000000-0x67fffffff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: SRAT: Node 1 PXM 1 [mem 0x680000000-0xa7fffffff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: SRAT: Node 1 PXM 1 [mem 0xa80000000-0xc7fffffff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: NUMA: Initialized distance table, cnt=2
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: NUMA: Node 0 [mem 0x00000000-0x47fffffff] + [mem 0x480000000-0x67fffffff] -> [mem 0x00000000-0x67fffffff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: NUMA: Node 1 [mem 0x680000000-0xa7fffffff] + [mem 0xa80000000-0xc7fffffff] -> [mem 0x680000000-0xc7fffffff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: NODE_DATA(0) allocated [mem 0x67ffd4000-0x67fffffff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: NODE_DATA(1) allocated [mem 0xc7ffd3000-0xc7fffefff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Reserving 256MB of low memory at 1680MB for crashkernel (low RAM limit: 4096MB)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Reserving 2048MB of memory at 49136MB for crashkernel (System RAM: 49026MB)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Zone ranges:
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Normal [mem 0x0000000100000000-0x0000000c7fffffff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Device empty
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Movable zone start for each node
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Early memory node ranges
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: node 0: [mem 0x0000000000001000-0x0000000000093fff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: node 0: [mem 0x0000000000100000-0x000000005a650fff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: node 0: [mem 0x000000005b4c1000-0x00000000790fefff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: node 0: [mem 0x000000007b7ff000-0x000000007b7fffff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: node 0: [mem 0x0000000100000000-0x000000067fffffff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: node 1: [mem 0x0000000680000000-0x0000000c7fffffff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000067fffffff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Initmem setup node 1 [mem 0x0000000680000000-0x0000000c7fffffff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: On node 0, zone DMA: 108 pages in unavailable ranges
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: On node 0, zone DMA32: 3696 pages in unavailable ranges
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: On node 0, zone DMA32: 9984 pages in unavailable ranges
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: On node 0, zone Normal: 18432 pages in unavailable ranges
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: PM-Timer IO Port: 0x408
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x00] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x01] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x02] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x03] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x04] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x05] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x06] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x07] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x08] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x09] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x10] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x11] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x12] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x13] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x14] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x15] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x16] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x17] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x20] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x21] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x22] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x23] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x24] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x25] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x26] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x27] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x28] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x29] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x30] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x31] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x32] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x33] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x34] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x35] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x36] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x37] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x40] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x41] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x42] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x43] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x44] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x45] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x46] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x47] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x48] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x49] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x50] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x51] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x52] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x53] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x54] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x55] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x56] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x57] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x60] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x61] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x62] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x63] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x64] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x65] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x66] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x67] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x68] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x69] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x70] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x71] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x72] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x73] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x74] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x75] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x76] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: X2APIC_NMI (uid[0x77] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: LAPIC_NMI (acpi_id[0xff] high level lint[0x1])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: IOAPIC[0]: apic_id 8, version 32, address 0xfec00000, GSI 0-23
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: IOAPIC[1]: apic_id 9, version 32, address 0xfec01000, GSI 24-47
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: IOAPIC[2]: apic_id 10, version 32, address 0xfec40000, GSI 48-71
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: HPET id: 0x8086a701 base: 0xfed00000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: SPCR: SPCR table version 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: SPCR: console: uart,mmio,0x0,115200
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: TSC deadline timer available
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: smpboot: Allowing 72 CPUs, 0 hotplug CPUs
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PM: hibernation: Registered nosave memory: [mem 0x00094000-0x0009ffff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000dffff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PM: hibernation: Registered nosave memory: [mem 0x000e0000-0x000fffff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PM: hibernation: Registered nosave memory: [mem 0x5a651000-0x5b4c0fff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PM: hibernation: Registered nosave memory: [mem 0x790ff000-0x791fefff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PM: hibernation: Registered nosave memory: [mem 0x791ff000-0x7b5fefff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PM: hibernation: Registered nosave memory: [mem 0x7b5ff000-0x7b7fefff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PM: hibernation: Registered nosave memory: [mem 0x7b800000-0x8fffffff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PM: hibernation: Registered nosave memory: [mem 0x90000000-0xff7fffff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PM: hibernation: Registered nosave memory: [mem 0xff800000-0xffffffff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: [mem 0x90000000-0xff7fffff] available for PCI devices
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Booting paravirtualized kernel on bare hardware
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:72 nr_cpu_ids:72 nr_node_ids:2
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: percpu: Embedded 506 pages/cpu s2035712 r8192 d28672 u2097152
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcpu-alloc: s2035712 r8192 d28672 u2097152 alloc=1*2097152
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcpu-alloc: [0] 00 [0] 01 [0] 02 [0] 03 [0] 04 [0] 05 [0] 06 [0] 07
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcpu-alloc: [0] 08 [0] 09 [0] 10 [0] 11 [0] 12 [0] 13 [0] 14 [0] 15
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcpu-alloc: [0] 16 [0] 17 [0] 36 [0] 37 [0] 38 [0] 39 [0] 40 [0] 41
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcpu-alloc: [0] 42 [0] 43 [0] 44 [0] 45 [0] 46 [0] 47 [0] 48 [0] 49
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcpu-alloc: [0] 50 [0] 51 [0] 52 [0] 53 [1] 18 [1] 19 [1] 20 [1] 21
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcpu-alloc: [1] 22 [1] 23 [1] 24 [1] 25 [1] 26 [1] 27 [1] 28 [1] 29
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcpu-alloc: [1] 30 [1] 31 [1] 32 [1] 33 [1] 34 [1] 35 [1] 54 [1] 55
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcpu-alloc: [1] 56 [1] 57 [1] 58 [1] 59 [1] 60 [1] 61 [1] 62 [1] 63
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcpu-alloc: [1] 64 [1] 65 [1] 66 [1] 67 [1] 68 [1] 69 [1] 70 [1] 71
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Fallback order for Node 0: 0 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Fallback order for Node 1: 1 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Built 2 zonelists, mobility grouping on. Total pages: 12354437
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Policy zone: Normal
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/vmlinuz-5.14.0-246.rt14.245.1955_759844798.el9.x86_64+debug root=/dev/mapper/cs_hpe--ml350gen9--01-root ro efi=runtime resume=/dev/mapper/cs_hpe--ml350gen9--01-swap rd.lvm.lv=cs_hpe-ml350gen9-01/root rd.lvm.lv=cs_hpe-ml350gen9-01/swap console=ttyS1,115200n81 crashkernel=1G-2G:384M,2G-3G:512M,3G-4G:768M,4G-16G:1G,16G-64G:2G,64G-128G:2G,128G-:4G
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/vmlinuz-5.14.0-246.rt14.245.1955_759844798.el9.x86_64+debug", will be passed to user space.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: software IO TLB: area num 128.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Memory: 1868332K/50202764K available (16390K kernel code, 6601K rwdata, 10528K rodata, 4644K init, 61648K bss, 3501016K reserved, 0K cma-reserved)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: random: get_random_u64 called from kmem_cache_open+0x22/0x300 with crng_init=0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=72, Nodes=2
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: kmemleak: Kernel memory leak detector disabled
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Kernel/User page tables isolation: enabled
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ftrace: allocating 45129 entries in 177 pages
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ftrace: allocated 177 pages with 4 groups
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Running RCU self tests
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ODEBUG: Out of memory. ODEBUG disabled
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: rcu: Preemptible hierarchical RCU implementation.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: rcu: RCU lockdep checking is enabled.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: rcu: RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=72.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: rcu: RCU priority boosting: priority 1 delay 500 ms.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: rcu: RCU callback double-/use-after-free debug is enabled.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: rcu: RCU_SOFTIRQ processing moved to rcuc kthreads.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: No expedited grace period (rcu_normal_after_boot).
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Trampoline variant of Tasks RCU enabled.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Rude variant of Tasks RCU enabled.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Tracing variant of Tasks RCU enabled.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=72
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: NR_IRQS: 524544, nr_irqs: 1816, preallocated irqs: 16
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: random: crng init done (trusting CPU's manufacturer)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Console: colour VGA+ 80x25
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: printk: console [ttyS1] enabled
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Lock dependency validator: Copyright (c) 2006 Red Hat, Inc., Ingo Molnar
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ... MAX_LOCKDEP_SUBCLASSES: 8
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ... MAX_LOCK_DEPTH: 48
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ... MAX_LOCKDEP_KEYS: 8192
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ... CLASSHASH_SIZE: 4096
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ... MAX_LOCKDEP_ENTRIES: 131072
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ... MAX_LOCKDEP_CHAINS: 524288
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ... CHAINHASH_SIZE: 262144
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: memory used by lock dependency info: 33041 kB
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: memory used for stack traces: 4224 kB
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: per task-struct memory footprint: 2688 bytes
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Core revision 20211217
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: APIC: Switch to symmetric I/O mode setup
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMAR: Host address width 46
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMAR: DRHD base: 0x000000fbffc000 flags: 0x0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMAR: dmar0: reg_base_addr fbffc000 ver 1:0 cap d2078c106f0466 ecap f020df
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMAR: DRHD base: 0x000000c7ffc000 flags: 0x1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMAR: dmar1: reg_base_addr c7ffc000 ver 1:0 cap d2078c106f0466 ecap f020df
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMAR: RMRR base: 0x00000079172000 end: 0x00000079174fff
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMAR: RMRR base: 0x000000791f6000 end: 0x000000791f9fff
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMAR: RMRR base: 0x000000791e0000 end: 0x000000791f5fff
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMAR: RMRR base: 0x000000791cd000 end: 0x000000791ddfff
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMAR: RMRR base: 0x000000791de000 end: 0x000000791dffff
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMAR: RMRR base: 0x0000005a651000 end: 0x0000005a6c0fff
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMAR-IR: IOAPIC id 10 under DRHD base 0xfbffc000 IOMMU 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMAR-IR: IOAPIC id 8 under DRHD base 0xc7ffc000 IOMMU 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMAR-IR: IOAPIC id 9 under DRHD base 0xc7ffc000 IOMMU 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMAR-IR: HPET id 0 under DRHD base 0xc7ffc000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMAR-IR: Queued invalidation will be enabled to support x2apic and Intr-remapping.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMAR-IR: Enabled IRQ remapping in x2apic mode
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: x2apic enabled
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Switched APIC routing to cluster x2apic.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tsc: PIT calibration matches HPET. 1 loops
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tsc: Detected 2297.349 MHz processor
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x211d6ce5cb2, max_idle_ns: 440795262002 ns
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 4594.69 BogoMIPS (lpj=2297349)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pid_max: default: 73728 minimum: 576
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: LSM: Security Framework initializing
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Yama: becoming mindful.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SELinux: Initializing.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: LSM support for eBPF active
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Stack Depot allocating hash table with kvcalloc
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Dentry cache hash table entries: 8388608 (order: 14, 67108864 bytes, vmalloc hugepage)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Inode-cache hash table entries: 4194304 (order: 13, 33554432 bytes, vmalloc hugepage)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Mount-cache hash table entries: 131072 (order: 8, 1048576 bytes, vmalloc)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Mountpoint-cache hash table entries: 131072 (order: 8, 1048576 bytes, vmalloc)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: CPU0: Thermal monitoring enabled (TM1)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: process: using mwait in idle threads
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 0 node 0: mask now 0
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Last level iTLB entries: 4KB 1024, 2MB 1024, 4MB 1024
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Last level dTLB entries: 4KB 1024, 2MB 1024, 4MB 1024, 1GB 4
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Spectre V2 : Mitigation: Retpolines
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Spectre V2 : User space: Mitigation: STIBP via prctl
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: MDS: Mitigation: Clear CPU buffers
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: MMIO Stale Data: Mitigation: Clear CPU buffers
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Freeing SMP alternatives memory: 32K
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: smpboot: Estimated ratio of average max frequency by base frequency (times 1024): 1469
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: smpboot: CPU0: Intel(R) Xeon(R) CPU E5-2699 v3 @ 2.30GHz (family: 0x6, model: 0x3f, stepping: 0x2)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: cblist_init_generic: Setting adjustable number of callback queues.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: cblist_init_generic: Setting shift to 7 and lim to 1.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: cblist_init_generic: Setting shift to 7 and lim to 1.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: cblist_init_generic: Setting shift to 7 and lim to 1.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Running RCU-tasks wait API self tests
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Performance Events: PEBS fmt2+, Haswell events, 16-deep LBR, full-width counters, Broken BIOS detected, complain to your hardware vendor.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: [Firmware Bug]: the BIOS has corrupted hw-PMU resources (MSR 38d is 330)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Intel PMU driver.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ... version: 3
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ... bit width: 48
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ... generic registers: 4
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ... value mask: 0000ffffffffffff
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ... max period: 00007fffffffffff
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ... fixed-purpose events: 3
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ... event mask: 000000070000000f
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Callback from call_rcu_tasks_trace() invoked.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: rcu: Hierarchical SRCU implementation.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: rcu: Max phase no-delay instances is 400.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: printk: console [ttyS1] printing thread started
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: NMI watchdog: Enabled. Permanently consumes one hw-PMU counter.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: smp: Bringing up secondary CPUs ...
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: x86: Booting SMP configuration:
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: .... node #0, CPUs: #1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 1 node 0: mask now 0-1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #2
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 2 node 0: mask now 0-2
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #3
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 3 node 0: mask now 0-3
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Callback from call_rcu_tasks() invoked.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Callback from call_rcu_tasks_rude() invoked.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #4
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 4 node 0: mask now 0-4
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #5
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 5 node 0: mask now 0-5
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #6
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 6 node 0: mask now 0-6
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #7
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 7 node 0: mask now 0-7
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #8
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 8 node 0: mask now 0-8
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #9
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 9 node 0: mask now 0-9
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #10
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 10 node 0: mask now 0-10
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #11
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 11 node 0: mask now 0-11
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #12
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 12 node 0: mask now 0-12
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #13
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 13 node 0: mask now 0-13
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #14
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 14 node 0: mask now 0-14
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #15
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 15 node 0: mask now 0-15
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #16
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 16 node 0: mask now 0-16
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #17
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 17 node 0: mask now 0-17
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel:
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: .... node #1, CPUs: #18
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 18 node 1: mask now 18
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: smpboot: CPU 18 Converting physical 0 to logical die 1
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #19
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 19 node 1: mask now 18-19
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #20
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 20 node 1: mask now 18-20
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #21
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 21 node 1: mask now 18-21
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #22
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 22 node 1: mask now 18-22
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #23
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 23 node 1: mask now 18-23
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #24
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 24 node 1: mask now 18-24
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #25
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 25 node 1: mask now 18-25
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #26
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 26 node 1: mask now 18-26
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #27
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 27 node 1: mask now 18-27
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #28
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 28 node 1: mask now 18-28
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #29
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 29 node 1: mask now 18-29
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #30
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 30 node 1: mask now 18-30
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #31
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 31 node 1: mask now 18-31
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #32
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 32 node 1: mask now 18-32
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #33
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 33 node 1: mask now 18-33
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #34
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 34 node 1: mask now 18-34
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #35
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 35 node 1: mask now 18-35
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel:
Fri 2023-01-27 11:11:42 EST
hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: .... node #0, CPUs: #36 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 36 node 0: mask now 0-17,36 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: MDS CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/mds.html for more details. Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details. Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #37 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 37 node 0: mask now 0-17,36-37 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #38 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 38 node 0: mask now 0-17,36-38 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #39 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 39 node 0: mask now 0-17,36-39 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #40 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 40 node 0: mask now 0-17,36-40 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #41 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 41 node 0: mask now 0-17,36-41 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #42 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 42 node 0: mask now 0-17,36-42 Fri 2023-01-27 11:11:42 EST 
hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #43 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 43 node 0: mask now 0-17,36-43 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #44 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 44 node 0: mask now 0-17,36-44 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #45 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 45 node 0: mask now 0-17,36-45 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #46 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 46 node 0: mask now 0-17,36-46 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #47 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 47 node 0: mask now 0-17,36-47 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #48 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 48 node 0: mask now 0-17,36-48 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #49 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 49 node 0: mask now 0-17,36-49 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #50 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 50 node 0: mask now 0-17,36-50 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #51 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 51 node 0: mask now 0-17,36-51 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #52 Fri 2023-01-27 
11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 52 node 0: mask now 0-17,36-52 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #53 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 53 node 0: mask now 0-17,36-53 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: .... node #1, CPUs: #54 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 54 node 1: mask now 18-35,54 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #55 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 55 node 1: mask now 18-35,54-55 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #56 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 56 node 1: mask now 18-35,54-56 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #57 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 57 node 1: mask now 18-35,54-57 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #58 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 58 node 1: mask now 18-35,54-58 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #59 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 59 node 1: mask now 18-35,54-59 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #60 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 60 node 1: mask now 18-35,54-60 Fri 2023-01-27 11:11:42 EST 
hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #61 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 61 node 1: mask now 18-35,54-61 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #62 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 62 node 1: mask now 18-35,54-62 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #63 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 63 node 1: mask now 18-35,54-63 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #64 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 64 node 1: mask now 18-35,54-64 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #65 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 65 node 1: mask now 18-35,54-65 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #66 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 66 node 1: mask now 18-35,54-66 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #67 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 67 node 1: mask now 18-35,54-67 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #68 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 68 node 1: mask now 18-35,54-68 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #69 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 69 node 1: mask now 18-35,54-69 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #70 Fri 
2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 70 node 1: mask now 18-35,54-70 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: #71 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: numa_add_cpu cpu 71 node 1: mask now 18-35,54-71 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: smp: Brought up 2 nodes, 72 CPUs Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: smpboot: Max logical packages: 2 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: smpboot: Total of 72 processors activated (331965.36 BogoMIPS) Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: node 1 deferred pages initialised in 163ms Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pgdatinit1 (603) used greatest stack depth: 13840 bytes left Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: node 0 deferred pages initialised in 188ms Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pgdatinit0 (602) used greatest stack depth: 13328 bytes left Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: devtmpfs: initialized Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: x86/mm: Memory block size: 128MB Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: PM: Registering ACPI NVS region [mem 0x791ff000-0x7b5fefff] (37748736 bytes) Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMA-API: preallocated 65536 debug entries Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMA-API: debugging enabled by kernel config Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: clocksource: jiffies: mask: 0xffffffff 
max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: futex hash table entries: 32768 (order: 10, 6291456 bytes, vmalloc hugepage) Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: prandom: seed boundary self test passed Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: prandom: 100 self tests passed Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: prandom32: self test passed (less than 6 bits correlated) Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pinctrl core: initialized pinctrl subsystem Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ************************************************************* Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ** NOTICE NOTICE NOTICE NOTICE NOTICE NOTICE NOTICE ** Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ** ** Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ** IOMMU DebugFS SUPPORT HAS BEEN ENABLED IN THIS KERNEL ** Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ** ** Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ** This means that this kernel is built to expose internal ** Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ** IOMMU data structures, which may compromise security on ** Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ** your system. 
** Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ** ** Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ** If you see this message and you are not debugging the ** Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ** kernel, report this immediately to your vendor! ** Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ** ** Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ** NOTICE NOTICE NOTICE NOTICE NOTICE NOTICE NOTICE ** Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ************************************************************* Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PM: RTC time: 16:11:20, date: 2023-01-27 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMA: preallocated 4096 KiB GFP_KERNEL pool for atomic allocations Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMA: preallocated 4096 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: DMA: preallocated 4096 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: audit: initializing netlink subsys (disabled) Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: audit: type=2000 audit(1674835871.495:1): state=initialized audit_enabled=0 res=1 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: thermal_sys: Registered thermal governor 'fair_share' Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: thermal_sys: Registered thermal governor 
'step_wise' Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: thermal_sys: Registered thermal governor 'user_space' Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: cpuidle: using governor menu Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Detected 1 PCC Subspaces Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Registering PCC driver as Mailbox controller Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: HugeTLB: can optimize 4095 vmemmap pages for hugepages-1048576kB Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI FADT declares the system doesn't support PCIe ASPM, so disable it Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0x80000000-0x8fffffff] (base 0x80000000) Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PCI: MMCONFIG at [mem 0x80000000-0x8fffffff] reserved in E820 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PCI: Using configuration type 1 for base access Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: core: PMU erratum BJ122, BV98, HSD29 worked around, HT is on Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: HugeTLB: can optimize 7 vmemmap pages for hugepages-2048kB
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: cryptd: max_cpu_qlen set to 1000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Added _OSI(Module Device)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Added _OSI(Processor Device)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Added _OSI(Processor Aggregator Device)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Added _OSI(Linux-Dell-Video)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: 5 ACPI AML tables successfully acquired and loaded
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Interpreter enabled
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: PM: (supports S0 S5)
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Using IOAPIC for interrupt routing
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: HEST: Table parsing has been initialized.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PCI: Using E820 reservations for host bridge windows
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: Enabled 10 GPEs in block 00 to 3F
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: PCI Root Bridge [UNC0] (domain 0000 [bus 7f])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PCI host bridge to bus 0000:7f
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:7f: Unknown NUMA node; performance will be reduced
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:7f: root bus resource [bus 7f]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:08.0: [8086:2f80] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:08.3: [8086:2f83] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:09.0: [8086:2f90] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:09.3: [8086:2f93] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0b.0: [8086:2f81] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0b.1: [8086:2f36] type 00 class 0x110100
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0b.2: [8086:2f37] type 00 class 0x110100
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0c.0: [8086:2fe0] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0c.1: [8086:2fe1] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0c.2: [8086:2fe2] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0c.3: [8086:2fe3] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0c.4: [8086:2fe4] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0c.5: [8086:2fe5] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0c.6: [8086:2fe6] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0c.7: [8086:2fe7] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0d.0: [8086:2fe8] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0d.1: [8086:2fe9] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0d.2: [8086:2fea] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0d.3: [8086:2feb] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0d.4: [8086:2fec] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0d.5: [8086:2fed] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0d.6: [8086:2fee] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0d.7: [8086:2fef] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0e.0: [8086:2ff0] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0e.1: [8086:2ff1] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0f.0: [8086:2ff8] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0f.1: [8086:2ff9] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0f.2: [8086:2ffa] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0f.3: [8086:2ffb] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0f.4: [8086:2ffc] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0f.5: [8086:2ffd] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:0f.6: [8086:2ffe] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:10.0: [8086:2f1d] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:10.1: [8086:2f34] type 00 class 0x110100
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:10.5: [8086:2f1e] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:10.6: [8086:2f7d] type 00 class 0x110100
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:10.7: [8086:2f1f] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:12.0: [8086:2fa0] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:12.1: [8086:2f30] type 00 class 0x110100
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:12.2: [8086:2f70] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:12.4: [8086:2f60] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:12.5: [8086:2f38] type 00 class 0x110100
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:12.6: [8086:2f78] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:13.0: [8086:2fa8] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:13.1: [8086:2f71] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:13.2: [8086:2faa] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:13.3: [8086:2fab] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:13.6: [8086:2fae] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:13.7: [8086:2faf] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:14.0: [8086:2fb0] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:14.1: [8086:2fb1] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:14.2: [8086:2fb2] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:14.3: [8086:2fb3] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:14.4: [8086:2fbc] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:14.5: [8086:2fbd] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:14.6: [8086:2fbe] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:14.7: [8086:2fbf] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:16.0: [8086:2f68] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:16.1: [8086:2f79] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:16.2: [8086:2f6a] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:16.3: [8086:2f6b] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:16.6: [8086:2f6e] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:16.7: [8086:2f6f] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:17.0: [8086:2fd0] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:17.1: [8086:2fd1] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:17.2: [8086:2fd2] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:17.3: [8086:2fd3] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:17.4: [8086:2fb8] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:17.5: [8086:2fb9] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:17.6: [8086:2fba] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:17.7: [8086:2fbb] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:1e.0: [8086:2f98] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:1e.1: [8086:2f99] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:1e.2: [8086:2f9a] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:1e.3: [8086:2fc0] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:1e.4: [8086:2f9c] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:1f.0: [8086:2f88] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:7f:1f.2: [8086:2f8a] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: PCI Root Bridge [UNC1] (domain 0000 [bus ff])
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: acpi PNP0A03:01: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PCI host bridge to bus 0000:ff
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:ff: Unknown NUMA node; performance will be reduced
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:ff: root bus resource [bus ff]
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:08.0: [8086:2f80] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:08.3: [8086:2f83] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:09.0: [8086:2f90] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:09.3: [8086:2f93] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0b.0: [8086:2f81] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0b.1: [8086:2f36] type 00 class 0x110100
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0b.2: [8086:2f37] type 00 class 0x110100
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0c.0: [8086:2fe0] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0c.1: [8086:2fe1] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0c.2: [8086:2fe2] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0c.3: [8086:2fe3] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0c.4: [8086:2fe4] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0c.5: [8086:2fe5] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0c.6: [8086:2fe6] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0c.7: [8086:2fe7] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0d.0: [8086:2fe8] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0d.1: [8086:2fe9] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0d.2: [8086:2fea] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0d.3: [8086:2feb] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0d.4: [8086:2fec] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0d.5: [8086:2fed] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0d.6: [8086:2fee] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0d.7: [8086:2fef] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0e.0: [8086:2ff0] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0e.1: [8086:2ff1] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0f.0: [8086:2ff8] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0f.1: [8086:2ff9] type 00 class 0x088000
Fri 2023-01-27 11:11:42 EST
hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0f.2: [8086:2ffa] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0f.3: [8086:2ffb] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0f.4: [8086:2ffc] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0f.5: [8086:2ffd] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:0f.6: [8086:2ffe] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:10.0: [8086:2f1d] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:10.1: [8086:2f34] type 00 class 0x110100 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:10.5: [8086:2f1e] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:10.6: [8086:2f7d] type 00 class 0x110100 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:10.7: [8086:2f1f] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:12.0: [8086:2fa0] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:12.1: [8086:2f30] type 00 class 0x110100 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:12.2: [8086:2f70] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:12.4: [8086:2f60] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:12.5: [8086:2f38] type 00 class 0x110100 Fri 2023-01-27 
11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:12.6: [8086:2f78] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:13.0: [8086:2fa8] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:13.1: [8086:2f71] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:13.2: [8086:2faa] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:13.3: [8086:2fab] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:13.6: [8086:2fae] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:13.7: [8086:2faf] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:14.0: [8086:2fb0] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:14.1: [8086:2fb1] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:14.2: [8086:2fb2] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:14.3: [8086:2fb3] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:14.4: [8086:2fbc] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:14.5: [8086:2fbd] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:14.6: [8086:2fbe] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:14.7: [8086:2fbf] type 00 class 0x088000 Fri 
2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:16.0: [8086:2f68] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:16.1: [8086:2f79] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:16.2: [8086:2f6a] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:16.3: [8086:2f6b] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:16.6: [8086:2f6e] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:16.7: [8086:2f6f] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:17.0: [8086:2fd0] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:17.1: [8086:2fd1] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:17.2: [8086:2fd2] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:17.3: [8086:2fd3] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:17.4: [8086:2fb8] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:17.5: [8086:2fb9] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:17.6: [8086:2fba] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:17.7: [8086:2fbb] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:1e.0: [8086:2f98] type 00 class 
0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:1e.1: [8086:2f99] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:1e.2: [8086:2f9a] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:1e.3: [8086:2fc0] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:1e.4: [8086:2f9c] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:1f.0: [8086:2f88] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:ff:1f.2: [8086:2f8a] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7e]) Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: acpi PNP0A08:00: _OSC: platform does not support [SHPCHotplug AER LTR DPC] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: acpi PNP0A08:00: FADT indicates ASPM is unsupported, using BIOS configuration Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: acpi PNP0A08:00: ignoring host bridge window [mem 0x000c4000-0x000cbfff window] (conflicts with Video ROM [mem 0x000c0000-0x000c7fff]) Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PCI host bridge to bus 0000:00 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: 
pci_bus 0000:00: root bus resource [io 0x0000-0x03af window] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:00: root bus resource [io 0x03e0-0x0cf7 window] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:00: root bus resource [io 0x03b0-0x03bb window] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:00: root bus resource [io 0x03c0-0x03df window] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:00: root bus resource [io 0x1000-0x9fff window] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:00: root bus resource [mem 0x90000000-0xc7ffbfff window] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:00: root bus resource [mem 0x38000000000-0x39fffffffff window] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:00: root bus resource [bus 00-7e] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:00.0: [8086:2f00] type 00 class 0x060000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:01.0: [8086:2f02] type 01 class 0x060400 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:01.0: PME# supported from D0 D3hot D3cold Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:01.1: [8086:2f03] type 01 class 0x060400 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:01.1: PME# supported from D0 D3hot D3cold Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:02.0: [8086:2f04] 
type 01 class 0x060400 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:02.0: PME# supported from D0 D3hot D3cold Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:02.1: [8086:2f05] type 01 class 0x060400 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:02.1: PME# supported from D0 D3hot D3cold Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:02.2: [8086:2f06] type 01 class 0x060400 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:02.2: PME# supported from D0 D3hot D3cold Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:02.3: [8086:2f07] type 01 class 0x060400 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:02.3: PME# supported from D0 D3hot D3cold Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:03.0: [8086:2f08] type 01 class 0x060400 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:03.0: PME# supported from D0 D3hot D3cold Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:03.1: [8086:2f09] type 01 class 0x060400 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:03.1: PME# supported from D0 D3hot D3cold Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:03.2: [8086:2f0a] type 01 class 0x060400 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:03.2: PME# supported from D0 D3hot D3cold Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:03.3: [8086:2f0b] type 01 class 0x060400 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 
0000:00:03.3: PME# supported from D0 D3hot D3cold Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:04.0: [8086:2f20] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:04.0: reg 0x10: [mem 0x39ffff2c000-0x39ffff2ffff 64bit] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:04.1: [8086:2f21] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:04.1: reg 0x10: [mem 0x39ffff28000-0x39ffff2bfff 64bit] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:04.2: [8086:2f22] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:04.2: reg 0x10: [mem 0x39ffff24000-0x39ffff27fff 64bit] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:04.3: [8086:2f23] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:04.3: reg 0x10: [mem 0x39ffff20000-0x39ffff23fff 64bit] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:04.4: [8086:2f24] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:04.4: reg 0x10: [mem 0x39ffff1c000-0x39ffff1ffff 64bit] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:04.5: [8086:2f25] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:04.5: reg 0x10: [mem 0x39ffff18000-0x39ffff1bfff 64bit] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:04.6: [8086:2f26] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:04.6: reg 0x10: [mem 
0x39ffff14000-0x39ffff17fff 64bit] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:04.7: [8086:2f27] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:04.7: reg 0x10: [mem 0x39ffff10000-0x39ffff13fff 64bit] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:05.0: [8086:2f28] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:05.1: [8086:2f29] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:05.2: [8086:2f2a] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:05.4: [8086:2f2c] type 00 class 0x080020 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:05.4: reg 0x10: [mem 0x92e03000-0x92e03fff] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:11.0: [8086:8d7c] type 00 class 0xff0000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:14.0: [8086:8d31] type 00 class 0x0c0330 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:14.0: reg 0x10: [mem 0x39ffff00000-0x39ffff0ffff 64bit] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:14.0: PME# supported from D3hot D3cold Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1a.0: [8086:8d2d] type 00 class 0x0c0320 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1a.0: reg 0x10: [mem 0x92e01000-0x92e013ff] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1a.0: PME# supported from D0 D3hot D3cold Fri 2023-01-27 11:11:42 EST 
hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.0: [8086:8d10] type 01 class 0x060401 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.0: PME# supported from D0 D3hot D3cold Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.2: [8086:8d14] type 01 class 0x060400 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.2: PME# supported from D0 D3hot D3cold Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.4: [8086:8d18] type 01 class 0x060400 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.4: PME# supported from D0 D3hot D3cold Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.6: [8086:8d1c] type 01 class 0x060400 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.6: PME# supported from D0 D3hot D3cold Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.7: [8086:8d1e] type 01 class 0x060400 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.7: PME# supported from D0 D3hot D3cold Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1d.0: [8086:8d26] type 00 class 0x0c0320 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1d.0: reg 0x10: [mem 0x92e00000-0x92e003ff] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1d.0: PME# supported from D0 D3hot D3cold Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1f.0: [8086:8d44] type 00 class 0x060100 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1f.3: [8086:8d22] type 00 class 0x0c0500 Fri 
2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1f.3: reg 0x10: [mem 0x39ffff31000-0x39ffff310ff 64bit] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1f.3: reg 0x20: [io 0x3000-0x301f] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:01.0: PCI bridge to [bus 07] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:01.1: PCI bridge to [bus 0d] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:02.0: PCI bridge to [bus 04] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:02.1: PCI bridge to [bus 10] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:03:00.0: [103c:3239] type 00 class 0x010400 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:03:00.0: reg 0x10: [mem 0x92c00000-0x92cfffff 64bit] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:03:00.0: reg 0x18: [mem 0x92d00000-0x92d003ff 64bit] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:03:00.0: reg 0x20: [io 0x2000-0x20ff] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:03:00.0: reg 0x30: [mem 0x00000000-0x0007ffff pref] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:03:00.0: PME# supported from D0 D1 D3hot Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:03:00.0: 32.000 Gb/s available PCIe bandwidth, limited by 5.0 GT/s PCIe x8 link at 0000:00:02.2 (capable of 63.008 Gb/s with 8.0 GT/s PCIe x8 link) Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Fri 2023-01-27 11:11:42 EST 
hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:02.2: bridge window [io 0x2000-0x2fff] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:02.2: bridge window [mem 0x92c00000-0x92dfffff] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:02.3: PCI bridge to [bus 11] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.0: PCI bridge to [bus 15] (subtractive decode) Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.0: bridge window [io 0x0000-0x03af window] (subtractive decode) Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.0: bridge window [io 0x03e0-0x0cf7 window] (subtractive decode) Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.0: bridge window [io 0x03b0-0x03bb window] (subtractive decode) Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.0: bridge window [io 0x03c0-0x03df window] (subtractive decode) Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.0: bridge window [io 0x1000-0x9fff window] (subtractive decode) Fri 2023-01-27 11:11:42 EST 
hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.0: bridge window [mem 0x90000000-0xc7ffbfff window] (subtractive decode) Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.0: bridge window [mem 0x38000000000-0x39fffffffff window] (subtractive decode) Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.0: [103c:3306] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.0: reg 0x10: [io 0x1200-0x12ff] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.0: reg 0x14: [mem 0x92a8d000-0x92a8d1ff] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.0: reg 0x18: [io 0x1100-0x11ff] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.1: [102b:0533] type 00 class 0x030000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.1: reg 0x10: [mem 0x91000000-0x91ffffff pref] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.1: reg 0x14: [mem 0x92a88000-0x92a8bfff] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.1: reg 0x18: [mem 0x92000000-0x927fffff] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.1: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.2: [103c:3307] type 00 class 0x088000 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.2: reg 0x10: [io 0x1000-0x10ff] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.2: reg 0x14: [mem 0x92a8c000-0x92a8c0ff] Fri 2023-01-27 11:11:42 EST 
hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.2: reg 0x18: [mem 0x92900000-0x929fffff] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.2: reg 0x1c: [mem 0x92a00000-0x92a7ffff] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.2: reg 0x20: [mem 0x92a80000-0x92a87fff] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.2: reg 0x24: [mem 0x92800000-0x928fffff] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.2: reg 0x30: [mem 0x00000000-0x0000ffff pref] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.2: PME# supported from D0 D3hot D3cold Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.4: [103c:3300] type 00 class 0x0c0300 Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.4: reg 0x20: [io 0x1300-0x131f] Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.2: PCI bridge to [bus 01] Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.2: bridge window [io 0x1000-0x1fff] Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.2: bridge window [mem 0x90000000-0x92afffff] Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.0: [14e4:1657] type 00 class 0x020000 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.0: reg 0x10: [mem 0x92b90000-0x92b9ffff 64bit pref] Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.0: reg 0x18: [mem 0x92ba0000-0x92baffff 64bit pref] Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.0: reg 0x20: [mem 
0x92bb0000-0x92bbffff 64bit pref]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.0: reg 0x30: [mem 0x00000000-0x0003ffff pref]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.0: PME# supported from D0 D3hot D3cold
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.0: 8.000 Gb/s available PCIe bandwidth, limited by 5.0 GT/s PCIe x2 link at 0000:00:1c.4 (capable of 16.000 Gb/s with 5.0 GT/s PCIe x4 link)
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.1: [14e4:1657] type 00 class 0x020000
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.1: reg 0x10: [mem 0x92b60000-0x92b6ffff 64bit pref]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.1: reg 0x18: [mem 0x92b70000-0x92b7ffff 64bit pref]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.1: reg 0x20: [mem 0x92b80000-0x92b8ffff 64bit pref]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.1: reg 0x30: [mem 0x00000000-0x0003ffff pref]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.1: PME# supported from D0 D3hot D3cold
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.2: [14e4:1657] type 00 class 0x020000
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.2: reg 0x10: [mem 0x92b30000-0x92b3ffff 64bit pref]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.2: reg 0x18: [mem 0x92b40000-0x92b4ffff 64bit pref]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.2: reg 0x20: [mem 0x92b50000-0x92b5ffff 64bit pref]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.2: reg 0x30: [mem 0x00000000-0x0003ffff pref]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.2: PME# supported from D0 D3hot D3cold
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.3: [14e4:1657] type 00 class 0x020000
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.3: reg 0x10: [mem 0x92b00000-0x92b0ffff 64bit pref]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.3: reg 0x18: [mem 0x92b10000-0x92b1ffff 64bit pref]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.3: reg 0x20: [mem 0x92b20000-0x92b2ffff 64bit pref]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.3: reg 0x30: [mem 0x00000000-0x0003ffff pref]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.3: PME# supported from D0 D3hot D3cold
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.4: PCI bridge to [bus 02]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.4: bridge window [mem 0x92b00000-0x92bfffff 64bit pref]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.6: PCI bridge to [bus 16]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.7: PCI bridge to [bus 17]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:00: on NUMA node 0
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 11
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 9
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 5
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 0
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: PCI: Interrupt link LNKE disabled
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 0
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: PCI: Interrupt link LNKF disabled
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 0
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: PCI: Interrupt link LNKG disabled
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 0
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: PCI: Interrupt link LNKH disabled
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: PCI Root Bridge [PCI1] (domain 0000 [bus 80-fe])
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: acpi PNP0A08:01: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: acpi PNP0A08:01: _OSC: platform does not support [SHPCHotplug AER LTR DPC]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: acpi PNP0A08:01: _OSC: OS now controls [PCIeHotplug PME PCIeCapability]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: acpi PNP0A08:01: FADT indicates ASPM is unsupported, using BIOS configuration
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PCI host bridge to bus 0000:80
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:80: root bus resource [io 0xa000-0xffff window]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:80: root bus resource [mem 0xc8000000-0xfbffbfff window]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:80: root bus resource [mem 0x3a000000000-0x3bfffffffff window]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:80: root bus resource [bus 80-fe]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:00.0: [8086:2f01] type 01 class 0x060400
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:00.0: PME# supported from D0 D3hot D3cold
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:01.0: [8086:2f02] type 01 class 0x060400
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:01.0: PME# supported from D0 D3hot D3cold
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:01.1: [8086:2f03] type 01 class 0x060400
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:01.1: PME# supported from D0 D3hot D3cold
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:02.0: [8086:2f04] type 01 class 0x060400
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:02.0: PME# supported from D0 D3hot D3cold
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:02.1: [8086:2f05] type 01 class 0x060400
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:02.1: PME# supported from D0 D3hot D3cold
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:02.2: [8086:2f06] type 01 class 0x060400
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:02.2: PME# supported from D0 D3hot D3cold
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:02.3: [8086:2f07] type 01 class 0x060400
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:02.3: PME# supported from D0 D3hot D3cold
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:03.0: [8086:2f08] type 01 class 0x060400
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:03.0: PME# supported from D0 D3hot D3cold
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:03.1: [8086:2f09] type 01 class 0x060400
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:03.1: PME# supported from D0 D3hot D3cold
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:03.2: [8086:2f0a] type 01 class 0x060400
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:03.2: PME# supported from D0 D3hot D3cold
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:03.3: [8086:2f0b] type 01 class 0x060400
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:03.3: PME# supported from D0 D3hot D3cold
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:04.0: [8086:2f20] type 00 class 0x088000
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:04.0: reg 0x10: [mem 0x3bffff1c000-0x3bffff1ffff 64bit]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:04.1: [8086:2f21] type 00 class 0x088000
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:04.1: reg 0x10: [mem 0x3bffff18000-0x3bffff1bfff 64bit]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:04.2: [8086:2f22] type 00 class 0x088000
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:04.2: reg 0x10: [mem 0x3bffff14000-0x3bffff17fff 64bit]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:04.3: [8086:2f23] type 00 class 0x088000
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:04.3: reg 0x10: [mem 0x3bffff10000-0x3bffff13fff 64bit]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:04.4: [8086:2f24] type 00 class 0x088000
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:04.4: reg 0x10: [mem 0x3bffff0c000-0x3bffff0ffff 64bit]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:04.5: [8086:2f25] type 00 class 0x088000
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:04.5: reg 0x10: [mem 0x3bffff08000-0x3bffff0bfff 64bit]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:04.6: [8086:2f26] type 00 class 0x088000
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:04.6: reg 0x10: [mem 0x3bffff04000-0x3bffff07fff 64bit]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:04.7: [8086:2f27] type 00 class 0x088000
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:04.7: reg 0x10: [mem 0x3bffff00000-0x3bffff03fff 64bit]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:05.0: [8086:2f28] type 00 class 0x088000
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:05.1: [8086:2f29] type 00 class 0x088000
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:05.2: [8086:2f2a] type 00 class 0x088000
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:05.4: [8086:2f2c] type 00 class 0x080020
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:05.4: reg 0x10: [mem 0xc8000000-0xc8000fff]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:00.0: PCI bridge to [bus 81]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:01.0: PCI bridge to [bus 8d]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:01.1: PCI bridge to [bus 87]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:02.0: PCI bridge to [bus 84]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:02.1: PCI bridge to [bus 90]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:02.2: PCI bridge to [bus 91]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:02.3: PCI bridge to [bus 92]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:03.0: PCI bridge to [bus 8a]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:03.1: PCI bridge to [bus 93]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:03.2: PCI bridge to [bus 94]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:03.3: PCI bridge to [bus 95]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:80: on NUMA node 1
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: iommu: Default domain type: Translated
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SCSI subsystem initialized
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: bus type USB registered
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usbcore: registered new interface driver usbfs
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usbcore: registered new interface driver hub
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usbcore: registered new device driver usb
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pps_core: LinuxPPS API ver. 1 registered
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PTP clock support registered
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC MC: Ver: 3.0.0
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: edac_mc_sysfs_init: device mc created
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: NetLabel: Initializing
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: NetLabel: domain hash size = 128
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: NetLabel: protocols = UNLABELED CIPSOv4 CALIPSO
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: NetLabel: unlabeled traffic allowed by default
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PCI: Using ACPI for IRQ routing
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PCI: pci_cache_line_size set to 64 bytes
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: e820: reserve RAM buffer [mem 0x00094000-0x0009ffff]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: e820: reserve RAM buffer [mem 0x5a651000-0x5bffffff]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: e820: reserve RAM buffer [mem 0x790ff000-0x7bffffff]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: e820: reserve RAM buffer [mem 0x7b800000-0x7bffffff]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.1: vgaarb: setting as boot VGA device
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.1: vgaarb: bridge control possible
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.1: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: vgaarb: loaded
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hpet0: 8 comparators, 64-bit 14.318180 MHz counter
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: clocksource: Switched to clocksource tsc-early
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: VFS: Disk quotas dquot_6.6.0
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pnp: PnP ACPI init
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: system 00:01: [io 0x0500-0x053f] has been reserved
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: system 00:01: [io 0x0400-0x047f] has been reserved
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: system 00:01: [io 0x0540-0x057f] has been reserved
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: system 00:01: [io 0x0600-0x061f] has been reserved
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: system 00:01: [io 0x0880-0x0883] has been reserved
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: system 00:01: [io 0x0800-0x081f] has been reserved
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: system 00:01: [mem 0xfed1c000-0xfed3ffff] has been reserved
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: system 00:01: [mem 0xfed45000-0xfed8bfff] has been reserved
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: system 00:01: [mem 0xff000000-0xffffffff] could not be reserved
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: system 00:01: [mem 0xfee00000-0xfeefffff] has been reserved
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: system 00:01: [mem 0xfed12000-0xfed1200f] has been reserved
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: system 00:01: [mem 0xfed12010-0xfed1201f] has been reserved
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: system 00:01: [mem 0xfed1b000-0xfed1bfff] has been reserved
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pnp: PnP ACPI: found 4 devices
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: NET: Registered PF_INET protocol family
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, vmalloc)
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tcp_listen_portaddr_hash hash table entries: 32768 (order: 10, 4980736 bytes, vmalloc hugepage)
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, vmalloc)
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: TCP established hash table entries: 524288 (order: 10, 4194304 bytes, vmalloc hugepage)
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: TCP bind hash table entries: 65536 (order: 11, 9961472 bytes, vmalloc hugepage)
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: TCP: Hash tables configured (established 524288 bind 65536)
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: MPTCP token hash table entries: 65536 (order: 11, 11010048 bytes, vmalloc hugepage)
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: UDP hash table entries: 32768 (order: 11, 10485760 bytes, vmalloc hugepage)
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: UDP-Lite hash table entries: 32768 (order: 11, 10485760 bytes, vmalloc hugepage)
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: NET: Registered PF_XDP protocol family
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.4: BAR 14: assigned [mem 0x92f00000-0x92ffffff]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:01.0: PCI bridge to [bus 07]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:01.1: PCI bridge to [bus 0d]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:02.0: PCI bridge to [bus 04]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:02.1: PCI bridge to [bus 10]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:03:00.0: BAR 6: assigned [mem 0x92d80000-0x92dfffff pref]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:02.2: bridge window [io 0x2000-0x2fff]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:02.2: bridge window [mem 0x92c00000-0x92dfffff]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:02.3: PCI bridge to [bus 11]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:03.1: PCI bridge to [bus 12]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:03.2: PCI bridge to [bus 13]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:03.3: PCI bridge to [bus 14]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.0: PCI bridge to [bus 15]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.2: BAR 6: assigned [mem 0x90000000-0x9000ffff pref]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.2: PCI bridge to [bus 01]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.2: bridge window [io 0x1000-0x1fff]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.2: bridge window [mem 0x90000000-0x92afffff]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.0: BAR 6: assigned [mem 0x92f00000-0x92f3ffff pref]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.1: BAR 6: assigned [mem 0x92f40000-0x92f7ffff pref]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.2: BAR 6: assigned [mem 0x92f80000-0x92fbffff pref]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:02:00.3: BAR 6: assigned [mem 0x92fc0000-0x92ffffff pref]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.4: PCI bridge to [bus 02]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.4: bridge window [mem 0x92f00000-0x92ffffff]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.4: bridge window [mem 0x92b00000-0x92bfffff 64bit pref]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.6: PCI bridge to [bus 16]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1c.7: PCI bridge to [bus 17]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x03af window]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:00: resource 5 [io 0x03e0-0x0cf7 window]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:00: resource 6 [io 0x03b0-0x03bb window]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:00: resource 7 [io 0x03c0-0x03df window]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:00: resource 8 [mem 0x000a0000-0x000bffff window]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:00: resource 9 [io 0x1000-0x9fff window]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:00: resource 10 [mem 0x90000000-0xc7ffbfff window]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:00: resource 11 [mem 0x38000000000-0x39fffffffff window]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:03: resource 0 [io 0x2000-0x2fff]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:03: resource 1 [mem 0x92c00000-0x92dfffff]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:15: resource 4 [io 0x0000-0x03af window]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:15: resource 5 [io 0x03e0-0x0cf7 window]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:15: resource 6 [io 0x03b0-0x03bb window]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:15: resource 7 [io 0x03c0-0x03df window]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:15: resource 8 [mem 0x000a0000-0x000bffff window]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:15: resource 9 [io 0x1000-0x9fff window]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:15: resource 10 [mem 0x90000000-0xc7ffbfff window]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:15: resource 11 [mem 0x38000000000-0x39fffffffff window]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:01: resource 1 [mem 0x90000000-0x92afffff]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:02: resource 1 [mem 0x92f00000-0x92ffffff]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:02: resource 2 [mem 0x92b00000-0x92bfffff 64bit pref]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:00.0: PCI bridge to [bus 81]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:01.0: PCI bridge to [bus 8d]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:01.1: PCI bridge to [bus 87]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:02.0: PCI bridge to [bus 84]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:02.1: PCI bridge to [bus 90]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:02.2: PCI bridge to [bus 91]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:02.3: PCI bridge to [bus 92]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:03.0: PCI bridge to [bus 8a]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:03.1: PCI bridge to [bus 93]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:03.2: PCI bridge to [bus 94]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:03.3: PCI bridge to [bus 95]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:80: resource 4 [io 0xa000-0xffff window]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:80: resource 5 [mem 0xc8000000-0xfbffbfff window]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci_bus 0000:80: resource 6 [mem 0x3a000000000-0x3bfffffffff window]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:05.0: disabled boot interrupts on device [8086:2f28]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:14.0: quirk_usb_early_handoff+0x0/0x140 took 26917 usecs
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1a.0: quirk_usb_early_handoff+0x0/0x140 took 132944 usecs
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:00:1d.0: quirk_usb_early_handoff+0x0/0x140 took 135864 usecs
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:01:00.4: quirk_usb_early_handoff+0x0/0x140 took 21747 usecs
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pci 0000:80:05.0: disabled boot interrupts on device [8086:2f28]
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PCI: CLS 64 bytes, default 64
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Trying to unpack rootfs image as initramfs...
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: software IO TLB: mapped [mem 0x0000000065000000-0x0000000069000000] (64MB)
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: bus type thunderbolt registered
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Initialise system trusted keyrings
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Key type blacklist registered
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: workingset: timestamp_bits=36 max_order=24 bucket_order=0
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: zbud: loaded
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: integrity: Platform Keyring initialized
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: NET: Registered PF_ALG protocol family
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: xor: automatically using best checksumming function avx
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Key type asymmetric registered
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Asymmetric key parser 'x509' registered
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Running certificate verification selftests
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: io scheduler mq-deadline registered
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: io scheduler kyber registered
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: io scheduler bfq registered
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 25
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 26
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 28
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 29
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 30
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 31
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 33
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 34
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 35
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 36
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:00:1c.0: PME: Signaling with IRQ 37
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:00:1c.2: PME: Signaling with IRQ 38
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:00:1c.4: PME: Signaling with IRQ 39
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:00:1c.6: PME: Signaling with IRQ 40
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:00:1c.7: PME: Signaling with IRQ 41
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:80:00.0: PME: Signaling with IRQ 42
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:80:01.0: PME: Signaling with IRQ 44
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:80:01.1: PME: Signaling with IRQ 45
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:80:02.0: PME: Signaling with IRQ 47
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:80:02.1: PME: Signaling with IRQ 48
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:80:02.2: PME: Signaling with IRQ 49
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:80:02.3: PME: Signaling with IRQ 50
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:80:03.0: PME: Signaling with IRQ 52
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tsc: Refined TSC clocksource calibration: 2297.340 MHz
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x211d64549a7, max_idle_ns: 440795285220 ns
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: clocksource: Switched to clocksource tsc
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:80:03.1: PME: Signaling with IRQ 53
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:80:03.2: PME: Signaling with IRQ 54
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pcieport 0000:80:03.3: PME: Signaling with IRQ 55
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C000: Found 2 idle states
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C002: Found 2 idle states
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C004: Found 2 idle states
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C006: Found 2 idle states
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C008: Found 2 idle states
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C00A: Found 2 idle states
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C00C: Found 2 idle states
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C00E: Found 2 idle states
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C010: Found 2 idle states
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C012: Found 2 idle states
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C014: Found 2 idle states
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C016: Found 2 idle states
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C018: Found 2 idle states
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C01A: Found 2 idle states
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C01C: Found 2 idle states
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C01E: Found 2 idle states
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C020: Found 2 idle states
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C022: Found 2 idle states
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C024: Found 2 idle states
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C026: Found 2 idle states
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C028: Found 2 idle states
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C02A: Found 2 idle states
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C02C: Found 2 idle states
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C02E: Found 2 idle states
Fri 2023-01-27 11:11:43 EST
hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C030: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C032: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C034: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C036: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C038: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C03A: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C03C: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C03E: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C040: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C042: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C044: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C046: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C001: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C003: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C005: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C007: Found 2 idle states Fri 2023-01-27 11:11:43 EST 
hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C009: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C00B: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C00D: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C00F: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C011: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C013: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C015: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C017: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C019: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C01B: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C01D: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C01F: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C021: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK0.C023: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C025: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C027: Found 2 idle states Fri 2023-01-27 11:11:43 EST 
hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C029: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C02B: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C02D: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C02F: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C031: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C033: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C035: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C037: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C039: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C03B: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C03D: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C03F: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C041: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C043: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C045: Found 2 idle states Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: \_SB_.SCK1.C047: Found 2 idle states Fri 2023-01-27 11:11:43 EST 
hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Freeing initrd memory: 33552K Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: button: Power Button [PWRF] Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ERST: Error Record Serialization Table (ERST) support is initialized. Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pstore: Registered erst as persistent store backend Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: GHES: APEI firmware first mode is enabled by APEI bit and WHEA _OSC. Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: 00:02: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: 00:03: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Non-volatile memory driver v1.3 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: rdac: device handler registered Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hp_sw: device handler registered Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: emc: device handler registered Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: alua: device handler registered Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: libphy: Fixed MDIO Bus: probed Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ehci_hcd: USB 2.0 
'Enhanced' Host Controller (EHCI) Driver Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ehci-pci: EHCI PCI platform driver Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ehci-pci 0000:00:1a.0: EHCI Host Controller Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ehci-pci 0000:00:1a.0: new USB bus registered, assigned bus number 1 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ehci-pci 0000:00:1a.0: debug port 2 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ehci-pci 0000:00:1a.0: irq 18, io mem 0x92e01000 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ehci-pci 0000:00:1a.0: USB 2.0 started, EHCI 1.00 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0002, bcdDevice= 5.14 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb1: Product: EHCI Host Controller Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb1: Manufacturer: Linux 5.14.0-246.rt14.245.1955_759844798.el9.x86_64+debug ehci_hcd Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb1: SerialNumber: 0000:00:1a.0 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hub 1-0:1.0: USB hub found Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hub 1-0:1.0: 2 ports detected Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ehci-pci 0000:00:1d.0: EHCI Host Controller Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ehci-pci 0000:00:1d.0: new USB bus 
registered, assigned bus number 2 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ehci-pci 0000:00:1d.0: debug port 2 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ehci-pci 0000:00:1d.0: irq 18, io mem 0x92e00000 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ehci-pci 0000:00:1d.0: USB 2.0 started, EHCI 1.00 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb2: New USB device found, idVendor=1d6b, idProduct=0002, bcdDevice= 5.14 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb2: New USB device strings: Mfr=3, Product=2, SerialNumber=1 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb2: Product: EHCI Host Controller Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb2: Manufacturer: Linux 5.14.0-246.rt14.245.1955_759844798.el9.x86_64+debug ehci_hcd Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb2: SerialNumber: 0000:00:1d.0 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hub 2-0:1.0: USB hub found Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hub 2-0:1.0: 2 ports detected Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ohci-pci: OHCI PCI platform driver Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: uhci_hcd: USB Universal Host Controller Interface driver Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: uhci_hcd 0000:01:00.4: UHCI Host Controller Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: uhci_hcd 0000:01:00.4: new USB bus registered, 
assigned bus number 3 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: uhci_hcd 0000:01:00.4: detected 8 ports Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: uhci_hcd 0000:01:00.4: port count misdetected? forcing to 2 ports Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: uhci_hcd 0000:01:00.4: irq 56, io port 0x00001300 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb3: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb3: New USB device strings: Mfr=3, Product=2, SerialNumber=1 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb3: Product: UHCI Host Controller Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb3: Manufacturer: Linux 5.14.0-246.rt14.245.1955_759844798.el9.x86_64+debug uhci_hcd Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb3: SerialNumber: 0000:01:00.4 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hub 3-0:1.0: USB hub found Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hub 3-0:1.0: 2 ports detected Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 4 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: xhci_hcd 0000:00:14.0: hcc params 0x200077c1 hci version 0x100 quirks 0x0000000000009810 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Fri 2023-01-27 11:11:43 EST 
hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 5 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: xhci_hcd 0000:00:14.0: Host supports USB 3.0 SuperSpeed Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb4: New USB device found, idVendor=1d6b, idProduct=0002, bcdDevice= 5.14 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb4: New USB device strings: Mfr=3, Product=2, SerialNumber=1 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb4: Product: xHCI Host Controller Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb4: Manufacturer: Linux 5.14.0-246.rt14.245.1955_759844798.el9.x86_64+debug xhci-hcd Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb4: SerialNumber: 0000:00:14.0 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hub 4-0:1.0: USB hub found Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hub 4-0:1.0: 15 ports detected Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb5: New USB device found, idVendor=1d6b, idProduct=0003, bcdDevice= 5.14 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb5: New USB device strings: Mfr=3, Product=2, SerialNumber=1 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb5: Product: xHCI Host Controller Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb5: Manufacturer: Linux 5.14.0-246.rt14.245.1955_759844798.el9.x86_64+debug xhci-hcd Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb usb5: SerialNumber: 0000:00:14.0 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: 
hub 5-0:1.0: USB hub found Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hub 5-0:1.0: 6 ports detected Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usbcore: registered new interface driver usbserial_generic Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usbserial: USB Serial support registered for generic Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: i8042: PNP: No PS/2 controller found. Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: i8042: Probing ports directly. Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: mousedev: PS/2 mouse device common for all mice Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: rtc_cmos 00:00: RTC can wake from S4 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: rtc_cmos 00:00: registered as rtc0 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: rtc_cmos 00:00: setting system clock to 2023-01-27T16:11:27 UTC (1674835887) Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: rtc_cmos 00:00: alarms up to one month, y3k, 114 bytes nvram, hpet irqs Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb 1-1: new high-speed USB device number 2 using ehci-pci Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb 2-1: new high-speed USB device number 2 using ehci-pci Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: intel_pstate: Intel P-state driver initializing Fri 2023-01-27 11:11:43 EST 
hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb 4-3: new high-speed USB device number 2 using xhci_hcd Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb 1-1: New USB device found, idVendor=8087, idProduct=800a, bcdDevice= 0.05 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb 1-1: New USB device strings: Mfr=0, Product=0, SerialNumber=0 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hub 1-1:1.0: USB hub found Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hub 1-1:1.0: 6 ports detected Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb 2-1: New USB device found, idVendor=8087, idProduct=8002, bcdDevice= 0.05 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb 2-1: New USB device strings: Mfr=0, Product=0, SerialNumber=0 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hub 2-1:1.0: USB hub found Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hub 2-1:1.0: 8 ports detected Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hid: raw HID events driver (C) Jiri Kosina Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usbcore: registered new interface driver usbhid Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usbhid: USB HID core driver Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: drop_monitor: Initializing network drop monitor service Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Initializing XFRM netlink socket Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: NET: Registered PF_INET6 protocol family Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Segment Routing with 
IPv6 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: NET: Registered PF_PACKET protocol family Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: mpls_gso: MPLS GSO support Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb 4-3: New USB device found, idVendor=0424, idProduct=2660, bcdDevice= 8.01 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: usb 4-3: New USB device strings: Mfr=0, Product=0, SerialNumber=0 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hub 4-3:1.0: USB hub found Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hub 4-3:1.0: 2 ports detected Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: microcode: sig=0x306f2, pf=0x1, revision=0x49 Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: microcode: Microcode Update Driver: v2.2. Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: resctrl: L3 monitoring detected Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: IPI shorthand broadcast: enabled Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: AVX2 version of gcm_enc/dec engaged. 
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: AES CTR mode by8 optimization enabled
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: sched_clock: Marking stable (9500377347, -7916626)->(10443367015, -950906294)
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: registered taskstats version 1
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Loading compiled-in X.509 certificates
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 79bcee618ae32dcaaa7df5edf9ea9a9b80b47e61'
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: zswap: loaded using pool lzo/zbud
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: debug_vm_pgtable: [debug_vm_pgtable ]: Validating architecture page table helpers
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: page_owner is disabled
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: pstore: Using crash dump compression: deflate
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Key type big_key registered
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Key type encrypted registered
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ima: No TPM chip found, activating TPM-bypass!
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Loading compiled-in module X.509 certificates
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 79bcee618ae32dcaaa7df5edf9ea9a9b80b47e61'
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ima: Allocated hash algorithm: sha256
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ima: No architecture policies found
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: evm: Initialising EVM extended attributes:
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: evm: security.selinux
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: evm: security.SMACK64 (disabled)
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: evm: security.SMACK64EXEC (disabled)
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: evm: security.SMACK64TRANSMUTE (disabled)
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: evm: security.SMACK64MMAP (disabled)
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: evm: security.apparmor (disabled)
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: evm: security.ima
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: evm: security.capability
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: evm: HMAC attrs: 0x1
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: cryptomgr_test (901) used greatest stack depth: 13232 bytes left
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: modprobe (905) used greatest stack depth: 12976 bytes left
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: cryptomgr_test (956) used greatest stack depth: 12816 bytes left
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PM: Magic number: 7:312:187
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: acpi device:46: hash matches
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Freeing unused decrypted memory: 2036K
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Freeing unused kernel image (initmem) memory: 4644K
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Write protecting the kernel read-only data: 30720k
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Freeing unused kernel image (rodata/data gap) memory: 1760K
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: x86/mm: Checking user space page tables
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Run /init as init process
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: with arguments:
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: /init
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: with environment:
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: HOME=/
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: TERM=linux
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: BOOT_IMAGE=(hd0,msdos1)/vmlinuz-5.14.0-246.rt14.245.1955_759844798.el9.x86_64+debug
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd 252-3.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Detected architecture x86-64.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Running in initrd.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Hostname set to .
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: dracut-rootfs-g (969) used greatest stack depth: 12736 bytes left
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Queued start job for default target Initrd Default Target.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Created slice Slice /system/systemd-hibernate-resume.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Initrd /usr File System.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Path Units.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Slice Units.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Swaps.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Timer Units.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Listening on D-Bus System Message Bus Socket.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Listening on Journal Socket (/dev/log).
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Listening on Journal Socket.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Listening on udev Control Socket.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Listening on udev Kernel Socket.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Socket Units.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Create List of Static Device Nodes...
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Journal Service...
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Apply Kernel Variables...
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Create System Users...
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Setup Virtual Console...
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Create List of Static Device Nodes.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Apply Kernel Variables.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Create System Users.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Create Static Device Nodes in /dev...
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Create Static Device Nodes in /dev.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Setup Virtual Console.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting dracut cmdline hook...
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-journald[1006]: Journal started
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-journald[1006]: Runtime Journal (/run/log/journal/662a22492a89415692f80bf0ab673b7b) is 8.0M, max 913.0M, 905.0M free.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-sysusers[1010]: Creating group 'nobody' with GID 65534.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-sysusers[1010]: Creating group 'users' with GID 100.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started Journal Service.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-sysusers[1010]: Creating group 'dbus' with GID 81.
Fri 2023-01-27 11:11:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-sysusers[1010]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com dracut-cmdline[1025]: dracut-9 dracut-057-20.git20221213.el9
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com dracut-cmdline[1025]: Using kernel command line parameters: BOOT_IMAGE=(hd0,msdos1)/vmlinuz-5.14.0-246.rt14.245.1955_759844798.el9.x86_64+debug root=/dev/mapper/cs_hpe--ml350gen9--01-root ro efi=runtime resume=/dev/mapper/cs_hpe--ml350gen9--01-swap rd.lvm.lv=cs_hpe-ml350gen
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com dracut-cmdline[1025]: 9-01/root rd.lvm.lv=cs_hpe-ml350gen9-01/swap console=ttyS1,115200n81 crashkernel=1G-2G:384M,2G-3G:512M,3G-4G:768M,4G-16G:1G,16G-64G:2G,64G-128G:2G,128G-:4G
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Create Volatile Files and Directories...
Fri 2023-01-27 11:11:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Create Volatile Files and Directories.
Fri 2023-01-27 11:11:44 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished dracut cmdline hook.
Fri 2023-01-27 11:11:44 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting dracut pre-udev hook...
Fri 2023-01-27 11:11:44 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Fri 2023-01-27 11:11:44 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: device-mapper: uevent: version 1.0.3
Fri 2023-01-27 11:11:44 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: device-mapper: ioctl: 4.47.0-ioctl (2022-07-28) initialised: dm-devel@redhat.com
Fri 2023-01-27 11:11:44 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished dracut pre-udev hook.
Fri 2023-01-27 11:11:44 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Rule-based Manager for Device Events and Files...
Fri 2023-01-27 11:11:45 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-udevd[1146]: Using default interface naming scheme 'rhel-9.0'.
Fri 2023-01-27 11:11:45 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started Rule-based Manager for Device Events and Files.
Fri 2023-01-27 11:11:45 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: dracut pre-trigger hook was skipped because no trigger condition checks were met.
Fri 2023-01-27 11:11:45 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Coldplug All udev Devices...
Fri 2023-01-27 11:11:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Coldplug All udev Devices.
Fri 2023-01-27 11:11:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Fri 2023-01-27 11:11:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Network.
Fri 2023-01-27 11:11:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Fri 2023-01-27 11:11:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting dracut initqueue hook...
Fri 2023-01-27 11:11:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hpwdt 0000:01:00.0: HPE Watchdog Timer Driver: NMI decoding initialized
Fri 2023-01-27 11:11:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hpwdt 0000:01:00.0: HPE Watchdog Timer Driver: Version: 2.0.4
Fri 2023-01-27 11:11:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hpwdt 0000:01:00.0: timeout: 30 seconds (nowayout=0)
Fri 2023-01-27 11:11:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hpwdt 0000:01:00.0: pretimeout: on.
Fri 2023-01-27 11:11:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hpwdt 0000:01:00.0: kdumptimeout: -1.
Fri 2023-01-27 11:11:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Warning: Unmaintained hardware is detected: hpsa:3239:103C @ 0000:03:00.0
Fri 2023-01-27 11:11:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: HP HPSA Driver (v 3.4.20-200)
Fri 2023-01-27 11:11:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hpsa 0000:03:00.0: can't disable ASPM; OS doesn't have ASPM control
Fri 2023-01-27 11:11:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tg3 0000:02:00.0 eth0: Tigon3 [partno(N/A) rev 5719001] (PCI Express) MAC address 9c:b6:54:af:de:74
Fri 2023-01-27 11:11:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tg3 0000:02:00.0 eth0: attached PHY is 5719C (10/100/1000Base-T Ethernet) (WireSpeed[1], EEE[1])
Fri 2023-01-27 11:11:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tg3 0000:02:00.0 eth0: RXcsums[1] LinkChgREG[0] MIirq[0] ASF[1] TSOcap[1]
Fri 2023-01-27 11:11:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tg3 0000:02:00.0 eth0: dma_rwctrl[00000001] dma_mask[64-bit]
Fri 2023-01-27 11:11:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hpsa 0000:03:00.0: Logical aborts not supported
Fri 2023-01-27 11:11:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tg3 0000:02:00.1 eth1: Tigon3 [partno(N/A) rev 5719001] (PCI Express) MAC address 9c:b6:54:af:de:75
Fri 2023-01-27 11:11:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tg3 0000:02:00.1 eth1: attached PHY is 5719C (10/100/1000Base-T Ethernet) (WireSpeed[1], EEE[1])
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tg3 0000:02:00.1 eth1: RXcsums[1] LinkChgREG[0] MIirq[0] ASF[1] TSOcap[1]
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tg3 0000:02:00.1 eth1: dma_rwctrl[00000001] dma_mask[64-bit]
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tg3 0000:02:00.2 eth2: Tigon3 [partno(N/A) rev 5719001] (PCI Express) MAC address 9c:b6:54:af:de:76
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tg3 0000:02:00.2 eth2: attached PHY is 5719C (10/100/1000Base-T Ethernet) (WireSpeed[1], EEE[1])
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tg3 0000:02:00.2 eth2: RXcsums[1] LinkChgREG[0] MIirq[0] ASF[1] TSOcap[1]
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tg3 0000:02:00.2 eth2: dma_rwctrl[00000001] dma_mask[64-bit]
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: scsi host0: hpsa
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tg3 0000:02:00.3 eth3: Tigon3 [partno(N/A) rev 5719001] (PCI Express) MAC address 9c:b6:54:af:de:77
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tg3 0000:02:00.3 eth3: attached PHY is 5719C (10/100/1000Base-T Ethernet) (WireSpeed[1], EEE[1])
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tg3 0000:02:00.3 eth3: RXcsums[1] LinkChgREG[0] MIirq[0] ASF[1] TSOcap[1]
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tg3 0000:02:00.3 eth3: dma_rwctrl[00000001] dma_mask[64-bit]
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hpsa can't handle SMP requests
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hpsa 0000:03:00.0: scsi 0:0:0:0: added RAID HP P440ar controller SSDSmartPathCap- En- Exp=1
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hpsa 0000:03:00.0: scsi 0:0:1:0: masked Direct-Access HP EG0300FCSPH PHYS DRV SSDSmartPathCap- En- Exp=0
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hpsa 0000:03:00.0: scsi 0:0:2:0: masked Direct-Access HP EG0300FCSPH PHYS DRV SSDSmartPathCap- En- Exp=0
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hpsa 0000:03:00.0: scsi 0:1:0:0: added Direct-Access HP LOGICAL VOLUME RAID-0 SSDSmartPathCap- En- Exp=1
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: hpsa can't handle SMP requests
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: scsi 0:0:0:0: RAID HP P440ar 7.00 PQ: 0 ANSI: 5
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: scsi 0:1:0:0: Direct-Access HP LOGICAL VOLUME 7.00 PQ: 0 ANSI: 5
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tg3 0000:02:00.0 eno1: renamed from eth0
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tg3 0000:02:00.2 eno3: renamed from eth2
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 12
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: scsi 0:1:0:0: Attached scsi generic sg1 type 0
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tg3 0000:02:00.1 eno2: renamed from eth1
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tg3 0000:02:00.3 eno4: renamed from eth3
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: sd 0:1:0:0: [sda] 1171743324 512-byte logical blocks: (600 GB/559 GiB)
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: sd 0:1:0:0: [sda] Write Protect is off
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: sd 0:1:0:0: [sda] Mode Sense: 73 00 00 08
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: sd 0:1:0:0: [sda] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: sd 0:1:0:0: [sda] Preferred minimum I/O size 262144 bytes
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: sd 0:1:0:0: [sda] Optimal transfer size 524288 bytes
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: sda: sda1 sda2
Fri 2023-01-27 11:11:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: sd 0:1:0:0: [sda] Attached SCSI disk
Fri 2023-01-27 11:11:51 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com dracut-initqueue[1460]: Scanning devices sda2 for LVM logical volumes cs_hpe-ml350gen9-01/root
Fri 2023-01-27 11:11:51 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com dracut-initqueue[1460]: cs_hpe-ml350gen9-01/swap
Fri 2023-01-27 11:11:51 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com dracut-initqueue[1460]: cs_hpe-ml350gen9-01/root linear
Fri 2023-01-27 11:11:51 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com dracut-initqueue[1460]: cs_hpe-ml350gen9-01/swap linear
Fri 2023-01-27 11:11:51 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Found device /dev/mapper/cs_hpe--ml350gen9--01-root.
Fri 2023-01-27 11:11:51 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Initrd Root Device.
Fri 2023-01-27 11:11:52 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Found device /dev/mapper/cs_hpe--ml350gen9--01-swap.
Fri 2023-01-27 11:11:52 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Resume from hibernation using device /dev/mapper/cs_hpe--ml350gen9--01-swap...
Fri 2023-01-27 11:11:52 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-hibernate-resume[1501]: Could not resume from '/dev/mapper/cs_hpe--ml350gen9--01-swap' (253:1).
Fri 2023-01-27 11:11:52 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PM: Image not found (code -22)
Fri 2023-01-27 11:11:52 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-hibernate-resume@dev-mapper-cs_hpe\x2d\x2dml350gen9\x2d\x2d01\x2dswap.service: Deactivated successfully.
Fri 2023-01-27 11:11:52 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Resume from hibernation using device /dev/mapper/cs_hpe--ml350gen9--01-swap.
Fri 2023-01-27 11:11:52 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Preparation for Local File Systems.
Fri 2023-01-27 11:11:52 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Local File Systems.
Fri 2023-01-27 11:11:52 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target System Initialization.
Fri 2023-01-27 11:11:52 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Basic System.
Fri 2023-01-27 11:11:52 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished dracut initqueue hook.
Fri 2023-01-27 11:11:52 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Preparation for Remote File Systems.
Fri 2023-01-27 11:11:52 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Remote File Systems.
Fri 2023-01-27 11:11:52 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: dracut pre-mount hook was skipped because no trigger condition checks were met.
Fri 2023-01-27 11:11:52 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting File System Check on /dev/mapper/cs_hpe--ml350gen9--01-root...
Fri 2023-01-27 11:11:52 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-fsck[1510]: /usr/sbin/fsck.xfs: XFS file system.
Fri 2023-01-27 11:11:52 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: fsck (1509) used greatest stack depth: 11984 bytes left
Fri 2023-01-27 11:11:52 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished File System Check on /dev/mapper/cs_hpe--ml350gen9--01-root.
Fri 2023-01-27 11:11:52 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Mounting /sysroot...
Fri 2023-01-27 11:11:53 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SGI XFS with ACLs, security attributes, scrub, verbose warnings, quota, no debug enabled
Fri 2023-01-27 11:11:53 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: XFS (dm-0): Mounting V5 Filesystem
Fri 2023-01-27 11:11:53 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: XFS (dm-0): Ending clean mount
Fri 2023-01-27 11:11:54 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Mounted /sysroot.
Fri 2023-01-27 11:11:54 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Initrd Root File System.
Fri 2023-01-27 11:11:54 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Mountpoints Configured in the Real Root...
Fri 2023-01-27 11:11:54 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: systemd-fstab-g (1524) used greatest stack depth: 11872 bytes left
Fri 2023-01-27 11:11:54 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Fri 2023-01-27 11:11:54 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Mountpoints Configured in the Real Root.
Fri 2023-01-27 11:11:54 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Initrd File Systems.
Fri 2023-01-27 11:11:54 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Initrd Default Target.
Fri 2023-01-27 11:11:54 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: dracut mount hook was skipped because no trigger condition checks were met.
Fri 2023-01-27 11:11:54 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting dracut pre-pivot and cleanup hook...
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished dracut pre-pivot and cleanup hook.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Network.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Timer Units.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: dbus.socket: Deactivated successfully.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Closed D-Bus System Message Bus Socket.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Initrd Default Target.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Basic System.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Initrd Root Device.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Initrd /usr File System.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Path Units.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Remote File Systems.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Preparation for Remote File Systems.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Slice Units.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Socket Units.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target System Initialization.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Local File Systems.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Preparation for Local File Systems.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Swaps.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: dracut-initqueue.service: Deactivated successfully.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped dracut initqueue hook.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: dracut-initqueue.service: Consumed 1.826s CPU time.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-sysctl.service: Deactivated successfully.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Apply Kernel Variables.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Create Volatile Files and Directories.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Coldplug All udev Devices.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-udev-trigger.service: Consumed 2.814s CPU time.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Setup Virtual Console.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: initrd-cleanup.service: Deactivated successfully.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-udevd.service: Deactivated successfully.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-udevd.service: Consumed 25.332s CPU time.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Closed udev Control Socket.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Closed udev Kernel Socket.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped dracut pre-udev hook.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: dracut-cmdline.service: Deactivated successfully.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped dracut cmdline hook.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: dracut-cmdline.service: Consumed 1.464s CPU time.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Cleanup udev Database...
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Create Static Device Nodes in /dev.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Create List of Static Device Nodes.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-sysusers.service: Deactivated successfully.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Create System Users.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Cleanup udev Database.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Switch Root.
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Switch Root...
Fri 2023-01-27 11:11:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Switching root.
Fri 2023-01-27 11:11:56 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-journald[1006]: Journal stopped
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-journald[1006]: Received SIGTERM from PID 1 (systemd).
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SELinux: policy capability network_peer_controls=1
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SELinux: policy capability open_perms=1
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SELinux: policy capability extended_socket_class=1
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SELinux: policy capability always_check_network=0
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SELinux: policy capability cgroup_seclabel=1
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SELinux: policy capability nnp_nosuid_transition=1
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: audit: type=1403 audit(1674835918.354:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Successfully loaded SELinux policy in 1.417046s.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: RTC configured in localtime, applying delta of -300 minutes to system time.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 315.787ms.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd 252-3.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Detected architecture x86-64.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-rc-local-generator[1576]: /etc/rc.d/rc.local is not marked executable, skipping.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: systemd-rc-loca (1576) used greatest stack depth: 11792 bytes left
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: grep (1584) used greatest stack depth: 11216 bytes left
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: kdump-dep-gener (1556) used greatest stack depth: 11104 bytes left
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: /usr/lib/systemd/system/restraintd.service:8: Standard output type syslog+console is obsolete, automatically updating to journal+console. Please update your unit file, and consider removing the setting altogether.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: initrd-switch-root.service: Deactivated successfully.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Switch Root.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Created slice Slice /system/getty.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Created slice Slice /system/modprobe.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Created slice Slice /system/serial-getty.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Created slice Slice /system/sshd-keygen.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Created slice User and Session Slice.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Local Encrypted Volumes.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Switch Root.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Initrd File Systems.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped target Initrd Root File System.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Local Integrity Protected Volumes.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Slice Units.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target System Time Set.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Local Verity Protected Volumes.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Listening on Device-mapper event daemon FIFOs.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Listening on LVM2 poll daemon socket.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Listening on RPCbind Server Activation Socket.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target RPC Port Mapper.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Listening on Process Core Dump Socket.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Listening on initctl Compatibility Named Pipe.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Listening on udev Control Socket.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Listening on udev Kernel Socket.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Activating swap /dev/mapper/cs_hpe--ml350gen9--01-swap...
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Mounting Huge Pages File System...
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Mounting POSIX Message Queue File System...
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Adding 24711164k swap on /dev/mapper/cs_hpe--ml350gen9--01-swap. Priority:-2 extents:1 across:24711164k FS
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Mounting Kernel Debug File System...
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Mounting Kernel Trace File System...
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Create List of Static Device Nodes...
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Load Kernel Module configfs...
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Load Kernel Module drm...
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Load Kernel Module fuse...
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped File System Check on Root Device.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Stopped Journal Service.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-journald.service: Consumed 1.539s CPU time.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: fuse: init (API version 7.36)
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Journal Service...
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Generate network units from Kernel command line...
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ACPI: bus type drm_connector registered
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Remount Root and Kernel File Systems...
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Apply Kernel Variables...
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Coldplug All udev Devices...
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-journald[1605]: Journal started
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-journald[1605]: Runtime Journal (/run/log/journal/662a22492a89415692f80bf0ab673b7b) is 8.0M, max 913.0M, 905.0M free.
Fri 2023-01-27 16:12:00 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Queued start job for default target Multi-User System.
Fri 2023-01-27 16:12:00 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-journald.service: Deactivated successfully.
Fri 2023-01-27 16:12:00 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-journald.service: Consumed 1.539s CPU time.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com lvm[1600]: 2 logical volume(s) in volume group "cs_hpe-ml350gen9-01" monitored
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Activated swap /dev/mapper/cs_hpe--ml350gen9--01-swap.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started Journal Service.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Mounted Huge Pages File System.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Mounted POSIX Message Queue File System.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Mounted Kernel Debug File System.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Mounted Kernel Trace File System.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Create List of Static Device Nodes.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: modprobe@configfs.service: Deactivated successfully.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Load Kernel Module configfs.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: modprobe@drm.service: Deactivated successfully.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Load Kernel Module drm.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: modprobe@fuse.service: Deactivated successfully.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Load Kernel Module fuse.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Generate network units from Kernel command line.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Remount Root and Kernel File Systems.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Apply Kernel Variables.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Preparation for Network.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Swaps.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Mounting FUSE Control File System...
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Mounting Kernel Configuration File System...
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Special handling of early boot iSCSI sessions was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/iscsi_session).
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Rebuild Hardware Database was skipped because of an unmet condition check (ConditionNeedsUpdate=/etc).
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Flush Journal to Persistent Storage...
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Load/Save Random Seed...
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Create System Users was skipped because no trigger condition checks were met.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Create Static Device Nodes in /dev...
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-journald[1605]: Time spent on flushing to /var/log/journal/662a22492a89415692f80bf0ab673b7b is 448.645ms for 1795 entries.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-journald[1605]: System Journal (/var/log/journal/662a22492a89415692f80bf0ab673b7b) is 24.0M, max 4.0G, 3.9G free.
Fri 2023-01-27 16:12:02 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-journald[1605]: Received client request to flush runtime journal.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Mounted FUSE Control File System.
Fri 2023-01-27 16:12:01 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Mounted Kernel Configuration File System.
Fri 2023-01-27 16:12:02 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Load/Save Random Seed.
Fri 2023-01-27 16:12:02 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Fri 2023-01-27 16:12:02 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Create Static Device Nodes in /dev.
Fri 2023-01-27 16:12:02 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Preparation for Local File Systems.
Fri 2023-01-27 16:12:02 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Rule-based Manager for Device Events and Files...
Fri 2023-01-27 16:12:02 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-udevd[1625]: Using default interface naming scheme 'rhel-9.0'.
Fri 2023-01-27 16:12:02 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Flush Journal to Persistent Storage.
Fri 2023-01-27 16:12:02 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started Rule-based Manager for Device Events and Files.
Fri 2023-01-27 16:12:03 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Load Kernel Module configfs...
Fri 2023-01-27 16:12:04 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: modprobe@configfs.service: Deactivated successfully.
Fri 2023-01-27 16:12:04 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Load Kernel Module configfs.
Fri 2023-01-27 16:12:04 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Condition check resulted in /dev/ttyS1 being skipped.
Fri 2023-01-27 16:12:04 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Coldplug All udev Devices.
Fri 2023-01-27 16:12:04 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: power_meter ACPI000D:00: Found ACPI power meter.
Fri 2023-01-27 16:12:04 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: acpi-tad ACPI000E:00: Unsupported capabilities
Fri 2023-01-27 16:12:04 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: power_meter ACPI000D:00: Ignoring unsafe software power cap!
Fri 2023-01-27 16:12:04 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: power_meter ACPI000D:00: hwmon_device_register() is deprecated. Please convert the driver to use hwmon_device_register_with_info().
Fri 2023-01-27 16:12:04 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: IPMI message handler: version 39.2
Fri 2023-01-27 16:12:04 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ipmi device interface
Fri 2023-01-27 16:12:04 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ipmi_si: IPMI System Interface driver
Fri 2023-01-27 16:12:04 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ipmi_si dmi-ipmi-si.0: ipmi_platform: probing via SMBIOS
Fri 2023-01-27 16:12:04 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ipmi_platform: ipmi_si: SMBIOS: io 0xca2 regsize 1 spacing 1 irq 0
Fri 2023-01-27 16:12:04 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ipmi_si: Adding SMBIOS-specified kcs state machine
Fri 2023-01-27 16:12:04 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ipmi_si IPI0001:00: ipmi_platform: probing via ACPI
Fri 2023-01-27 16:12:04 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ipmi_si IPI0001:00: ipmi_platform: [io 0x0ca2-0x0ca3] regsize 1 spacing 1 irq 0
Fri 2023-01-27 16:12:04 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-udevd[1716]: could not read from '/sys/module/acpi_cpufreq/initstate': No such device
Fri 2023-01-27 16:12:04 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ipmi_si dmi-ipmi-si.0: Removing SMBIOS-specified kcs state machine in favor of ACPI
Fri 2023-01-27 16:12:04 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ipmi_si: Adding ACPI-specified kcs state machine
Fri 2023-01-27 16:12:04 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ipmi_si: Trying ACPI-specified kcs state machine at i/o address 0xca2, slave address 0x20, irq 0
Fri 2023-01-27 16:12:04 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ipmi_si IPI0001:00: The BMC does not support clearing the recv irq bit, compensating, but the BMC needs to be fixed.
Fri 2023-01-27 16:12:04 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ipmi_si IPI0001:00: IPMI message handler: Found new BMC (man_id: 0x00000b, prod_id: 0x2000, dev_id: 0x13)
Fri 2023-01-27 16:12:05 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ipmi_si IPI0001:00: IPMI kcs interface initialized
Fri 2023-01-27 16:12:05 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Condition check resulted in LOGICAL_VOLUME 1 being skipped.
Fri 2023-01-27 16:12:05 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com lvm[1785]: PV /dev/sda2 online, VG cs_hpe-ml350gen9-01 is complete.
Fri 2023-01-27 16:12:05 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Mounting /boot...
Fri 2023-01-27 16:12:05 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: XFS (sda1): Mounting V5 Filesystem
Fri 2023-01-27 16:12:05 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: dca service started, version 1.12.1
Fri 2023-01-27 16:12:05 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: input: PC Speaker as /devices/platform/pcspkr/input/input9
Fri 2023-01-27 16:12:05 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event cs_hpe-ml350gen9-01.
Fri 2023-01-27 16:12:05 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ipmi_ssif: IPMI SSIF Interface driver
Fri 2023-01-27 16:12:05 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: XFS (sda1): Ending clean mount
Fri 2023-01-27 16:12:05 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Mounted /boot.
Fri 2023-01-27 16:12:05 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Fri 2023-01-27 16:12:05 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: i2c i2c-0: 6/24 memory slots populated (from DMI)
Fri 2023-01-27 16:12:05 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: i2c i2c-0: Systems with more than 4 memory slots not supported yet, not instantiating SPD
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.0: ioat_enumerate_channels: xfercap = 1048576
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.0: __ioat_start_null_desc: head: 0x0 tail: 0x0 issued: 0x0
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.0: desc[0]: (0x118e00000->0x118e00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.0: __ioat_issue_pending: head: 0x1 tail: 0x0 issued: 0x1 count: 0x1
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.0: ioat_check_space_lock: num_descs: 1 (1:0:1)
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.0: desc[1]: (0x118e00040->0x118e00080) cookie: 0 flags: 0x0 ctl: 0x00000000 (op: 0x0 int_en: 0 compl: 0)
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.0: desc[1]: (0x118e00040->0x118e00080) cookie: 0 flags: 0x1 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.0: ioat_tx_submit_unlock: cookie: 2
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.0: __ioat_issue_pending: head: 0x2 tail: 0x0 issued: 0x2 count: 0x2
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.0: ioat_get_current_completion: phys_complete: 0x118e00040
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.0: __cleanup: head: 0x2 tail: 0x0 issued: 0x2
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.0: desc[0]: (0x118e00000->0x118e00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.0: desc[1]: (0x118e00040->0x118e00080) cookie: 2 flags: 0x3 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.0: __cleanup: cancel completion timeout
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.0: ioat_get_current_completion: phys_complete: 0x118e00040
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.0: ioat_get_current_completion: phys_complete: 0x118e00040
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.0: freeing 65536 idle descriptors
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.0: ioat_xor_val_self_test
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.1: ioat_enumerate_channels: xfercap = 1048576
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.1: __ioat_start_null_desc: head: 0x0 tail: 0x0 issued: 0x0
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.1: desc[0]: (0x123f00000->0x123f00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.1: __ioat_issue_pending: head: 0x1 tail: 0x0 issued: 0x1 count: 0x1
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.1: ioat_check_space_lock: num_descs: 1 (1:0:1)
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.1: desc[1]: (0x123f00040->0x123f00080) cookie: 0 flags: 0x0 ctl: 0x00000000 (op: 0x0 int_en: 0 compl: 0)
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.1: desc[1]: (0x123f00040->0x123f00080) cookie: 0 flags: 0x1 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.1: ioat_tx_submit_unlock: cookie: 2
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.1: __ioat_issue_pending: head: 0x2 tail: 0x0 issued: 0x2 count: 0x2
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.1: ioat_get_current_completion: phys_complete: 0x123f00040
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.1: __cleanup: head: 0x2 tail: 0x0 issued: 0x2
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.1: desc[0]: (0x123f00000->0x123f00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.1: desc[1]: (0x123f00040->0x123f00080) cookie: 2 flags: 0x3 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.1: __cleanup: cancel completion timeout
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.1: ioat_get_current_completion: phys_complete: 0x123f00040
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.1: freeing 65536 idle descriptors
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.1: ioat_xor_val_self_test
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.2: ioat_enumerate_channels: xfercap = 1048576
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.2: __ioat_start_null_desc: head: 0x0 tail: 0x0 issued: 0x0
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.2: desc[0]: (0x10b880000->0x10b880040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.2: __ioat_issue_pending: head: 0x1 tail: 0x0 issued: 0x1 count: 0x1
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.2: ioat_get_current_completion: phys_complete: 0x10b880000
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.2: __cleanup: head: 0x1 tail: 0x0 issued: 0x1
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.2: ioat_check_space_lock: num_descs: 1 (1:0:1)
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.2: desc[1]: (0x10b880040->0x10b880080) cookie: 0 flags: 0x0 ctl: 0x00000000 (op: 0x0 int_en: 0 compl: 0)
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.2: desc[0]: (0x10b880000->0x10b880040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.2: __cleanup: cancel completion timeout
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.2: desc[1]: (0x10b880040->0x10b880080) cookie: 0 flags: 0x1 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.2: ioat_tx_submit_unlock: cookie: 2
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.2: __ioat_issue_pending: head: 0x2 tail: 0x1 issued: 0x2 count: 0x2
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.2: ioat_get_current_completion: phys_complete: 0x10b880040
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.2: __cleanup: head: 0x2 tail: 0x1 issued: 0x2
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.2: desc[1]: (0x10b880040->0x10b880080) cookie: 2 flags: 0x3 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.2: __cleanup: cancel completion timeout
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.2: ioat_get_current_completion: phys_complete: 0x10b880040
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.2: freeing 65536 idle descriptors
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.2: ioat_xor_val_self_test
Fri 2023-01-27 16:12:06 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.3: ioat_enumerate_channels: xfercap = 1048576
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.3: __ioat_start_null_desc: head: 0x0 tail: 0x0 issued: 0x0
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.3: desc[0]: (0x123f00000->0x123f00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.3: __ioat_issue_pending: head: 0x1 tail: 0x0 issued: 0x1 count: 0x1
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.3: ioat_get_current_completion: phys_complete: 0x123f00000
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.3: __cleanup: head: 0x1 tail: 0x0 issued: 0x1
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.3: desc[0]: (0x123f00000->0x123f00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.3: __cleanup: cancel completion timeout
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.3: ioat_check_space_lock: num_descs: 1 (1:1:1)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.3: desc[1]: (0x123f00040->0x123f00080) cookie: 0 flags: 0x0 ctl: 0x00000000 (op: 0x0 int_en: 0 compl: 0)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.3: desc[1]: (0x123f00040->0x123f00080) cookie: 0 flags: 0x1 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.3: ioat_tx_submit_unlock: cookie: 2
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.3: __ioat_issue_pending: head: 0x2 tail: 0x1 issued: 0x2 count: 0x2
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.3: ioat_get_current_completion: phys_complete: 0x123f00040
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.3: __cleanup: head: 0x2 tail: 0x1 issued: 0x2
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.3: desc[1]: (0x123f00040->0x123f00080) cookie: 2 flags: 0x3 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.3: __cleanup: cancel completion timeout
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.3: ioat_get_current_completion: phys_complete: 0x123f00040
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.3: freeing 65536 idle descriptors
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.3: ioat_xor_val_self_test
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.4: ioat_enumerate_channels: xfercap = 1048576
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: RAPL PMU: API unit is 2^-32 Joules, 2 fixed counters, 655360 ms ovfl timer
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: RAPL PMU: hw unit of domain package 2^-14 Joules
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: RAPL PMU: hw unit of domain dram 2^-16 Joules
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: iTCO_vendor_support: vendor-support=0
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.4: __ioat_start_null_desc: head: 0x0 tail: 0x0 issued: 0x0
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.4: desc[0]: (0x10b880000->0x10b880040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.4: __ioat_issue_pending: head: 0x1 tail: 0x0 issued: 0x1 count: 0x1
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.4: ioat_check_space_lock: num_descs: 1 (1:0:1)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.4: desc[1]: (0x10b880040->0x10b880080) cookie: 0 flags: 0x0 ctl: 0x00000000 (op: 0x0 int_en: 0 compl: 0)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.4: desc[1]: (0x10b880040->0x10b880080) cookie: 0 flags: 0x1 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.4: ioat_tx_submit_unlock: cookie: 2
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.4: __ioat_issue_pending: head: 0x2 tail: 0x0 issued: 0x2 count: 0x2
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.4: ioat_get_current_completion: phys_complete: 0x10b880040
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.4: __cleanup: head: 0x2 tail: 0x0 issued: 0x2
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.4: desc[0]: (0x10b880000->0x10b880040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.4: desc[1]: (0x10b880040->0x10b880080) cookie: 2 flags: 0x3 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.4: __cleanup: cancel completion timeout
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.4: ioat_get_current_completion: phys_complete: 0x10b880040
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.4: freeing 65536 idle descriptors
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.4: ioat_xor_val_self_test
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.5: ioat_enumerate_channels: xfercap = 1048576
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.5: __ioat_start_null_desc: head: 0x0 tail: 0x0 issued: 0x0
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.5: desc[0]: (0x123f00000->0x123f00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.5: __ioat_issue_pending: head: 0x1 tail: 0x0 issued: 0x1 count: 0x1
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.5: ioat_check_space_lock: num_descs:
1 (1:0:1) Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.5: desc[1]: (0x123f00040->0x123f00080) cookie: 0 flags: 0x0 ctl: 0x00000000 (op: 0x0 int_en: 0 compl: 0) Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.5: desc[1]: (0x123f00040->0x123f00080) cookie: 0 flags: 0x1 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1) Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.5: ioat_tx_submit_unlock: cookie: 2 Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.5: __ioat_issue_pending: head: 0x2 tail: 0x0 issued: 0x2 count: 0x2 Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.5: ioat_get_current_completion: phys_complete: 0x123f00000 Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.5: __cleanup: head: 0x2 tail: 0x0 issued: 0x2 Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.5: desc[0]: (0x123f00000->0x123f00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1) Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.5: ioat_get_current_completion: phys_complete: 0x123f00040 Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.5: __cleanup: head: 0x2 tail: 0x1 issued: 0x2 Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.5: desc[1]: (0x123f00040->0x123f00080) cookie: 2 flags: 0x3 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1) Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.5: __cleanup: cancel completion timeout Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.5: 
ioat_get_current_completion: phys_complete: 0x123f00040 Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.5: freeing 65536 idle descriptors Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.5: ioat_xor_val_self_test Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.6: ioat_enumerate_channels: xfercap = 1048576 Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: iTCO_wdt iTCO_wdt.1.auto: unable to reset NO_REBOOT flag, device disabled by hardware/BIOS Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com lvm[1836]: 3 logical volume(s) in volume group "cs_hpe-ml350gen9-01" now active Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Found device /dev/mapper/cs_hpe--ml350gen9--01-home. Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Mounting /home... Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: lvm-activate-cs_hpe-ml350gen9-01.service: Deactivated successfully. 
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: XFS (dm-2): Mounting V5 Filesystem
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.6: __ioat_start_null_desc: head: 0x0 tail: 0x0 issued: 0x0
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.6: desc[0]: (0x10b880000->0x10b880040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.6: __ioat_issue_pending: head: 0x1 tail: 0x0 issued: 0x1 count: 0x1
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.6: ioat_check_space_lock: num_descs: 1 (1:0:1)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.6: desc[1]: (0x10b880040->0x10b880080) cookie: 0 flags: 0x0 ctl: 0x00000000 (op: 0x0 int_en: 0 compl: 0)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.6: desc[1]: (0x10b880040->0x10b880080) cookie: 0 flags: 0x1 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.6: ioat_tx_submit_unlock: cookie: 2
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.6: __ioat_issue_pending: head: 0x2 tail: 0x0 issued: 0x2 count: 0x2
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.6: ioat_get_current_completion: phys_complete: 0x10b880040
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.6: __cleanup: head: 0x2 tail: 0x0 issued: 0x2
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.6: desc[0]: (0x10b880000->0x10b880040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.6: desc[1]: (0x10b880040->0x10b880080) cookie: 2 flags: 0x3 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.6: __cleanup: cancel completion timeout
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.6: ioat_get_current_completion: phys_complete: 0x10b880040
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.6: freeing 65536 idle descriptors
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.6: ioat_xor_val_self_test
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.7: ioat_enumerate_channels: xfercap = 1048576
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.7: __ioat_start_null_desc: head: 0x0 tail: 0x0 issued: 0x0
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.7: desc[0]: (0x123f00000->0x123f00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.7: __ioat_issue_pending: head: 0x1 tail: 0x0 issued: 0x1 count: 0x1
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.7: ioat_get_current_completion: phys_complete: 0x123f00000
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.7: __cleanup: head: 0x1 tail: 0x0 issued: 0x1
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.7: desc[0]: (0x123f00000->0x123f00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.7: __cleanup: cancel completion timeout
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.7: ioat_check_space_lock: num_descs: 1 (1:1:1)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.7: desc[1]: (0x123f00040->0x123f00080) cookie: 0 flags: 0x0 ctl: 0x00000000 (op: 0x0 int_en: 0 compl: 0)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.7: desc[1]: (0x123f00040->0x123f00080) cookie: 0 flags: 0x1 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.7: ioat_tx_submit_unlock: cookie: 2
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.7: __ioat_issue_pending: head: 0x2 tail: 0x1 issued: 0x2 count: 0x2
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.7: ioat_get_current_completion: phys_complete: 0x123f00040
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.7: __cleanup: head: 0x2 tail: 0x1 issued: 0x2
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.7: desc[1]: (0x123f00040->0x123f00080) cookie: 2 flags: 0x3 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.7: __cleanup: cancel completion timeout
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.7: ioat_get_current_completion: phys_complete: 0x123f00040
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.7: freeing 65536 idle descriptors
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: XFS (dm-2): Ending clean mount
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:00:04.7: ioat_xor_val_self_test
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.0: ioat_enumerate_channels: xfercap = 1048576
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Mounted /home.
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Local File Systems.
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Rebuild Dynamic Linker Cache was skipped because no trigger condition checks were met.
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Store a System Token in an EFI Variable was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Automatic Boot Loader Update...
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Fri 2023-01-27 16:12:07 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Create Volatile Files and Directories...
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com bootctl[1874]: Couldn't find EFI system partition, skipping.
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Automatic Boot Loader Update.
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.0: __ioat_start_null_desc: head: 0x0 tail: 0x0 issued: 0x0
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.0: desc[0]: (0x6b1d00000->0x6b1d00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.0: __ioat_issue_pending: head: 0x1 tail: 0x0 issued: 0x1 count: 0x1
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.0: ioat_check_space_lock: num_descs: 1 (1:0:1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.0: desc[1]: (0x6b1d00040->0x6b1d00080) cookie: 0 flags: 0x0 ctl: 0x00000000 (op: 0x0 int_en: 0 compl: 0)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.0: desc[1]: (0x6b1d00040->0x6b1d00080) cookie: 0 flags: 0x1 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.0: ioat_tx_submit_unlock: cookie: 2
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.0: ioat_get_current_completion: phys_complete: 0x6b1d00000
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.0: __cleanup: head: 0x2 tail: 0x0 issued: 0x1
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.0: __ioat_issue_pending: head: 0x2 tail: 0x0 issued: 0x2 count: 0x2
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.0: desc[0]: (0x6b1d00000->0x6b1d00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.0: ioat_get_current_completion: phys_complete: 0x6b1d00040
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.0: __cleanup: head: 0x2 tail: 0x1 issued: 0x2
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.0: desc[1]: (0x6b1d00040->0x6b1d00080) cookie: 2 flags: 0x3 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.0: __cleanup: cancel completion timeout
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.0: ioat_get_current_completion: phys_complete: 0x6b1d00040
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.0: freeing 65536 idle descriptors
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.0: ioat_xor_val_self_test
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.1: ioat_enumerate_channels: xfercap = 1048576
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.1: __ioat_start_null_desc: head: 0x0 tail: 0x0 issued: 0x0
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.1: desc[0]: (0x6b1d00000->0x6b1d00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.1: __ioat_issue_pending: head: 0x1 tail: 0x0 issued: 0x1 count: 0x1
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.1: ioat_check_space_lock: num_descs: 1 (1:0:1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.1: desc[1]: (0x6b1d00040->0x6b1d00080) cookie: 0 flags: 0x0 ctl: 0x00000000 (op: 0x0 int_en: 0 compl: 0)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.1: desc[1]: (0x6b1d00040->0x6b1d00080) cookie: 0 flags: 0x1 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.1: ioat_tx_submit_unlock: cookie: 2
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.1: __ioat_issue_pending: head: 0x2 tail: 0x0 issued: 0x2 count: 0x2
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.1: ioat_get_current_completion: phys_complete: 0x6b1d00040
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.1: __cleanup: head: 0x2 tail: 0x0 issued: 0x2
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.1: desc[0]: (0x6b1d00000->0x6b1d00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.1: desc[1]: (0x6b1d00040->0x6b1d00080) cookie: 2 flags: 0x3 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.1: __cleanup: cancel completion timeout
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.1: ioat_get_current_completion: phys_complete: 0x6b1d00040
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.1: ioat_get_current_completion: phys_complete: 0x6b1d00040
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.1: freeing 65536 idle descriptors
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.1: ioat_xor_val_self_test
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.2: ioat_enumerate_channels: xfercap = 1048576
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.2: __ioat_start_null_desc: head: 0x0 tail: 0x0 issued: 0x0
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.2: desc[0]: (0x6b1d00000->0x6b1d00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.2: __ioat_issue_pending: head: 0x1 tail: 0x0 issued: 0x1 count: 0x1
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.2: ioat_check_space_lock: num_descs: 1 (1:0:1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.2: desc[1]: (0x6b1d00040->0x6b1d00080) cookie: 0 flags: 0x0 ctl: 0x00000000 (op: 0x0 int_en: 0 compl: 0)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.2: desc[1]: (0x6b1d00040->0x6b1d00080) cookie: 0 flags: 0x1 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.2: ioat_tx_submit_unlock: cookie: 2
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.2: __ioat_issue_pending: head: 0x2 tail: 0x0 issued: 0x2 count: 0x2
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.2: ioat_get_current_completion: phys_complete: 0x6b1d00040
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.2: __cleanup: head: 0x2 tail: 0x0 issued: 0x2
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.2: desc[0]: (0x6b1d00000->0x6b1d00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.2: desc[1]: (0x6b1d00040->0x6b1d00080) cookie: 2 flags: 0x3 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.2: __cleanup: cancel completion timeout
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.2: ioat_get_current_completion: phys_complete: 0x6b1d00040
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.2: freeing 65536 idle descriptors
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.2: ioat_xor_val_self_test
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.3: ioat_enumerate_channels: xfercap = 1048576
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.3: __ioat_start_null_desc: head: 0x0 tail: 0x0 issued: 0x0
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.3: desc[0]: (0x6b1d00000->0x6b1d00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.3: __ioat_issue_pending: head: 0x1 tail: 0x0 issued: 0x1 count: 0x1
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.3: ioat_check_space_lock: num_descs: 1 (1:0:1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.3: desc[1]: (0x6b1d00040->0x6b1d00080) cookie: 0 flags: 0x0 ctl: 0x00000000 (op: 0x0 int_en: 0 compl: 0)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.3: desc[1]: (0x6b1d00040->0x6b1d00080) cookie: 0 flags: 0x1 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.3: ioat_tx_submit_unlock: cookie: 2
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.3: __ioat_issue_pending: head: 0x2 tail: 0x0 issued: 0x2 count: 0x2
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.3: ioat_get_current_completion: phys_complete: 0x6b1d00040
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.3: __cleanup: head: 0x2 tail: 0x0 issued: 0x2
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.3: desc[0]: (0x6b1d00000->0x6b1d00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.3: desc[1]: (0x6b1d00040->0x6b1d00080) cookie: 2 flags: 0x3 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.3: __cleanup: cancel completion timeout
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.3: ioat_get_current_completion: phys_complete: 0x6b1d00040
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.3: ioat_get_current_completion: phys_complete: 0x6b1d00040
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.3: freeing 65536 idle descriptors
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.3: ioat_xor_val_self_test
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.4: ioat_enumerate_channels: xfercap = 1048576
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: systemd-tmpfile (1876) used greatest stack depth: 10576 bytes left
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Create Volatile Files and Directories.
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Mounting RPC Pipe File System...
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Security Auditing Service...
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting RPC Bind...
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Rebuild Journal Catalog was skipped because of an unmet condition check (ConditionNeedsUpdate=/var).
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Update is Completed was skipped because no trigger condition checks were met.
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.4: __ioat_start_null_desc: head: 0x0 tail: 0x0 issued: 0x0
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.4: desc[0]: (0x6b1d00000->0x6b1d00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.4: __ioat_issue_pending: head: 0x1 tail: 0x0 issued: 0x1 count: 0x1
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.4: ioat_check_space_lock: num_descs: 1 (1:0:1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.4: desc[1]: (0x6b1d00040->0x6b1d00080) cookie: 0 flags: 0x0 ctl: 0x00000000 (op: 0x0 int_en: 0 compl: 0)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.4: desc[1]: (0x6b1d00040->0x6b1d00080) cookie: 0 flags: 0x1 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.4: ioat_tx_submit_unlock: cookie: 2
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.4: __ioat_issue_pending: head: 0x2 tail: 0x0 issued: 0x2 count: 0x2
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.4: ioat_get_current_completion: phys_complete: 0x6b1d00040
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.4: __cleanup: head: 0x2 tail: 0x0 issued: 0x2
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.4: desc[0]: (0x6b1d00000->0x6b1d00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.4: desc[1]: (0x6b1d00040->0x6b1d00080) cookie: 2 flags: 0x3 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.4: __cleanup: cancel completion timeout
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.4: ioat_get_current_completion: phys_complete: 0x6b1d00040
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.4: freeing 65536 idle descriptors
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.4: ioat_xor_val_self_test
Fri 2023-01-27 16:12:08 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.5: ioat_enumerate_channels: xfercap = 1048576
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started RPC Bind.
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.5: __ioat_start_null_desc: head: 0x0 tail: 0x0 issued: 0x0
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.5: desc[0]: (0x6b1d00000->0x6b1d00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.5: __ioat_issue_pending: head: 0x1 tail: 0x0 issued: 0x1 count: 0x1
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.5: ioat_check_space_lock: num_descs: 1 (1:0:1)
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.5: desc[1]: (0x6b1d00040->0x6b1d00080) cookie: 0 flags: 0x0 ctl: 0x00000000 (op: 0x0 int_en: 0 compl: 0)
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.5: desc[1]: (0x6b1d00040->0x6b1d00080) cookie: 0 flags: 0x1 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.5: ioat_tx_submit_unlock: cookie: 2
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.5: __ioat_issue_pending: head: 0x2 tail: 0x0 issued: 0x2 count: 0x2
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.5: ioat_get_current_completion: phys_complete: 0x6b1d00040
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.5: __cleanup: head: 0x2 tail: 0x0 issued: 0x2
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.5: desc[0]: (0x6b1d00000->0x6b1d00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.5: desc[1]: (0x6b1d00040->0x6b1d00080) cookie: 2 flags: 0x3 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.5: __cleanup: cancel completion timeout
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.5: ioat_get_current_completion: phys_complete: 0x6b1d00040
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.5: ioat_get_current_completion: phys_complete: 0x6b1d00040
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.5: freeing 65536 idle descriptors
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.5: ioat_xor_val_self_test
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.6: ioat_enumerate_channels: xfercap = 1048576
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com auditd[1888]: No plugins found, not dispatching events
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com auditd[1888]: Init complete, auditd 3.0.7 listening for events (startup state enable)
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.6: __ioat_start_null_desc: head: 0x0 tail: 0x0 issued: 0x0
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.6: desc[0]: (0x6b1d00000->0x6b1d00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.6: __ioat_issue_pending: head: 0x1 tail: 0x0 issued: 0x1 count: 0x1
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.6: ioat_check_space_lock: num_descs: 1 (1:0:1)
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.6: desc[1]: (0x6b1d00040->0x6b1d00080) cookie: 0 flags: 0x0 ctl: 0x00000000 (op: 0x0 int_en: 0 compl: 0)
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.6: desc[1]: (0x6b1d00040->0x6b1d00080) cookie: 0 flags: 0x1 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.6: ioat_tx_submit_unlock: cookie: 2
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.6: __ioat_issue_pending: head: 0x2 tail: 0x0 issued: 0x2 count: 0x2
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.6: ioat_get_current_completion: phys_complete: 0x6b1d00000
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.6: __cleanup: head: 0x2 tail: 0x0 issued: 0x2
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.6: desc[0]: (0x6b1d00000->0x6b1d00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.6: ioat_get_current_completion: phys_complete: 0x6b1d00040
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.6: __cleanup: head: 0x2 tail: 0x1 issued: 0x2
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.6: desc[1]: (0x6b1d00040->0x6b1d00080) cookie: 2 flags: 0x3 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1)
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.6: __cleanup: cancel completion timeout
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.6: ioat_get_current_completion: phys_complete: 0x6b1d00040
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.6: freeing 65536 idle descriptors
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.6: ioat_xor_val_self_test
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.7: ioat_enumerate_channels: xfercap = 1048576
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_init:
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2fa0
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2fa0
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2fa0
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2fa0
Fri 2023-01-27 16:12:09 EST
hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2fa0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2f60 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2f60 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2f60 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2f60 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2f60 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2fa8 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2fa8 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2fa8 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2fa8 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2fa8 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2f71 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2f71 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2f71 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2f71 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC 
sbridge: Seeking for: PCI ID 8086:2f71 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2faa Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2faa Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2faa Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2faa Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2faa Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2fab Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2fab Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2fab Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2fab Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2fab Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2fac Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2fad Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2f68 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2f68 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2f68 Fri 2023-01-27 16:12:09 EST 
hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2f68 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2f68 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2f79 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2f79 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2f79 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2f79 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2f79 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2f6a Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2f6a Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2f6a Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2f6a Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2f6a Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2f6b Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2f6b Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2f6b Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: 
sbridge_get_onedevice: Detected 8086:2f6b Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2f6b Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2f6c Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2f6d Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2ffc Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2ffc Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2ffc Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2ffc Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2ffc Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2ffd Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2ffd Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2ffd Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2ffd Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2ffd Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2fbd Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2fbd Fri 2023-01-27 16:12:09 EST 
hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2fbd Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2fbd Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2fbd Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2fbf Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2fbf Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2fbf Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2fbf Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2fbf Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2fb9 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2fb9 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2fb9 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2fb9 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2fb9 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2fbb Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2fbb Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC 
sbridge: Seeking for: PCI ID 8086:2fbb Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_get_onedevice: Detected 8086:2fbb Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Seeking for: PCI ID 8086:2fbb Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_probe: Registering MC#0 (1 of 4) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: edac_mc_alloc: allocating 3488 bytes for mci data (18 dimms, 18 csrows/channels) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_register_mci: MC: mci = 0000000032d2bb40, dev = 00000000994005ef Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI 7f.18.0 with dev = 000000009edecc7b Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI 7f.19.0 with dev = 000000009e13fa41 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI 7f.19.1 with dev = 0000000015ebd4f2 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI 7f.19.2 with dev = 00000000b15faf6b Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI 7f.19.3 with dev = 00000000aae6056a Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI 7f.15.4 with dev = 00000000e5bfe8f4 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI 7f.15.5 with dev = 0000000076c00b54 Fri 2023-01-27 16:12:09 EST 
hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI 7f.20.5 with dev = 00000000ade17971 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI 7f.20.7 with dev = 0000000004d8fc61 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI 7f.23.5 with dev = 000000005e02cb73 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI 7f.23.7 with dev = 000000007cba91ba Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_dimm_config: mc#0: Node ID: 0, source ID: 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_dimm_config: Memory mirroring is disabled Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_dimm_config: Lockstep is disabled Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_dimm_config: address map is on open page mode Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: __populate_dimms: Memory is registered Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: __populate_dimms: mc#0: ha 0 channel 0, dimm 0, 8192 MiB (2097152 pages) bank: 16, rank: 1, row: 0x10000, col: 0x400 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: __populate_dimms: mc#0: ha 0 channel 1, dimm 0, 8192 MiB (2097152 pages) bank: 16, rank: 1, row: 0x10000, col: 0x400 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TOLM: 2.000 GB (0x000000007fffffff) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: 
get_memory_layout: TOHM: 49.968 GB (0x0000000c7dffffff) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#0 DRAM up to 18.000 GB (0x0000000480000000) Interleave: [8:6]XOR[18:16] reg=0x040047c1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#0, interleave #0: 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#1 DRAM up to 26.000 GB (0x0000000680000000) Interleave: [8:6]XOR[18:16] reg=0x040067c1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#1, interleave #0: 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#2 DRAM up to 42.000 GB (0x0000000a80000000) Interleave: [8:6]XOR[18:16] reg=0x0400a7c1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#2, interleave #0: 1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#3 DRAM up to 50.000 GB (0x0000000c80000000) Interleave: [8:6]XOR[18:16] reg=0x0400c7c1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#3, interleave #0: 1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TAD#0: up to 2.000 GB (0x0000000080000000), socket interleave 2, memory interleave 2, TGT: 0, 1, 0, 0, reg=0x0001f504 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TAD#1: up to 18.000 GB (0x0000000480000000), socket interleave 2, memory interleave 2, TGT: 0, 1, 0, 0, reg=0x0011f504 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TAD#2: up to 
26.000 GB (0x0000000680000000), socket interleave 1, memory interleave 2, TGT: 0, 1, 0, 0, reg=0x0019f104 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TAD CH#0, offset #0: 0.000 GB (0x0000000000000000), reg=0x00000000 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TAD CH#0, offset #1: 2.000 GB (0x0000000080000000), reg=0x00000800 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TAD CH#0, offset #2: 10.000 GB (0x0000000280000000), reg=0x00002800 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TAD CH#1, offset #0: 0.000 GB (0x0000000000000000), reg=0x00000000 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TAD CH#1, offset #1: 2.000 GB (0x0000000080000000), reg=0x00000800 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TAD CH#1, offset #2: 10.000 GB (0x0000000280000000), reg=0x00002800 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: CH#0 RIR#0, limit: 7.999 GB (0x00000001fff00000), way: 1, reg=0x8000001e Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: CH#0 RIR#0 INTL#0, offset 0.000 GB (0x0000000000000000), tgt: 0, reg=0x00000000 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: CH#1 RIR#0, limit: 7.999 GB (0x00000001fff00000), way: 1, reg=0x8000001e Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: CH#1 RIR#0 INTL#0, offset 0.000 GB (0x0000000000000000), tgt: 0, reg=0x00000000 Fri 2023-01-27 16:12:09 EST 
hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: edac_mc_add_mc_with_groups: Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: edac_create_sysfs_mci_device: device mc0 created Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: edac_create_dimm_object: device dimm0 created at location channel 0 slot 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: edac_create_dimm_object: device dimm3 created at location channel 1 slot 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: edac_create_csrow_object: device csrow0 created Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC MC0: Giving out device to module sb_edac controller Haswell SrcID#0_Ha#0: DEV 0000:7f:12.0 (INTERRUPT) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_probe: Registering MC#1 (2 of 4) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: edac_mc_alloc: allocating 3488 bytes for mci data (18 dimms, 18 csrows/channels) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_register_mci: MC: mci = 000000009f61ef12, dev = 0000000070e611af Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI ff.18.0 with dev = 00000000bcc3c6ea Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI ff.19.0 with dev = 00000000f41b08da Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI ff.19.1 with dev = 00000000723f1eac Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: 
Associated PCI ff.19.2 with dev = 0000000056b57b74 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI ff.19.3 with dev = 000000009a88a00b Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI ff.15.4 with dev = 000000003a3af3fd Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI ff.15.5 with dev = 0000000073247d81 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI ff.20.5 with dev = 000000004a8fbda8 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI ff.20.7 with dev = 000000002c71f5e7 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI ff.23.5 with dev = 00000000f3fa00a2 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI ff.23.7 with dev = 00000000ced0d112 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_dimm_config: mc#1: Node ID: 1, source ID: 1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_dimm_config: Memory mirroring is disabled Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_dimm_config: Lockstep is disabled Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_dimm_config: address map is on open page mode Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: __populate_dimms: Memory is registered Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: 
__populate_dimms: mc#1: ha 0 channel 0, dimm 0, 8192 MiB (2097152 pages) bank: 16, rank: 1, row: 0x10000, col: 0x400 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: __populate_dimms: mc#1: ha 0 channel 1, dimm 0, 8192 MiB (2097152 pages) bank: 16, rank: 1, row: 0x10000, col: 0x400 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TOLM: 2.000 GB (0x000000007fffffff) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TOHM: 49.968 GB (0x0000000c7dffffff) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#0 DRAM up to 18.000 GB (0x0000000480000000) Interleave: [8:6]XOR[18:16] reg=0x040047c1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#0, interleave #0: 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#1 DRAM up to 26.000 GB (0x0000000680000000) Interleave: [8:6]XOR[18:16] reg=0x040067c1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#1, interleave #0: 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#2 DRAM up to 42.000 GB (0x0000000a80000000) Interleave: [8:6]XOR[18:16] reg=0x0400a7c1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#2, interleave #0: 1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#3 DRAM up to 50.000 GB (0x0000000c80000000) Interleave: [8:6]XOR[18:16] reg=0x0400c7c1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#3, interleave #0: 1 Fri 2023-01-27 
16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TAD#0: up to 42.000 GB (0x0000000a80000000), socket interleave 2, memory interleave 2, TGT: 0, 1, 0, 0, reg=0x0029f504 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TAD#1: up to 50.000 GB (0x0000000c80000000), socket interleave 1, memory interleave 2, TGT: 0, 1, 0, 0, reg=0x0031f104 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TAD CH#0, offset #0: 26.000 GB (0x0000000680000000), reg=0x00006800 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TAD CH#0, offset #1: 34.000 GB (0x0000000880000000), reg=0x00008800 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TAD CH#1, offset #0: 26.000 GB (0x0000000680000000), reg=0x00006800 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TAD CH#1, offset #1: 34.000 GB (0x0000000880000000), reg=0x00008800 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: CH#0 RIR#0, limit: 7.999 GB (0x00000001fff00000), way: 1, reg=0x8000001e Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: CH#0 RIR#0 INTL#0, offset 0.000 GB (0x0000000000000000), tgt: 0, reg=0x00000000 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: CH#1 RIR#0, limit: 7.999 GB (0x00000001fff00000), way: 1, reg=0x8000001e Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: CH#1 RIR#0 INTL#0, offset 0.000 GB (0x0000000000000000), tgt: 0, reg=0x00000000 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com 
kernel: EDAC DEBUG: edac_mc_add_mc_with_groups: Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: edac_create_sysfs_mci_device: device mc1 created Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: edac_create_dimm_object: device dimm0 created at location channel 0 slot 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: edac_create_dimm_object: device dimm3 created at location channel 1 slot 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: edac_create_csrow_object: device csrow0 created Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC MC1: Giving out device to module sb_edac controller Haswell SrcID#1_Ha#0: DEV 0000:ff:12.0 (INTERRUPT) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_probe: Registering MC#2 (3 of 4) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: edac_mc_alloc: allocating 3488 bytes for mci data (18 dimms, 18 csrows/channels) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_register_mci: MC: mci = 00000000c700e3dd, dev = 00000000acf3c576 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI 7f.18.4 with dev = 000000002d2e5115 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI 7f.22.0 with dev = 0000000023e38387 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI 7f.22.1 with dev = 000000001a1706b3 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI 7f.22.2 with dev = 0000000020849761 
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI 7f.22.3 with dev = 000000002b47839f Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI 7f.15.4 with dev = 00000000e5bfe8f4 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI 7f.15.5 with dev = 0000000076c00b54 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI 7f.20.5 with dev = 00000000ade17971 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI 7f.20.7 with dev = 0000000004d8fc61 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI 7f.23.5 with dev = 000000005e02cb73 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI 7f.23.7 with dev = 000000007cba91ba Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_dimm_config: mc#2: Node ID: 0, source ID: 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_dimm_config: Memory mirroring is disabled Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_dimm_config: Lockstep is disabled Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_dimm_config: address map is on open page mode Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: __populate_dimms: Memory is registered Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: __populate_dimms: mc#2: ha 1 channel 0, dimm 0, 
8192 MiB (2097152 pages) bank: 16, rank: 1, row: 0x10000, col: 0x400 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TOLM: 2.000 GB (0x000000007fffffff) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TOHM: 49.968 GB (0x0000000c7dffffff) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#0 DRAM up to 18.000 GB (0x0000000480000000) Interleave: [8:6]XOR[18:16] reg=0x040047c1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#0, interleave #0: 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#1 DRAM up to 26.000 GB (0x0000000680000000) Interleave: [8:6]XOR[18:16] reg=0x040067c1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#1, interleave #0: 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#2 DRAM up to 42.000 GB (0x0000000a80000000) Interleave: [8:6]XOR[18:16] reg=0x0400a7c1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#2, interleave #0: 1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#3 DRAM up to 50.000 GB (0x0000000c80000000) Interleave: [8:6]XOR[18:16] reg=0x0400c7c1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#3, interleave #0: 1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TAD#0: up to 2.000 GB (0x0000000080000000), socket interleave 2, memory interleave 1, TGT: 0, 0, 0, 0, reg=0x0001f400 Fri 2023-01-27 16:12:09 EST 
hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TAD#1: up to 18.000 GB (0x0000000480000000), socket interleave 2, memory interleave 1, TGT: 0, 0, 0, 0, reg=0x0011f400 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TAD CH#0, offset #0: 0.000 GB (0x0000000000000000), reg=0x00000000 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TAD CH#0, offset #1: 2.000 GB (0x0000000080000000), reg=0x00000800 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: CH#0 RIR#0, limit: 7.999 GB (0x00000001fff00000), way: 1, reg=0x8000001e Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: CH#0 RIR#0 INTL#0, offset 0.000 GB (0x0000000000000000), tgt: 0, reg=0x00000000 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: edac_mc_add_mc_with_groups: Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: edac_create_sysfs_mci_device: device mc2 created Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: edac_create_dimm_object: device dimm0 created at location channel 0 slot 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: edac_create_csrow_object: device csrow0 created Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC MC2: Giving out device to module sb_edac controller Haswell SrcID#0_Ha#1: DEV 0000:7f:12.4 (INTERRUPT) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_probe: Registering MC#3 (4 of 4) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: edac_mc_alloc: allocating 3488 bytes for mci data (18 dimms, 18 
csrows/channels) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: sbridge_register_mci: MC: mci = 00000000b0929ab1, dev = 0000000030a54ce0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI ff.18.4 with dev = 0000000028df041f Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI ff.22.0 with dev = 000000006166e16a Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI ff.22.1 with dev = 000000006ebb2804 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI ff.22.2 with dev = 000000006a8e3d1a Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI ff.22.3 with dev = 00000000107f26e5 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI ff.15.4 with dev = 000000003a3af3fd Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI ff.15.5 with dev = 0000000073247d81 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI ff.20.5 with dev = 000000004a8fbda8 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI ff.20.7 with dev = 000000002c71f5e7 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI ff.23.5 with dev = 00000000f3fa00a2 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: haswell_mci_bind_devs: Associated PCI ff.23.7 with 
dev = 00000000ced0d112 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_dimm_config: mc#3: Node ID: 1, source ID: 1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_dimm_config: Memory mirroring is disabled Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_dimm_config: Lockstep is disabled Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_dimm_config: address map is on open page mode Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: __populate_dimms: Memory is registered Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: __populate_dimms: mc#3: ha 1 channel 0, dimm 0, 8192 MiB (2097152 pages) bank: 16, rank: 1, row: 0x10000, col: 0x400 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TOLM: 2.000 GB (0x000000007fffffff) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TOHM: 49.968 GB (0x0000000c7dffffff) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#0 DRAM up to 18.000 GB (0x0000000480000000) Interleave: [8:6]XOR[18:16] reg=0x040047c1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#0, interleave #0: 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#1 DRAM up to 26.000 GB (0x0000000680000000) Interleave: [8:6]XOR[18:16] reg=0x040067c1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#1, interleave #0: 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: 
get_memory_layout: SAD#2 DRAM up to 42.000 GB (0x0000000a80000000) Interleave: [8:6]XOR[18:16] reg=0x0400a7c1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#2, interleave #0: 1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#3 DRAM up to 50.000 GB (0x0000000c80000000) Interleave: [8:6]XOR[18:16] reg=0x0400c7c1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: SAD#3, interleave #0: 1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TAD#0: up to 42.000 GB (0x0000000a80000000), socket interleave 2, memory interleave 1, TGT: 0, 0, 0, 0, reg=0x0029f400 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: TAD CH#0, offset #0: 26.000 GB (0x0000000680000000), reg=0x00006800 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: CH#0 RIR#0, limit: 7.999 GB (0x00000001fff00000), way: 1, reg=0x8000001e Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: get_memory_layout: CH#0 RIR#0 INTL#0, offset 0.000 GB (0x0000000000000000), tgt: 0, reg=0x00000000 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: edac_mc_add_mc_with_groups: Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: edac_create_sysfs_mci_device: device mc3 created Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: edac_create_dimm_object: device dimm0 created at location channel 0 slot 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC DEBUG: edac_create_csrow_object: device csrow0 created Fri 2023-01-27 16:12:09 EST 
hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC MC3: Giving out device to module sb_edac controller Haswell SrcID#1_Ha#1: DEV 0000:ff:12.4 (INTERRUPT) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: EDAC sbridge: Ver: 1.1.2 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-udevd[1765]: could not read from '/sys/module/acpi_cpufreq/initstate': No such device Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-udevd[1784]: could not read from '/sys/module/acpi_cpufreq/initstate': No such device Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-udevd[1826]: could not read from '/sys/module/acpi_cpufreq/initstate': No such device Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-udevd[1799]: could not read from '/sys/module/acpi_cpufreq/initstate': No such device Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-udevd[1827]: could not read from '/sys/module/acpi_cpufreq/initstate': No such device Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-udevd[1779]: could not read from '/sys/module/acpi_cpufreq/initstate': No such device Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.7: __ioat_start_null_desc: head: 0x0 tail: 0x0 issued: 0x0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.7: desc[0]: (0x6b1d00000->0x6b1d00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.7: __ioat_issue_pending: head: 0x1 tail: 0x0 issued: 0x1 count: 0x1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.7: ioat_get_current_completion: phys_complete: 0x6b1d00000 Fri 2023-01-27 16:12:09 EST 
hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.7: __cleanup: head: 0x1 tail: 0x0 issued: 0x1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.7: desc[0]: (0x6b1d00000->0x6b1d00040) cookie: 0 flags: 0x2 ctl: 0x00000029 (op: 0x0 int_en: 1 compl: 1) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.7: __cleanup: cancel completion timeout Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.7: ioat_check_space_lock: num_descs: 1 (1:1:1) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.7: desc[1]: (0x6b1d00040->0x6b1d00080) cookie: 0 flags: 0x0 ctl: 0x00000000 (op: 0x0 int_en: 0 compl: 0) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.7: desc[1]: (0x6b1d00040->0x6b1d00080) cookie: 0 flags: 0x1 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.7: ioat_tx_submit_unlock: cookie: 2 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.7: __ioat_issue_pending: head: 0x2 tail: 0x1 issued: 0x2 count: 0x2 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.7: ioat_get_current_completion: phys_complete: 0x6b1d00040 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.7: __cleanup: head: 0x2 tail: 0x1 issued: 0x2 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.7: desc[1]: (0x6b1d00040->0x6b1d00080) cookie: 2 flags: 0x3 ctl: 0x00000009 (op: 0x0 int_en: 1 compl: 1) Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.7: __cleanup: cancel completion timeout Fri 2023-01-27 16:12:09 
EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.7: ioat_get_current_completion: phys_complete: 0x6b1d00040 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.7: freeing 65536 idle descriptors Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: ioatdma 0000:80:04.7: ioat_xor_val_self_test Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-udevd[1748]: could not read from '/sys/module/acpi_cpufreq/initstate': No such device Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-udevd[1794]: could not read from '/sys/module/acpi_cpufreq/initstate': No such device Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1891]: /sbin/augenrules: No change Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: No rules Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: enabled 1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: failure 1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: pid 1888 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: rate_limit 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: backlog_limit 8192 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: lost 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: backlog 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: backlog_wait_time 60000 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: backlog_wait_time_actual 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: enabled 1 
Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: failure 1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: pid 1888 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: rate_limit 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: backlog_limit 8192 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: lost 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: backlog 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: backlog_wait_time 60000 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: backlog_wait_time_actual 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: enabled 1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: failure 1 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: pid 1888 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: rate_limit 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: backlog_limit 8192 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: lost 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: backlog 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: backlog_wait_time 60000 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com augenrules[1902]: backlog_wait_time_actual 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: mgag200 0000:01:00.1: vgaarb: deactivate vga console Fri 2023-01-27 16:12:09 EST 
hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started Security Auditing Service. Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Console: switching to colour dummy device 80x25 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: [drm] Initialized mgag200 1.0.0 20110418 for 0000:01:00.1 on minor 0 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Record System Boot/Shutdown in UTMP... Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: fbcon: mgag200drmfb (fb0) is primary device Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Record System Boot/Shutdown in UTMP. Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: Console: switching to colour frame buffer device 128x48 Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: mgag200 0000:01:00.1: [drm] fb0: mgag200drmfb frame buffer device Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target System Initialization. Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started CUPS Scheduler. Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started dnf makecache --timer. Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started Daily Cleanup of Temporary Directories. Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Path Units. Fri 2023-01-27 16:12:09 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Listening on Avahi mDNS/DNS-SD Stack Activation Socket. Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Listening on CUPS Scheduler. 
Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Listening on D-Bus System Message Bus Socket. Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Listening on Open-iSCSI iscsid Socket. Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Listening on Open-iSCSI iscsiuio Socket. Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket. Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Socket Units. Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting D-Bus System Message Bus... Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started D-Bus System Message Bus. Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Basic System. Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Network Manager... Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com dbus-broker-lau[1921]: Ready Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Avahi mDNS/DNS-SD Stack... Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting NTP client/server... Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Found user 'avahi' (UID 70) and group 'avahi' (GID 70). Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Restore /run/initramfs on shutdown... 
Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started irqbalance daemon. Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Low Memory Monitor was skipped because of an unmet condition check (ConditionPathExists=/proc/pressure). Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Software RAID monitoring and management was skipped because of an unmet condition check (ConditionPathExists=/etc/mdadm.conf). Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Load CPU microcode update... Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Authorization Manager... Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: intel_rapl_common: Found RAPL domain package Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: intel_rapl_common: Found RAPL domain dram Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: intel_rapl_common: DRAM domain energy unit 15300pj Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: intel_rapl_common: Found RAPL domain package Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: intel_rapl_common: Found RAPL domain dram Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: intel_rapl_common: DRAM domain energy unit 15300pj Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started Hardware RNG Entropy Gatherer Daemon. Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting System Logging Service... Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because no trigger condition checks were met. 
Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because no trigger condition checks were met. Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: OpenSSH rsa Server Key Generation was skipped because no trigger condition checks were met. Fri 2023-01-27 16:12:10 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target sshd-keygen.target. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target User and Group Name Lookups. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com rngd[1934]: Disabling 7: PKCS11 Entropy generator (pkcs11) Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com rngd[1934]: Disabling 5: NIST Network Entropy Beacon (nist) Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com rngd[1934]: Disabling 9: Qrypt quantum entropy beacon (qrypt) Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com rngd[1934]: Initializing available sources Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com rngd[1934]: [hwrng ]: Initialization Failed Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com rngd[1934]: [rdrand]: Enabling RDRAND rng support Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com rngd[1934]: [rdrand]: Initialized Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com rngd[1934]: [jitter]: JITTER timeout set to 5 sec Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com rsyslogd[1936]: [origin software="rsyslogd" swVersion="8.2102.0-109.el9" x-pid="1936" x-info="https://www.rsyslog.com"] start Fri 2023-01-27 16:12:11 EST 
hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting User Login Management... Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started System Logging Service. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com rngd[1934]: [jitter]: Initializing AES buffer Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853931.1533] NetworkManager (version 1.41.8-1.el9) is starting... (boot:5a6cfe1b-d77e-4522-8306-5caca85a1cf6) Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853931.1566] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf) Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Restore /run/initramfs on shutdown. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Successfully dropped root privileges. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853931.1852] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager" Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: avahi-daemon 0.8 starting up. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: WARNING: No NSS support for mDNS detected, consider installing nss-mdns! Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started Network Manager. 
Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com polkitd[1932]: Started polkitd version 0.117 Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com chronyd[1957]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG) Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started Avahi mDNS/DNS-SD Stack. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Successfully called chroot(). Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Network. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Successfully dropped remaining capabilities. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: No service file found in /etc/avahi/services. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Joining mDNS multicast group on interface lo.IPv6 with address ::1. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com chronyd[1957]: Frequency -31.655 +/- 0.227 ppm read from /var/lib/chrony/drift Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: New relevant interface lo.IPv6 for mDNS. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Joining mDNS multicast group on interface lo.IPv4 with address 127.0.0.1. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: New relevant interface lo.IPv4 for mDNS. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Network interface enumeration completed. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Registering new address record for ::1 on lo.*. 
Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Registering new address record for 127.0.0.1 on lo.IPv4. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com chronyd[1957]: Using right/UTC timezone to obtain leap second data Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com chronyd[1957]: Loaded seccomp filter (level 2) Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Network Manager Wait Online... Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853931.2942] manager[0x563fb0a4d020]: monitoring kernel firmware directory '/lib/firmware'. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting CUPS Scheduler... Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting GSSAPI Proxy Daemon... Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting OpenSSH server daemon... Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com polkitd[1932]: Loading rules from directory /etc/polkit-1/rules.d Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com polkitd[1932]: Loading rules from directory /usr/share/polkit-1/rules.d Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com polkitd[1932]: Finished loading, compiling and executing 4 rules Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com polkitd[1932]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Dynamic System Tuning Daemon... Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started NTP client/server. 
Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started Authorization Manager. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com sshd[1984]: main: sshd: ssh-rsa algorithm is disabled Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com sshd[1984]: Server listening on 0.0.0.0 port 22. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com sshd[1984]: Server listening on :: port 22. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Wait for chrony to synchronize system clock... Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Hostname Service... Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started OpenSSH server daemon. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started GSSAPI Proxy Daemon. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-logind[1947]: New seat seat0. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started CUPS Scheduler. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-logind[1947]: Watching system buttons on /dev/input/event0 (Power Button) Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started User Login Management. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Created slice User Slice of UID 0. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting User Runtime Directory /run/user/0... Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: microcode.service: Deactivated successfully. Fri 2023-01-27 16:12:11 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Load CPU microcode update. 
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished User Runtime Directory /run/user/0.
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Server startup complete. Host name is hpe-ml350gen9-01.local. Local service cookie is 2227680302.
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting User Manager for UID 0...
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started Hostname Service.
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853932.2184] hostname: hostname: using hostnamed
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853932.2191] hostname: static hostname changed from (none) to "hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com"
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853932.2214] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com rsyslogd[1936]: imjournal: journal files changed, reloading... [v8.2102.0-109.el9 try https://www.rsyslog.com/e/0 ]
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2025]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: RPC: Registered named UNIX socket transport module.
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: RPC: Registered udp transport module.
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: RPC: Registered tcp transport module.
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Mounted RPC Pipe File System.
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target rpc_pipefs.target.
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target NFS client services.
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853932.9013] manager[0x563fb0a4d020]: rfkill: Wi-Fi hardware radio set enabled
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853932.9018] manager[0x563fb0a4d020]: rfkill: WWAN hardware radio set enabled
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853932.9353] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.41.8-1.el9/libnm-device-plugin-team.so)
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853932.9356] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853932.9359] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853932.9361] manager: Networking is enabled by state file
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853932.9495] settings: Loaded settings plugin: keyfile (internal)
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853932.9570] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.41.8-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853932.9749] dhcp: init: Using DHCP client 'internal'
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853932.9758] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853932.9822] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853932.9863] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853932.9891] device (lo): Activation: starting connection 'lo' (94866ef5-6cb7-48fb-ab4a-5501b91f402b)
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Network Manager Script Dispatcher Service...
Fri 2023-01-27 16:12:12 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853932.9997] manager: (eno1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853933.0006] device (eno1): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started Network Manager Script Dispatcher Service.
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853933.2073] manager: (eno2): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853933.2081] device (eno2): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853933.3998] manager: (eno3): new Ethernet device (/org/freedesktop/NetworkManager/Devices/4)
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853933.4007] device (eno3): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853933.5790] manager: (eno4): new Ethernet device (/org/freedesktop/NetworkManager/Devices/5)
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853933.5800] device (eno4): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853933.7647] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853933.7657] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853933.7662] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853933.7682] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853933.7813] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853933.7819] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853933.7839] device (lo): Activation: successful, device activated.
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2025]: Queued start job for default target Main User Target.
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2025]: Created slice User Application Slice.
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2025]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2025]: Started Daily Cleanup of User's Temporary Directories.
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2025]: Reached target Paths.
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2025]: Reached target Timers.
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2025]: Starting D-Bus User Message Bus Socket...
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2025]: PipeWire PulseAudio was skipped because of an unmet condition check (ConditionUser=!root).
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2025]: Listening on PipeWire Multimedia System Socket.
Fri 2023-01-27 16:12:13 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2025]: Starting Create User's Volatile Files and Directories...
Fri 2023-01-27 16:12:14 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2025]: Listening on D-Bus User Message Bus Socket.
Fri 2023-01-27 16:12:14 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2025]: Reached target Sockets.
Fri 2023-01-27 16:12:14 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2025]: Finished Create User's Volatile Files and Directories.
Fri 2023-01-27 16:12:14 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2025]: Reached target Basic System.
Fri 2023-01-27 16:12:14 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2025]: Reached target Main User Target.
Fri 2023-01-27 16:12:14 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[2025]: Startup finished in 1.406s.
Fri 2023-01-27 16:12:14 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started User Manager for UID 0.
Fri 2023-01-27 16:12:15 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com rngd[1934]: [jitter]: Enabling JITTER rng support
Fri 2023-01-27 16:12:15 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com rngd[1934]: [jitter]: Initialized
Fri 2023-01-27 16:12:15 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com rngd[1934]: Process privileges have been dropped to 2:2
Fri 2023-01-27 16:12:16 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tg3 0000:02:00.0 eno1: Link is up at 1000 Mbps, full duplex
Fri 2023-01-27 16:12:16 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tg3 0000:02:00.0 eno1: Flow control is off for TX and off for RX
Fri 2023-01-27 16:12:16 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: tg3 0000:02:00.0 eno1: EEE is disabled
Fri 2023-01-27 16:12:16 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eno1: link becomes ready
Fri 2023-01-27 16:12:16 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853936.3155] device (eno1): carrier: link connected
Fri 2023-01-27 16:12:16 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853936.3174] device (eno1): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Fri 2023-01-27 16:12:16 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853936.3462] policy: auto-activating connection 'eno1' (3ba78b54-a58b-41e0-92ce-a1e477373ae3)
Fri 2023-01-27 16:12:16 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853936.3472] device (eno1): Activation: starting connection 'eno1' (3ba78b54-a58b-41e0-92ce-a1e477373ae3)
Fri 2023-01-27 16:12:16 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853936.3477] device (eno1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Fri 2023-01-27 16:12:16 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853936.3484] manager: NetworkManager state is now CONNECTING
Fri 2023-01-27 16:12:16 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853936.3489] device (eno1): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Fri 2023-01-27 16:12:16 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853936.3518] device (eno1): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Fri 2023-01-27 16:12:16 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853936.3769] dhcp4 (eno1): activation: beginning transaction (timeout in 45 seconds)
Fri 2023-01-27 16:12:16 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Joining mDNS multicast group on interface eno1.IPv6 with address fe80::9eb6:54ff:feaf:de74.
Fri 2023-01-27 16:12:16 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: New relevant interface eno1.IPv6 for mDNS.
Fri 2023-01-27 16:12:16 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Registering new address record for fe80::9eb6:54ff:feaf:de74 on eno1.*.
Fri 2023-01-27 16:12:18 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853938.4541] policy: set 'eno1' (eno1) as default for IPv6 routing and DNS
Fri 2023-01-27 16:12:18 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Leaving mDNS multicast group on interface eno1.IPv6 with address fe80::9eb6:54ff:feaf:de74.
Fri 2023-01-27 16:12:18 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Joining mDNS multicast group on interface eno1.IPv6 with address 2620:52:0:10d8:9eb6:54ff:feaf:de74.
Fri 2023-01-27 16:12:18 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Registering new address record for 2620:52:0:10d8:9eb6:54ff:feaf:de74 on eno1.*.
Fri 2023-01-27 16:12:18 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Withdrawing address record for fe80::9eb6:54ff:feaf:de74 on eno1.
Fri 2023-01-27 16:12:18 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Withdrawing address record for ::1 on lo.
Fri 2023-01-27 16:12:18 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Withdrawing address record for 127.0.0.1 on lo.
Fri 2023-01-27 16:12:18 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Host name conflict, retrying with hpe-ml350gen9-01-2
Fri 2023-01-27 16:12:18 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Registering new address record for 2620:52:0:10d8:9eb6:54ff:feaf:de74 on eno1.*.
Fri 2023-01-27 16:12:18 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Registering new address record for ::1 on lo.*.
Fri 2023-01-27 16:12:18 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Registering new address record for 127.0.0.1 on lo.IPv4.
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853940.2203] device (eno1): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'managed')
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853940.2258] device (eno1): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'managed')
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853940.2265] device (eno1): state change: secondaries -> activated (reason 'none', sys-iface-state: 'managed')
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853940.2283] manager: NetworkManager state is now CONNECTED_SITE
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853940.2290] device (eno1): Activation: successful, device activated.
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853940.2301] manager: NetworkManager state is now CONNECTED_GLOBAL
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853940.2312] manager: startup complete
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Network Manager Wait Online.
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Network is Online.
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started Anaconda Monitoring (anamon) post-boot notification program.
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Login and scanning of iSCSI devices was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/var/lib/iscsi/nodes).
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Preparation for Remote File Systems.
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Mounting /var/crash...
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Server startup complete. Host name is hpe-ml350gen9-01-2.local. Local service cookie is 2227680302.
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Notify NFS peers of a restart...
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: iscsi.service: Unit cannot be reloaded because it is inactive.
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com sm-notify[2355]: Version 2.5.4 starting
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com mount[2356]: mount.nfs: Failed to resolve server kdump.usersys.redhat.com: Name or service not known
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started Notify NFS peers of a restart.
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: var-crash.mount: Mount process exited, code=exited, status=32/n/a
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: var-crash.mount: Failed with result 'exit-code'.
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Failed to mount /var/crash.
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Dependency failed for Remote File Systems.
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: remote-fs.target: Job remote-fs.target/start failed with result 'dependency'.
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Crash recovery kernel arming...
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Permit User Sessions...
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Permit User Sessions.
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started Deferred execution scheduler.
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started Getty on tty1.
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started Serial Getty on ttyS1.
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Login Prompts.
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: anamon.service: Deactivated successfully.
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: anamon.service: Unit process 2373 (anamon) remains running after unit stopped.
Fri 2023-01-27 16:12:20 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: anamon.service: Unit process 2379 (journalctl) remains running after unit stopped.
Fri 2023-01-27 16:12:21 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853941.0354] dhcp4 (eno1): state changed new lease, address=10.16.216.163
Fri 2023-01-27 16:12:21 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com NetworkManager[1924]: [1674853941.0367] policy: set 'eno1' (eno1) as default for IPv4 routing and DNS
Fri 2023-01-27 16:12:21 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Joining mDNS multicast group on interface eno1.IPv4 with address 10.16.216.163.
Fri 2023-01-27 16:12:21 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: New relevant interface eno1.IPv4 for mDNS.
Fri 2023-01-27 16:12:21 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com avahi-daemon[1926]: Registering new address record for 10.16.216.163 on eno1.IPv4.
Fri 2023-01-27 16:12:21 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdumpctl[2368]: kdump: Trying to use 5.14.0-246.rt14.245.1955_759844798.el9.x86_64.
Fri 2023-01-27 16:12:21 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdumpctl[2368]: kdump: Fallback to using debug kernel
Fri 2023-01-27 16:12:21 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdumpctl[2368]: kdump: Using debug kernel, you may need to set a larger crashkernel than the default value.
Fri 2023-01-27 16:12:25 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com chronyd[1957]: Selected source 10.11.160.238
Fri 2023-01-27 16:12:25 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com chronyd[1957]: System clock TAI offset set to 37 seconds
Fri 2023-01-27 16:12:26 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PKCS7: Message signed outside of X.509 validity window
Fri 2023-01-27 16:12:27 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdumpctl[2368]: kdump: kexec: loaded kdump kernel
Fri 2023-01-27 16:12:27 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdumpctl[2368]: kdump: Starting kdump: [OK]
Fri 2023-01-27 16:12:27 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Crash recovery kernel arming.
Fri 2023-01-27 16:12:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2720]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:28 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-rc2a9b36161e347e5b024cda3cdc754ad.service: Deactivated successfully.
Fri 2023-01-27 16:12:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2731]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-rb75fd67ee71142cf80808601cc054903.service: Deactivated successfully.
Fri 2023-01-27 16:12:29 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started Dynamic System Tuning Daemon.
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Wait for chrony to synchronize system clock.
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target System Time Synchronized.
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: logrotate.timer: Not using persistent file timestamp Mon 2038-01-18 22:14:06 EST as it is in the future.
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started Daily rotation of log files.
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Timer Units.
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started Command Scheduler.
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting The restraint harness....
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com crond[2735]: (CRON) STARTUP (1.5.7)
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com crond[2735]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 73% if used.)
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com crond[2735]: (CRON) INFO (running with inotify support)
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started The restraint harness..
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Reached target Multi-User System.
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Starting Record Runlevel Change in UTMP...
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com restraintd[2738]: Listening on http://localhost:8081
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Finished Record Runlevel Change in UTMP.
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Startup finished in 11.338s (kernel) + 26.853s (initrd) + 33.513s (userspace) = 1min 11.705s.
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2744]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com restraintd[2738]: * Fetching recipe: http://lab-02.hosts.prod.psi.bos.redhat.com:8000//recipes/13295873/
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r6cf942df50a3427991e0d797814550b7.service: Deactivated successfully.
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com restraintd[2738]: Ignoring Server Running state
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com restraintd[2738]: * Parsing recipe
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com restraintd[2738]: * Running recipe
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com restraintd[2738]: ** Continuing task: 155484979 [/mnt/tests/github.com/beaker-project/beaker-core-tasks/archive/master.tar.gz/reservesys]
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com restraintd[2738]: ** Preparing metadata
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com restraintd[2738]: ** Refreshing peer role hostnames: Retries 0
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com restraintd[2738]: ** Updating env vars
Fri 2023-01-27 16:12:30 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com restraintd[2738]: *** Current Time: Fri Jan 27 16:12:30 2023 Localwatchdog at: * Disabled! *
Fri 2023-01-27 16:12:31 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:31 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com restraintd[2738]: ** Running task: 155484979 [/distribution/reservesys]
Fri 2023-01-27 16:12:31 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2752]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:31 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-rafd13f84da154a53b041a746fc551f1f.service: Deactivated successfully.
Fri 2023-01-27 16:12:31 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Fri 2023-01-27 16:12:31 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: NetworkManager-dispatcher.service: Consumed 1.638s CPU time.
Fri 2023-01-27 16:12:31 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:31 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2779]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:31 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-rdd621c02da4b44e384dc929ba318506f.service: Deactivated successfully.
Fri 2023-01-27 16:12:32 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:32 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2794]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:32 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r70764f8d452a46a99563772c52b90b65.service: Deactivated successfully.
Fri 2023-01-27 16:12:32 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2732]: Timed out for waiting the udev queue being empty.
Fri 2023-01-27 16:12:32 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:32 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2801]: kdump: kexec: unloaded kdump kernel
Fri 2023-01-27 16:12:32 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2801]: kdump: Stopping kdump: [OK]
Fri 2023-01-27 16:12:32 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2801]: kdump: Trying to use 5.14.0-246.rt14.245.1955_759844798.el9.x86_64.
Fri 2023-01-27 16:12:32 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2801]: kdump: Fallback to using debug kernel
Fri 2023-01-27 16:12:32 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2801]: kdump: Using debug kernel, you may need to set a larger crashkernel than the default value.
Fri 2023-01-27 16:12:33 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:33 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2851]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:33 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r8e296a72b6414ca2bca74a5a74f6f0c5.service: Deactivated successfully.
Fri 2023-01-27 16:12:34 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com unknown: Running test [R:13295873 T:155484979 - /distribution/reservesys - Kernel: 5.14.0-246.rt14.245.1955_759844798.el9.x86_64+debug]
Fri 2023-01-27 16:12:34 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PKCS7: Message signed outside of X.509 validity window
Fri 2023-01-27 16:12:34 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:34 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2970]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:34 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r37c1013aa06f452992eff2acef0e11c4.service: Deactivated successfully.
Fri 2023-01-27 16:12:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2982]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-rba584ec1fba54932b8308562e67b3915.service: Deactivated successfully.
Fri 2023-01-27 16:12:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2801]: kdump: kexec: loaded kdump kernel
Fri 2023-01-27 16:12:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2801]: kdump: Starting kdump: [OK]
Fri 2023-01-27 16:12:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r27591fae03914c87a9d090d662279b45.service: Deactivated successfully.
Fri 2023-01-27 16:12:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r27591fae03914c87a9d090d662279b45.service: Consumed 3.503s CPU time.
Fri 2023-01-27 16:12:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2989]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:35 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-re12fcec1b80e4e7399abf7ccddb01c7b.service: Deactivated successfully.
Fri 2023-01-27 16:12:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:36 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2993]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-ra3dcb4e8b88a4dfe9cf4727949b670b1.service: Deactivated successfully.
Fri 2023-01-27 16:12:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2892]: Timed out for waiting the udev queue being empty.
Fri 2023-01-27 16:12:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2997]: kdump: kexec: unloaded kdump kernel
Fri 2023-01-27 16:12:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2997]: kdump: Stopping kdump: [OK]
Fri 2023-01-27 16:12:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2997]: kdump: Trying to use 5.14.0-246.rt14.245.1955_759844798.el9.x86_64.
Fri 2023-01-27 16:12:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2997]: kdump: Fallback to using debug kernel
Fri 2023-01-27 16:12:37 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2997]: kdump: Using debug kernel, you may need to set a larger crashkernel than the default value.
Fri 2023-01-27 16:12:38 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:39 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:39 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3115]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:39 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r7c5b2ffe413a40e58e55b2f672c1d6a5.service: Deactivated successfully.
Fri 2023-01-27 16:12:39 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PKCS7: Message signed outside of X.509 validity window
Fri 2023-01-27 16:12:39 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:39 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3129]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:39 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-ra137702517504fefb44749a8cffb0fb8.service: Deactivated successfully.
Fri 2023-01-27 16:12:40 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2997]: kdump: kexec: loaded kdump kernel
Fri 2023-01-27 16:12:40 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[2997]: kdump: Starting kdump: [OK]
Fri 2023-01-27 16:12:40 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-reaf5cbaa90924d7294476dd4bc5421d8.service: Deactivated successfully.
Fri 2023-01-27 16:12:40 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-reaf5cbaa90924d7294476dd4bc5421d8.service: Consumed 3.460s CPU time.
Fri 2023-01-27 16:12:40 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:40 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3139]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:40 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-re562f083017e4e378ff67aa94a1b0f1a.service: Deactivated successfully.
Fri 2023-01-27 16:12:40 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:40 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3143]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:40 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-rceec3a0a18374aa082818f084a209708.service: Deactivated successfully.
Fri 2023-01-27 16:12:42 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3131]: Timed out for waiting the udev queue being empty.
Fri 2023-01-27 16:12:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3147]: kdump: kexec: unloaded kdump kernel
Fri 2023-01-27 16:12:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3147]: kdump: Stopping kdump: [OK]
Fri 2023-01-27 16:12:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3147]: kdump: Trying to use 5.14.0-246.rt14.245.1955_759844798.el9.x86_64.
Fri 2023-01-27 16:12:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3147]: kdump: Fallback to using debug kernel
Fri 2023-01-27 16:12:43 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3147]: kdump: Using debug kernel, you may need to set a larger crashkernel than the default value.
Fri 2023-01-27 16:12:44 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:44 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:45 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3274]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:45 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r67893124cab5452783e5cae0d88ebb54.service: Deactivated successfully.
Fri 2023-01-27 16:12:45 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PKCS7: Message signed outside of X.509 validity window
Fri 2023-01-27 16:12:45 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3147]: kdump: kexec: loaded kdump kernel
Fri 2023-01-27 16:12:45 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3147]: kdump: Starting kdump: [OK]
Fri 2023-01-27 16:12:45 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r1ab58340b67f4edeb78428017f726f17.service: Deactivated successfully.
Fri 2023-01-27 16:12:45 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r1ab58340b67f4edeb78428017f726f17.service: Consumed 3.469s CPU time.
Fri 2023-01-27 16:12:46 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:46 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3286]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:46 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-rb837ba5284264deabec117b58b9aa995.service: Deactivated successfully.
Fri 2023-01-27 16:12:46 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:47 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3290]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:47 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-radab33b994f141508ce8fb8307b403ca.service: Deactivated successfully.
Fri 2023-01-27 16:12:47 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:47 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3294]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:47 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r583675945e4e4d5b83f87ba474b5d703.service: Deactivated successfully.
Fri 2023-01-27 16:12:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3298]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-rcf4da73bb57c49998e373f95d8a0a9a6.service: Deactivated successfully.
Fri 2023-01-27 16:12:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3302]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-rf310906af7ce40029fa9794acd36a450.service: Deactivated successfully.
Fri 2023-01-27 16:12:48 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3279]: Timed out for waiting the udev queue being empty.
Fri 2023-01-27 16:12:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3306]: kdump: kexec: unloaded kdump kernel
Fri 2023-01-27 16:12:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3306]: kdump: Stopping kdump: [OK]
Fri 2023-01-27 16:12:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3306]: kdump: Trying to use 5.14.0-246.rt14.245.1955_759844798.el9.x86_64.
Fri 2023-01-27 16:12:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3306]: kdump: Fallback to using debug kernel
Fri 2023-01-27 16:12:49 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3306]: kdump: Using debug kernel, you may need to set a larger crashkernel than the default value.
Fri 2023-01-27 16:12:50 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com sshd[3372]: main: sshd: ssh-rsa algorithm is disabled
Fri 2023-01-27 16:12:50 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:50 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com sshd[3372]: Accepted publickey for root from 10.8.0.181 port 55392 ssh2: RSA SHA256:KxCuG8Dsw1ul66pG2cyRs5IdO3BAiD7LOITSP7nJGNk
Fri 2023-01-27 16:12:50 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:50 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3427]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:50 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd-logind[1947]: New session 2 of user root.
Fri 2023-01-27 16:12:50 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-rc882d4faafb940a79677485ded039efc.service: Deactivated successfully.
Fri 2023-01-27 16:12:50 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started Session 2 of User root.
Fri 2023-01-27 16:12:50 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com sshd[3372]: pam_unix(sshd:session): session opened for user root(uid=0) by (uid=0)
Fri 2023-01-27 16:12:51 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:51 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PKCS7: Message signed outside of X.509 validity window
Fri 2023-01-27 16:12:51 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3447]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:51 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-rdc95f4d943004f659fb55aca41d1e690.service: Deactivated successfully.
Fri 2023-01-27 16:12:51 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:51 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3470]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:51 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r92d9f830eeaf4f5a9f70c036587b51f4.service: Deactivated successfully.
Fri 2023-01-27 16:12:51 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3306]: kdump: kexec: loaded kdump kernel
Fri 2023-01-27 16:12:51 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3306]: kdump: Starting kdump: [OK]
Fri 2023-01-27 16:12:51 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r5ee36638bab14a81913bf7b77d8e810e.service: Deactivated successfully.
Fri 2023-01-27 16:12:51 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r5ee36638bab14a81913bf7b77d8e810e.service: Consumed 3.458s CPU time.
Fri 2023-01-27 16:12:52 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:52 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3487]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:52 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r87f3f261f83f44199064143ec51fdd9e.service: Deactivated successfully.
Fri 2023-01-27 16:12:53 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:53 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3515]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:53 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r9712ddf56c3e4cff9d2ae32e25441932.service: Deactivated successfully.
Fri 2023-01-27 16:12:54 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:54 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3539]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:54 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r45cd01d2520f431fabd40a7e74186890.service: Deactivated successfully.
Fri 2023-01-27 16:12:54 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3468]: Timed out for waiting the udev queue being empty.
Fri 2023-01-27 16:12:54 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:54 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3551]: kdump: kexec: unloaded kdump kernel
Fri 2023-01-27 16:12:54 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3551]: kdump: Stopping kdump: [OK]
Fri 2023-01-27 16:12:54 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3551]: kdump: Trying to use 5.14.0-246.rt14.245.1955_759844798.el9.x86_64.
Fri 2023-01-27 16:12:54 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3551]: kdump: Fallback to using debug kernel
Fri 2023-01-27 16:12:54 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:54 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3551]: kdump: Using debug kernel, you may need to set a larger crashkernel than the default value.
Fri 2023-01-27 16:12:54 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3576]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-rb1430929cfe84cde96f7be8fec622bad.service: Deactivated successfully.
Fri 2023-01-27 16:12:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3602]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r54cdfbbc1ed34c3b9739fbae5cf5d91a.service: Deactivated successfully.
Fri 2023-01-27 16:12:55 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com unknown: Running test [R:13295873 T:10 - Reboot test - Kernel: 5.14.0-246.rt14.245.1955_759844798.el9.x86_64+debug]
Fri 2023-01-27 16:12:56 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:56 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3679]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:56 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r958e730a171c475989eb0c64a0e62b72.service: Deactivated successfully.
Fri 2023-01-27 16:12:56 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:56 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3710]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:56 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r519c24f5b9ee489dbe6e4fe7089d6e66.service: Deactivated successfully.
Fri 2023-01-27 16:12:56 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kernel: PKCS7: Message signed outside of X.509 validity window
Fri 2023-01-27 16:12:57 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:57 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3762]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:57 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r9a49c5c2e0134ad58df8147861610e65.service: Deactivated successfully.
Fri 2023-01-27 16:12:57 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:57 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3766]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:57 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-ra39f57468fd2488b9351947385acfcbc.service: Deactivated successfully.
Fri 2023-01-27 16:12:57 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3551]: kdump: kexec: loaded kdump kernel
Fri 2023-01-27 16:12:57 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3551]: kdump: Starting kdump: [OK]
Fri 2023-01-27 16:12:57 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r948ef30366374dee9ab0e09276ce5789.service: Deactivated successfully.
Fri 2023-01-27 16:12:57 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r948ef30366374dee9ab0e09276ce5789.service: Consumed 3.486s CPU time.
Fri 2023-01-27 16:12:58 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:58 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3783]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:58 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r6bd1fa452ce441a2968354a405ca7208.service: Deactivated successfully.
Fri 2023-01-27 16:12:58 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:58 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3796]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:58 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r7954bbb06be54d209b39d900732e2c89.service: Deactivated successfully.
Fri 2023-01-27 16:12:59 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3650]: Timed out for waiting the udev queue being empty.
Fri 2023-01-27 16:12:59 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3812]: kdump: kexec: unloaded kdump kernel
Fri 2023-01-27 16:12:59 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3812]: kdump: Stopping kdump: [OK]
Fri 2023-01-27 16:12:59 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:59 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3812]: kdump: Trying to use 5.14.0-246.rt14.245.1955_759844798.el9.x86_64.
Fri 2023-01-27 16:12:59 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3812]: kdump: Fallback to using debug kernel
Fri 2023-01-27 16:12:59 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:59 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3812]: kdump: Using debug kernel, you may need to set a larger crashkernel than the default value.
Fri 2023-01-27 16:12:59 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3850]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:59 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-re2d6318ada0147a1b0540e4d6a7accae.service: Deactivated successfully.
Fri 2023-01-27 16:12:59 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:59 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3872]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:59 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-reb1a2cbda6454e5a9c878e672f435cb3.service: Deactivated successfully.
Fri 2023-01-27 16:12:59 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:12:59 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3889]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:12:59 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r17c028de0f7e4af6975516de2bde5f6b.service: Deactivated successfully.
Fri 2023-01-27 16:12:59 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:13:00 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3907]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:13:00 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-r958628b9c6d9483995b5ddb26801ec28.service: Deactivated successfully.
Fri 2023-01-27 16:13:00 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: Started /usr/lib/udev/kdump-udev-throttler.
Fri 2023-01-27 16:13:00 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com kdump-udev-throttler[3962]: Throttling kdump restart for concurrent udev event
Fri 2023-01-27 16:13:00 EST hpe-ml350gen9-01.hpe2.lab.eng.bos.redhat.com systemd[1]: run-re6c37da86bb647b9b86d6ade8bed640d.service: Deactivated successfully.