
Logs are missing in remote attach STDOUT with Logs var set to true with k8s-file log driver #26951

@NotSoFancyName

Description


Issue Description

Sometimes, when using the k8s-file log driver and remotely attaching to a running container with the Logs option set to true in containers.AttachOptions, some logs are missing. This usually happens when the machine is under load; it also happens without any load, but quite rarely.

Steps to reproduce the issue

  1. I managed to reproduce the issue fairly reliably by running this code snippet:
package main

import (
	"bytes"
	"context"
	"errors"
	"fmt"
	"log"
	"os"
	"strings"
	"time"

	"github.com/containers/podman/v5/libpod/define"
	"github.com/containers/podman/v5/pkg/bindings"
	"github.com/containers/podman/v5/pkg/bindings/containers"
	"github.com/containers/podman/v5/pkg/specgen"
)

const (
	logFieName = "test.log"

	missingLog = "I am a missing log"
)

var (
	trueVar                = true
	containerRemoveTimeout = uint(1)
)

func main() {
	connection, err := bindings.NewConnection(context.Background(), "unix:///run/user/1000/podman/podman.sock")
	if err != nil {
		log.Fatal(err)
	}

	dir, err := os.Getwd()
	if err != nil {
		log.Fatal(err)
	}
	logFilePath := dir + "/" + logFieName

	cnt := 0
	for {
		func() {
			cnt++
			log.Println("Iteration: ", cnt)

			err = os.Remove(logFilePath)
			if err != nil && !errors.Is(err, os.ErrNotExist) {
				log.Fatal(err)
			}

			containerCreateResponse, err := containers.CreateWithSpec(connection, &specgen.SpecGenerator{
				ContainerBasicConfig: specgen.ContainerBasicConfig{
					Name:     "test-log",
					Command:  []string{"/bin/sh", "-c", fmt.Sprintf("echo '%s' && /bin/sh", missingLog)},
					Stdin:    &trueVar,
					Terminal: &trueVar,
					LogConfiguration: &specgen.LogConfig{
						Driver: "k8s-file",
						Path:   logFilePath,
						Size:   4096,
					},
				},
				ContainerStorageConfig: specgen.ContainerStorageConfig{
					Image: "alpine",
				},

				ContainerHealthCheckConfig: specgen.ContainerHealthCheckConfig{
					HealthLogDestination: define.DefaultHealthCheckLocalDestination,
					HealthMaxLogCount:    define.DefaultHealthMaxLogCount,
					HealthMaxLogSize:     define.DefaultHealthMaxLogSize,
				},
			}, &containers.CreateOptions{})
			if err != nil {
				log.Fatal(err)
			}

			defer func() {
				_, err = containers.Remove(connection, containerCreateResponse.ID, &containers.RemoveOptions{
					Force:   &trueVar,
					Ignore:  &trueVar,
					Timeout: &containerRemoveTimeout,
				})
				if err != nil {
					log.Fatal(err)
				}
			}()

			err = containers.Start(connection, containerCreateResponse.ID, &containers.StartOptions{})
			if err != nil {
				log.Fatal(err)
			}

			attachReady := make(chan bool)
			buf := bytes.Buffer{}
			go func() {
				err := containers.Attach(connection, containerCreateResponse.ID, os.Stdin, &buf, &buf, attachReady, &containers.AttachOptions{Logs: &trueVar})
				if err != nil {
					log.Fatal(err)
				}
			}()
			<-attachReady

			shellPromptMissing := true
			for i := 0; i < 100; i++ {
				if strings.Contains(buf.String(), "/ #") {
					shellPromptMissing = false
					break
				}
				time.Sleep(20 * time.Millisecond)
			}

			if shellPromptMissing {
				log.Println("Shell prompt missing")
				return
			}

			if !strings.Contains(buf.String(), missingLog) {
				log.Fatalf("Log is missing, actual STDOUT: %s", buf.String())
			}
		}()
	}
}

go run main.go

  2. I also had to simulate load, which I did with: stress-ng --cpu=0 --vm 4 --vm-bytes 98% --timeout 600s

Describe the results you received

These are the results I received:

$ go run main.go 
2025/09/01 15:58:21 Iteration:  1
2025/09/01 15:58:22 Iteration:  2
2025/09/01 15:58:24 Iteration:  3
2025/09/01 15:58:28 Shell prompt missing
2025/09/01 15:58:28 Iteration:  4
2025/09/01 15:58:32 Shell prompt missing
2025/09/01 15:58:32 Iteration:  5
2025/09/01 15:58:36 Shell prompt missing
2025/09/01 15:58:36 Iteration:  6
2025/09/01 15:58:38 Iteration:  7
2025/09/01 15:58:42 Shell prompt missing
2025/09/01 15:58:42 Iteration:  8
2025/09/01 15:58:46 Shell prompt missing
2025/09/01 15:58:46 Iteration:  9
2025/09/01 15:58:50 Shell prompt missing
2025/09/01 15:58:50 Iteration:  10
2025/09/01 15:58:54 Shell prompt missing
2025/09/01 15:58:54 Iteration:  11
2025/09/01 15:58:58 Shell prompt missing
2025/09/01 15:58:58 Iteration:  12
2025/09/01 15:59:02 Shell prompt missing
2025/09/01 15:59:02 Iteration:  13
2025/09/01 15:59:04 Iteration:  14
2025/09/01 15:59:08 Shell prompt missing
2025/09/01 15:59:08 Iteration:  15
2025/09/01 15:59:12 Shell prompt missing
2025/09/01 15:59:12 Iteration:  16
2025/09/01 15:59:16 Shell prompt missing
2025/09/01 15:59:16 Iteration:  17
2025/09/01 15:59:20 Shell prompt missing
2025/09/01 15:59:20 Iteration:  18
2025/09/01 15:59:21 Iteration:  19
2025/09/01 15:59:25 Shell prompt missing
2025/09/01 15:59:25 Iteration:  20
2025/09/01 15:59:29 Shell prompt missing
2025/09/01 15:59:29 Iteration:  21
2025/09/01 15:59:34 Shell prompt missing
2025/09/01 15:59:34 Iteration:  22
2025/09/01 15:59:37 Shell prompt missing
2025/09/01 15:59:37 Iteration:  23
2025/09/01 15:59:41 Shell prompt missing
2025/09/01 15:59:41 Iteration:  24
2025/09/01 15:59:46 Shell prompt missing
2025/09/01 15:59:46 Iteration:  25
2025/09/01 15:59:50 Shell prompt missing
2025/09/01 15:59:50 Iteration:  26
2025/09/01 15:59:54 Shell prompt missing
2025/09/01 15:59:54 Iteration:  27
2025/09/01 15:59:58 Shell prompt missing
2025/09/01 15:59:58 Iteration:  28
2025/09/01 16:00:02 Shell prompt missing
2025/09/01 16:00:02 Iteration:  29
2025/09/01 16:00:04 Iteration:  30
2025/09/01 16:00:08 Shell prompt missing
2025/09/01 16:00:08 Iteration:  31
2025/09/01 16:00:10 Iteration:  32
2025/09/01 16:00:12 Log is missing, actual STDOUT: / # 
^[[49;56Rexit status 1

There are actually two different outcomes: sometimes the shell prompt does not appear at all (interestingly, this outcome stops occurring once the simulated load is removed), and sometimes a few logs before it are missing, as happened during iteration 32.

In the test.log file, all the logs are present:

2025-09-01T16:00:12.028984623+02:00 stdout F I am a missing log
2025-09-01T16:00:12.031859104+02:00 stdout P / # �[6n

Describe the results you expected

All the logs should be printed to STDOUT when the Logs option is set to true in the remote attach call.

podman info output

host:
  arch: amd64
  buildahVersion: 1.41.3
  cgroupControllers:
  - cpu
  - io
  - memory
  - pids
  cgroupManager: systemd
  cgroupVersion: v2
  conmon:
    package: conmon-2.1.13-1.fc42.x86_64
    path: /usr/bin/conmon
    version: 'conmon version 2.1.13, commit: '
  cpuUtilization:
    idlePercent: 70.32
    systemPercent: 2.63
    userPercent: 27.06
  cpus: 16
  databaseBackend: sqlite
  distribution:
    distribution: fedora
    variant: workstation
    version: "42"
  emulatedArchitectures:
  - linux/arm
  - linux/arm64
  - linux/arm64be
  - linux/loong64
  - linux/mips
  - linux/mips64
  - linux/ppc
  - linux/ppc64
  - linux/ppc64le
  - linux/riscv32
  - linux/riscv64
  - linux/s390x
  eventLogger: journald
  freeLocks: 2021
  hostname: dhcp-10-29-246-140
  idMappings:
    gidmap:
    - container_id: 0
      host_id: 1000
      size: 1
    - container_id: 1
      host_id: 524288
      size: 65536
    uidmap:
    - container_id: 0
      host_id: 1000
      size: 1
    - container_id: 1
      host_id: 524288
      size: 65536
  kernel: 6.15.10-200.fc42.x86_64
  linkmode: dynamic
  logDriver: journald
  memFree: 13187862528
  memTotal: 32914599936
  networkBackend: netavark
  networkBackendInfo:
    backend: netavark
    dns:
      package: aardvark-dns-1.16.0-1.fc42.x86_64
      path: /usr/libexec/podman/aardvark-dns
      version: aardvark-dns 1.16.0
    package: netavark-1.16.1-1.fc42.x86_64
    path: /usr/libexec/podman/netavark
    version: netavark 1.16.1
  ociRuntime:
    name: crun
    package: crun-1.23.1-1.fc42.x86_64
    path: /usr/bin/crun
    version: |-
      crun version 1.23.1
      commit: d20b23dba05e822b93b82f2f34fd5dada433e0c2
      rundir: /run/user/1000/crun
      spec: 1.0.0
      +SYSTEMD +SELINUX +APPARMOR +CAP +SECCOMP +EBPF +CRIU +LIBKRUN +WASM:wasmedge +YAJL
  os: linux
  pasta:
    executable: /usr/bin/pasta
    package: passt-0^20250805.g309eefd-2.fc42.x86_64
    version: |
      pasta 0^20250805.g309eefd-2.fc42.x86_64
      Copyright Red Hat
      GNU General Public License, version 2 or later
        <https://www.gnu.org/licenses/old-licenses/gpl-2.0.html>
      This is free software: you are free to change and redistribute it.
      There is NO WARRANTY, to the extent permitted by law.
  remoteSocket:
    exists: true
    path: /run/user/1000/podman/podman.sock
  rootlessNetworkCmd: pasta
  security:
    apparmorEnabled: false
    capabilities: CAP_CHOWN,CAP_DAC_OVERRIDE,CAP_FOWNER,CAP_FSETID,CAP_KILL,CAP_NET_BIND_SERVICE,CAP_SETFCAP,CAP_SETGID,CAP_SETPCAP,CAP_SETUID,CAP_SYS_CHROOT
    rootless: true
    seccompEnabled: true
    seccompProfilePath: /usr/share/containers/seccomp.json
    selinuxEnabled: true
  serviceIsRemote: false
  slirp4netns:
    executable: ""
    package: ""
    version: ""
  swapFree: 8534630400
  swapTotal: 8589930496
  uptime: 1h 8m 26.00s (Approximately 0.04 days)
  variant: ""
plugins:
  authorization: null
  log:
  - k8s-file
  - none
  - passthrough
  - journald
  network:
  - bridge
  - macvlan
  - ipvlan
  volume:
  - local
registries:
  search:
  - registry.fedoraproject.org
  - registry.access.redhat.com
  - docker.io
store:
  configFile: /home/vova/.config/containers/storage.conf
  containerStore:
    number: 16
    paused: 0
    running: 1
    stopped: 15
  graphDriverName: overlay
  graphOptions: {}
  graphRoot: /home/vova/.local/share/containers/storage
  graphRootAllocated: 1022488477696
  graphRootUsed: 212395405312
  graphStatus:
    Backing Filesystem: btrfs
    Native Overlay Diff: "true"
    Supports d_type: "true"
    Supports shifting: "false"
    Supports volatile: "true"
    Using metacopy: "false"
  imageCopyTmpDir: /var/tmp
  imageStore:
    number: 2653
  runRoot: /run/user/1000/containers
  transientStore: false
  volumePath: /home/vova/.local/share/containers/storage/volumes
version:
  APIVersion: 5.6.0
  BuildOrigin: Fedora Project
  Built: 1755216000
  BuiltTime: Fri Aug 15 02:00:00 2025
  GitCommit: da671ef6cfa3fc9ac6225c18f1dd0a70a951e43f
  GoVersion: go1.24.6
  Os: linux
  OsArch: linux/amd64
  Version: 5.6.0

Podman in a container

No

Privileged Or Rootless

Rootless

Upstream Latest Release

Yes

Additional environment details

No response

Additional information

No response

    Labels

    kind/bug: Categorizes issue or PR as related to a bug. triaged: Issue has been triaged.