lost log files under high load (using .gz) #3648


Open
dasvex opened this issue May 5, 2025 · 1 comment

Comments


dasvex commented May 5, 2025

Description

In our production environment, log file entries are lost under high load, possibly due to the archiving process.

It appears that if the archiving of one file has not finished when a second file starts archiving, the two can overwrite each other.
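A rollover setup of the kind that exercises this path might look like the following sketch of a `log4j2.xml` fragment (the appender name, file names, and the small size trigger are assumptions for illustration, not taken from the report; a small trigger makes rollovers start close together under high load):

```xml
<!-- Hypothetical fragment: each rollover gzip-compresses the closed file.
     A small size trigger means a second rollover can begin while the
     previous archive is still being written. -->
<RollingFile name="App" fileName="logs/app.log"
             filePattern="logs/app-%d{yyyy-MM-dd}-%i.log.gz">
  <PatternLayout pattern="%d %p %c{1.} [%t] %m%n"/>
  <Policies>
    <SizeBasedTriggeringPolicy size="1 MB"/>
  </Policies>
  <DefaultRolloverStrategy max="20"/>
</RollingFile>
```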

Configuration

Version: 2.24.3

Operating system: Oracle Linux Server 8.4; Windows 10

JDK: 21

Reproduction

The problem most likely reproduces best on underpowered servers, where the file archiving process takes considerable time — for example, machines with a slow HDD or a slow CPU.

A simple test to reproduce is available here:
https://github.com/dasvex/log4j-lost-files-on-gz/tree/main
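The suspected failure mode — two rollovers whose compressions resolve to the same archive path, so the later one silently replaces the earlier — can be illustrated with plain JDK I/O (a minimal sketch; the file names and the colliding-path assumption are hypothetical and do not reflect Log4j internals):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class OverwriteSketch {

    // Compress src into dst. Files.newOutputStream defaults to
    // CREATE + TRUNCATE_EXISTING, so an existing archive at dst is
    // silently replaced rather than rejected.
    static void gzip(Path src, Path dst) throws IOException {
        try (GZIPOutputStream out = new GZIPOutputStream(Files.newOutputStream(dst))) {
            out.write(Files.readAllBytes(src));
        }
    }

    static String gunzip(Path src) throws IOException {
        try (InputStream in = new GZIPInputStream(Files.newInputStream(src))) {
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            in.transferTo(buf);
            return new String(buf.toByteArray(), StandardCharsets.UTF_8);
        }
    }

    // Two rollovers compressed to the same (hypothetical) archive name:
    // only the second file's data survives; the first is lost.
    static String demo() throws IOException {
        Path dir = Files.createTempDirectory("roll");
        Path first  = Files.write(dir.resolve("app-1.log"), "first rollover".getBytes(StandardCharsets.UTF_8));
        Path second = Files.write(dir.resolve("app-2.log"), "second rollover".getBytes(StandardCharsets.UTF_8));

        Path archive = dir.resolve("app.log.gz"); // assumed colliding archive path
        gzip(first, archive);
        gzip(second, archive); // silently truncates and overwrites the first archive
        return gunzip(archive);
    }

    public static void main(String[] args) throws IOException {
        System.out.println(demo()); // prints "second rollover"; the first file's data is gone
    }
}
```

If the real race is of this shape, the fix would be either a unique archive name per rollover or serializing the compression tasks.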

Member

vy commented May 13, 2025

@dasvex, thanks so much for the report and the reproduction.

These kinds of issues often require significant effort (and skill set) to troubleshoot. AFAICT, the only currently active Log4j maintainers (myself, and @ppkarwasz?) are significantly limited in their maintainer time. My personal advice is to see if you can troubleshoot the issue further yourself, or to seek other support options.

Projects
Status: To triage
Development

No branches or pull requests

2 participants