Name and Version
llama-server --version
version: 5269 (1d36b36)
built with MSVC 19.43.34808.0 for x64
Operating systems
No response
Which llama.cpp modules do you know to be affected?
llama-server
Command line
Problem description & steps to reproduce
The server webui overrides the launch command's parameters with its own saved settings. I can only see this as a bug.
The expected behavior would be for explicitly given command-line parameters to be passed through to the webui, overriding its previously stored values.
The current behavior makes the webui needlessly hard to use when swapping models.
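To illustrate the expected precedence, here is a minimal sketch in TypeScript: settings the webui previously saved act as the base, and any parameter explicitly given on the launch command line overrides them. All names here (`mergeParams`, the parameter keys) are hypothetical and for illustration only, not the actual llama.cpp webui API.

```typescript
// Hypothetical sketch of the expected behavior: explicit command-line
// parameters win over the webui's previously saved settings.
type Params = Record<string, number | string>;

function mergeParams(savedUi: Params, cliExplicit: Params): Params {
  // Start from the UI's saved settings, then let explicit CLI flags override.
  return { ...savedUi, ...cliExplicit };
}

// Example: the UI saved temperature 0.2 for a previous model, but the new
// launch explicitly passes a temperature of 0.8; the CLI value should apply,
// while untouched saved settings (top_k) are kept.
const saved: Params = { temperature: 0.2, top_k: 40 };
const cli: Params = { temperature: 0.8 };
const effective = mergeParams(saved, cli);
console.log(effective); // { temperature: 0.8, top_k: 40 }
```

With this precedence, swapping models via a new launch command would immediately take effect in the webui instead of being silently shadowed by stale stored values.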
First Bad Commit
No response
Relevant log output