Replies: 2 comments
-
This is also a requirement for OpenAI Codex.
-
If implemented, this may resolve issues with agent calls from n8n (n8n-io/n8n#13112). The current version of OpenAI API agent calls via the n8n chain results in the same error output when called.
-
5ire, the most recommended open-source MCP client, requires streaming and tool use.
However, llama_server doesn't allow this combination; see the check in llama.cpp/examples/server/utils.hpp, line 565 at f17a3bb.
It would be great to be able to use this tool with llama.cpp directly.
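For context, the request shape that trips this server-side check is an OpenAI-compatible chat completion that sets `stream` and `tools` together. Below is a minimal sketch of such a payload; the model name, tool name, and endpoint mentioned in the comments are illustrative assumptions, not taken from the discussion.

```python
import json

# Sketch of an OpenAI-compatible /v1/chat/completions request body that
# combines streaming with tool use -- the combination the check in
# examples/server/utils.hpp rejects. The model and tool names here are
# hypothetical, chosen only to illustrate the request shape.
payload = {
    "model": "llama",                # hypothetical model alias
    "stream": True,                  # client asks for SSE token streaming
    "messages": [
        {"role": "user", "content": "What's the weather in Paris?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",   # hypothetical tool for illustration
                "description": "Look up current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}

body = json.dumps(payload)
# An MCP client such as 5ire would POST this body to the server's
# chat-completions endpoint (e.g. http://localhost:8080/v1/chat/completions);
# a server that forbids streaming tool calls returns an error response
# instead of an SSE stream.
print(body[:40])
```

Dropping either `"stream": True` or the `"tools"` array makes the request pass, which is why clients that hard-require both, like 5ire, cannot currently use llama_server directly.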