Currently, when a model generates output that looks like a tool call but is invalid, an exception is thrown and not handled, causing the CLI or llama-server to abort. Instead, handle the chat-parser exception and simply return the generated text in such cases.

Signed-off-by: Piotr Stankiewicz <piotr.stankiewicz@docker.com>