llama_cpp_for_radxa_dragon_.../examples

Latest commit b2034c2b55 by tastelikefeet:
contrib: support modelscope community (#12664)
* support download from modelscope

* support login

* remove comments

* add arguments

* fix code

* fix win32

* test passed

* fix readme

* revert readme

* change to MODEL_ENDPOINT

* revert tail line

* fix readme

* refactor model endpoint

* remove blank line

* fix header

* fix as comments

* update comment

* update readme

---------

Co-authored-by: tastelikefeet <yuze.zyz@alibaba-inc.com>
2025-04-11 14:01:56 +02:00
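Per the commit messages above, this change lets llama.cpp's model-download helpers pull from the ModelScope community by overriding the download host through a MODEL_ENDPOINT environment variable (the commit "change to MODEL_ENDPOINT" names the variable). A minimal usage sketch; the endpoint URL and the commented-out model invocation are illustrative assumptions, not taken from this listing:

```shell
# Point llama.cpp's -hf downloader at ModelScope instead of the default
# Hugging Face endpoint. URL and model name below are assumptions.
export MODEL_ENDPOINT=https://www.modelscope.cn/

# Hypothetical invocation (commented out so the sketch runs anywhere):
# llama-cli -hf Qwen/Qwen2.5-1.5B-Instruct-GGUF

echo "downloads will use: ${MODEL_ENDPOINT}"
```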
batched  |  common : refactor downloading system, handle mmproj with -hf option (#12694)  |  2025-04-01 23:44:05 +02:00
batched-bench  |  common : refactor downloading system, handle mmproj with -hf option (#12694)  |  2025-04-01 23:44:05 +02:00
batched.swift
convert-llama2c-to-ggml
cvector-generator
deprecation-warning
embedding
eval-callback
export-lora  |  common : refactor downloading system, handle mmproj with -hf option (#12694)  |  2025-04-01 23:44:05 +02:00
gbnf-validator
gen-docs
gguf
gguf-hash
gguf-split  |  gguf-split : --merge now respects --dry-run option (#12681)  |  2025-04-04 16:09:12 +02:00
gritlm  |  common : refactor downloading system, handle mmproj with -hf option (#12694)  |  2025-04-01 23:44:05 +02:00
imatrix
infill
jeopardy
llama-bench
llama.android  |  cmake : enable curl by default (#12761)  |  2025-04-07 13:35:19 +02:00
llama.swiftui
llava  |  clip : use smart pointer (⚠️ breaking change) (#12869)  |  2025-04-11 12:09:39 +02:00
lookahead
lookup
main
parallel  |  llama : refactor kv cache guard (#12695)  |  2025-04-02 14:32:59 +03:00
passkey  |  common : refactor downloading system, handle mmproj with -hf option (#12694)  |  2025-04-01 23:44:05 +02:00
perplexity  |  hellaswag: display estimated score confidence interval (#12797)  |  2025-04-07 18:47:08 +03:00
quantize
quantize-stats
retrieval
rpc  |  rpc : update README for cache usage (#12620)  |  2025-03-28 09:44:13 +02:00
run  |  contrib: support modelscope community (#12664)  |  2025-04-11 14:01:56 +02:00
save-load-state
server  |  ci: detach common from the library (#12827)  |  2025-04-09 10:11:11 +02:00
simple
simple-chat
simple-cmake-pkg
speculative  |  common : refactor downloading system, handle mmproj with -hf option (#12694)  |  2025-04-01 23:44:05 +02:00
speculative-simple  |  common : refactor downloading system, handle mmproj with -hf option (#12694)  |  2025-04-01 23:44:05 +02:00
sycl  |  cmake : enable curl by default (#12761)  |  2025-04-07 13:35:19 +02:00
tokenize
tts  |  common : refactor downloading system, handle mmproj with -hf option (#12694)  |  2025-04-01 23:44:05 +02:00
chat-13B.bat
chat-13B.sh
chat-persistent.sh
chat-vicuna.sh
chat.sh
CMakeLists.txt
convert_legacy_llama.py
json_schema_pydantic_example.py
json_schema_to_grammar.py
llama.vim
llm.vim
Miku.sh
pydantic_models_to_grammar.py
pydantic_models_to_grammar_examples.py
reason-act.sh
regex_to_grammar.py
server-llama2-13B.sh
server_embd.py  |  llama : fix FA when KV cache is not used (i.e. embeddings) (#12825)  |  2025-04-08 19:54:51 +03:00
ts-type-to-grammar.sh