#6 — config.json is missing (7 comments, opened about 1 year ago by PierreCarceller)

#4 — Can't load model in LlamaCpp (7 comments, opened over 1 year ago by ThoilGoyang)
#3 — Seems can not use response_format in llama-cpp-python (1 comment, opened over 1 year ago by svjack)

#2 — Another <EOS_TOKEN> issue (1 comment, opened over 1 year ago by alexcardo)
#1 — LM Studio Error: "llama.cpp error: 'error loading model vocabulary: unknown pre-tokenizer type: 'command-r''" (1 comment, opened over 1 year ago by rodion-m)