# huihui-ai/DeepSeek-V3-0324-bf16

This model was converted from DeepSeek-V3-0324 to BF16.
Therefore, we have provided only the conversion command for Windows and information related to ollama.

The Windows environment is much faster than the WSL environment, provided you have sufficient memory or virtual memory. The Linux environment hasn't been tested.

If you are in a Linux or WSL environment, please refer to [huihui-ai/DeepSeek-R1-bf16](https://huggingface.co/huihui-ai/DeepSeek-R1-bf16).

If needed, we can upload the bf16 version.

## FP8 to BF16
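To give a rough picture of what an FP8-to-BF16 conversion involves: the released DeepSeek-V3 weights are block-quantized, so upcasting means multiplying each weight tile by its stored per-block inverse scale (the real conversion is done by the script published alongside DeepSeek-V3, not by this sketch). The toy example below illustrates only the block-wise dequantization step; NumPy has no FP8 or BF16 dtype, so `float32` stands in for the upcast target, 2x2 blocks stand in for the checkpoint's much larger tiles, and the name `dequantize_blockwise` is hypothetical.

```python
import numpy as np

BLOCK = 2  # toy block size; real checkpoints use much larger tiles

def dequantize_blockwise(w_q: np.ndarray, scale_inv: np.ndarray,
                         block: int = BLOCK) -> np.ndarray:
    """Multiply each (block x block) tile of w_q by its per-block scale."""
    out = w_q.astype(np.float32).copy()
    rows, cols = w_q.shape
    for i in range(0, rows, block):
        for j in range(0, cols, block):
            out[i:i + block, j:j + block] *= scale_inv[i // block, j // block]
    return out

# Toy "quantized" 4x4 weight with one inverse scale per 2x2 block.
w_q = np.ones((4, 4), dtype=np.float32)
scale_inv = np.array([[0.5, 2.0],
                      [4.0, 0.25]], dtype=np.float32)
w_bf16 = dequantize_blockwise(w_q, scale_inv)
print(w_bf16[0, 0], w_bf16[0, 2], w_bf16[2, 0], w_bf16[2, 2])  # 0.5 2.0 4.0 0.25
```

In an actual conversion the dequantized tensors would then be cast to BF16 and written back out as safetensors shards; this sketch only shows why the per-block scale tensors must travel with the FP8 weights.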