I'm experimenting with **function calling** against my network monitoring service.
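As a rough sketch of what that experimentation looks like, here is a tool definition in the OpenAI function-calling schema plus a local dispatcher. The `check_host` function, its parameters, and the stubbed result are all hypothetical illustrations, not the actual Quantum Network Monitor API.

```python
import json

# Hypothetical tool schema in the OpenAI function-calling format.
# The function name and parameters are illustrative only.
TOOLS = [
    {
        "type": "function",
        "function": {
            "name": "check_host",
            "description": "Ping a host and report whether it is reachable.",
            "parameters": {
                "type": "object",
                "properties": {
                    "host": {
                        "type": "string",
                        "description": "Hostname or IP address to check.",
                    },
                },
                "required": ["host"],
            },
        },
    }
]

def dispatch(name: str, arguments: str) -> str:
    """Route a model-issued tool call to a local handler; return a JSON result."""
    args = json.loads(arguments)
    if name == "check_host":
        # Stubbed result; a real handler would query the monitoring service.
        return json.dumps({"host": args["host"], "reachable": True})
    raise ValueError(f"unknown tool: {name}")

# Simulate the tool call a model might emit in a chat completion:
result = dispatch("check_host", '{"host": "example.com"}')
print(result)  # {"host": "example.com", "reachable": true}
```

The model only ever sees the schema and the JSON result string; the dispatcher is where your own service code runs.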
### Other Available AI Assistants
🟢 **TurboLLM** – Uses **gpt-4o-mini**. Fast! Note: tokens are limited since OpenAI models are pricey, but you can [Login](https://readyforquantum.com) or [Download](https://readyforquantum.com/download/?utm_source=huggingface&utm_medium=referral&utm_campaign=huggingface_repo_readme) the Quantum Network Monitor agent to get more tokens. Alternatively, use the TestLLM.
🔵 **HugLLM** – Runs **open-source Hugging Face models**. Fast, but runs small models (≈8B), hence lower quality. Get 2x more tokens (subject to Hugging Face API availability).
### Final word
I fund the servers that create the model files, run the Quantum Network Monitor service, and pay for inference from Novita and OpenAI, all out of my own pocket. All of the code for creating the models and the work I have done with Quantum Network Monitor is [open source](https://github.com/Mungert69). Feel free to use whatever you find useful. Please support my work and consider [buying me a coffee](https://www.buymeacoffee.com/mahadeva).
This will help me pay for the services and increase the token limits for everyone.
Thank you :)