Text Classification
Transformers
Safetensors
English
bert
fill-mask
BERT
MNLI
NLI
transformer
pre-training
nlp
tiny-bert
edge-ai
low-resource
micro-nlp
quantized
iot
wearable-ai
offline-assistant
intent-detection
real-time
smart-home
embedded-systems
command-classification
toy-robotics
voice-ai
eco-ai
english
lightweight
mobile-nlp
Update README.md
README.md CHANGED
@@ -210,7 +210,7 @@ Input: Please [MASK] the door before leaving.
 - **Training Data**: Trained on Wikipedia, BookCorpus, MNLI, and sentence-transformers/all-nli for broad and specialized NLP strength.
 - **Key Strength**: Combines extreme efficiency with balanced performance, perfect for edge and general NLP tasks.
 - **Use Cases**: Versatile across IoT, wearables, smart homes, and moderate hardware, supporting real-time and offline applications.
-- **Accuracy**: Competitive with larger models, achieving ~90-
+- **Accuracy**: Competitive with larger models, achieving ~90-97% of BERT-base's performance (task-dependent).
 - **Contextual Understanding**: Strong bidirectional context, adept at disambiguating meanings in real-world scenarios.
 - **License**: MIT License (or Apache 2.0 compatible), free to use, modify, and share for all users.
 - **Release Context**: v1.1, released April 04, 2025, reflecting cutting-edge lightweight design.
@@ -245,7 +245,7 @@ Input: Please [MASK] the door before leaving.
 - MIT License offers unrestricted freedom to use, modify, and share, slightly more permissive than `bert-mini`'s typical Apache 2.0.

 - **Competitive Accuracy**
-- Matches `bert-mini`'s ~90-
+- Matches `bert-mini`'s ~90-97% of BERT-base performance, but with a custom design that excels in edge-specific tasks like NLI.

 - **Future-Ready**
 - Built for the next wave of AI (think IoT and real-time NLP), making it more forward-looking than the general-purpose `bert-mini`.
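The hunk headers above reference the card's fill-mask example (`Input: Please [MASK] the door before leaving.`). As a minimal sketch of how such a checkpoint is typically exercised with the Transformers `fill-mask` pipeline: this excerpt does not show the model's actual repo id, so the stock `bert-base-uncased` masked-LM is used as a stand-in.

```python
from transformers import pipeline

# Stand-in checkpoint: the diff excerpt does not name this model's repo id,
# so we illustrate with the stock bert-base-uncased masked-LM.
unmasker = pipeline("fill-mask", model="bert-base-uncased")

# The example sentence from the model card's diff context.
results = unmasker("Please [MASK] the door before leaving.")

# Each candidate is a dict with the predicted token and its probability.
for r in results:
    print(f"{r['token_str']!r}: {r['score']:.3f}")
```

Top candidates are typically verbs such as "close" or "lock"; a tiny or distilled variant would run the same call with a smaller footprint, at the accuracy trade-off the bullets above describe.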