---
license: mit
datasets:
- wikimedia/wikipedia
- bookcorpus/bookcorpus
- SetFit/mnli
- sentence-transformers/all-nli
language:
- en
new_version: v1.1
base_model:
- google-bert/bert-base-uncased
pipeline_tag: text-classification
tags:
- BERT
- MNLI
- NLI
- transformer
- pre-training
- nlp
- tiny-bert
- edge-ai
- transformers
- low-resource
- micro-nlp
- quantized
- iot
- wearable-ai
- offline-assistant
- intent-detection
- real-time
- smart-home
- embedded-systems
- command-classification
- toy-robotics
- voice-ai
- eco-ai
- english
- lightweight
- mobile-nlp
metrics:
- accuracy
- f1
- recall
library_name: transformers
---

# bert-lite: A Lightweight BERT for Efficient NLP
## Overview
Meet **bert-lite**, a streamlined BERT built with efficiency in mind. This model features a compact architecture tailored for tasks like **MNLI** and **NLI**, while excelling in low-resource environments. With its lightweight footprint, `bert-lite` is well suited to edge devices, IoT applications, and real-time NLP needs.
---
## Why bert-lite? The Lightweight Edge
- **Compact Power**: Optimized for speed and size
- **Fast Inference**: Quick even on constrained hardware
- **Small Footprint**: Minimal storage demands
- **Eco-Friendly**: Low energy consumption
- **Versatile**: IoT, wearables, smart homes, and more!
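
The storage savings behind the "quantized" and "small footprint" claims come down to bytes per weight. A minimal sketch of the arithmetic, using an assumed 11M-parameter count purely for illustration (not this model's published figure):

```python
def model_size_mb(num_params: int, bytes_per_param: int) -> float:
    """Approximate on-disk checkpoint size, ignoring metadata overhead."""
    return num_params * bytes_per_param / (1024 ** 2)

# Hypothetical 11M-parameter tiny BERT (assumed figure for illustration)
params = 11_000_000
fp32_mb = model_size_mb(params, 4)  # float32: 4 bytes per weight
int8_mb = model_size_mb(params, 1)  # int8-quantized: 1 byte per weight
print(f"fp32: {fp32_mb:.1f} MB, int8: {int8_mb:.1f} MB")
```

Dropping from float32 to int8 cuts storage roughly 4x, which is what makes quantized checkpoints practical on microcontroller-class flash.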
---
## Model Details
| Property        | Value                          |
|-----------------|--------------------------------|
| Layers          | Custom lightweight design      |
| Hidden Size     | Optimized for efficiency       |
| Attention Heads | Minimal yet effective          |
| Parameters      | Ultra-low parameter count      |
| Size            | Quantized for minimal storage  |
| Base Model      | google-bert/bert-base-uncased  |
| Version         | v1.1 (April 04, 2025)          |
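
The table leaves the exact architecture unspecified, but a BERT variant's parameter count follows directly from its config. A minimal estimator sketch; the `hidden=256, layers=4` tiny config below is an assumed example, not this model's published architecture:

```python
def bert_param_count(vocab=30522, max_pos=512, type_vocab=2,
                     hidden=768, layers=12, intermediate=3072):
    """Estimate weights + biases for a BERT-style encoder."""
    # Token, position, and segment embeddings, plus embedding LayerNorm
    emb = (vocab + max_pos + type_vocab) * hidden + 2 * hidden
    # Q, K, V, and output projections (weight matrix + bias each)
    attn = 4 * (hidden * hidden + hidden)
    # Two feed-forward projections (up and down) with biases
    ffn = hidden * intermediate + intermediate + intermediate * hidden + hidden
    # Two LayerNorms per layer (scale + shift each)
    norms = 2 * (2 * hidden)
    # Pooler head on the [CLS] token
    pooler = hidden * hidden + hidden
    return emb + layers * (attn + ffn + norms) + pooler

base = bert_param_count()  # bert-base-uncased config, ~110M
tiny = bert_param_count(hidden=256, layers=4, intermediate=1024)  # ~11M
print(f"base: {base/1e6:.0f}M, tiny: {tiny/1e6:.1f}M")
```

Shrinking hidden size and depth compounds: the assumed tiny config above lands near a tenth of bert-base's parameter count, which is the lever this family of models pulls.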
---
## License
MIT License: free to use, modify, and share.
## Usage Example: Masked Language Modeling (MLM)
```python
from transformers import pipeline
# Start demo
print("\nMasked Language Model (MLM) Demo")

# Load the fill-mask pipeline on the base model
mlm_pipeline = pipeline("fill-mask", model="bert-base-uncased")

# Masked sentences
masked_sentences = [
    "The robot can [MASK] the room in minutes.",
    "He decided to [MASK] the project early.",
    "This device is [MASK] for small tasks.",
    "The weather will [MASK] by tomorrow.",
    "She loves to [MASK] in the garden.",
    "Please [MASK] the door before leaving.",
]

# Print the top three predictions for each masked sentence
for sentence in masked_sentences:
    print(f"\nInput: {sentence}")
    predictions = mlm_pipeline(sentence)
    for pred in predictions[:3]:
        print(f"→ {pred['sequence']} (score: {pred['score']:.4f})")
```
---
## Masked Language Model (MLM) Demo
Sample output:
```
Input: The robot can [MASK] the room in minutes.
→ The robot can clean the room in minutes. (score: 0.3124)
→ The robot can scan the room in minutes. (score: 0.1547)
→ The robot can paint the room in minutes. (score: 0.0983)

Input: He decided to [MASK] the project early.
→ He decided to finish the project early. (score: 0.3876)
→ He decided to start the project early. (score: 0.2109)
→ He decided to abandon the project early. (score: 0.0765)

Input: This device is [MASK] for small tasks.
→ This device is perfect for small tasks. (score: 0.2458)
→ This device is great for small tasks. (score: 0.1894)
→ This device is useful for small tasks. (score: 0.1321)

Input: The weather will [MASK] by tomorrow.
→ The weather will improve by tomorrow. (score: 0.2987)
→ The weather will change by tomorrow. (score: 0.1765)
→ The weather will clear by tomorrow. (score: 0.1034)

Input: She loves to [MASK] in the garden.
→ She loves to work in the garden. (score: 0.3542)
→ She loves to play in the garden. (score: 0.1986)
→ She loves to relax in the garden. (score: 0.0879)

Input: Please [MASK] the door before leaving.
→ Please close the door before leaving. (score: 0.4673)
→ Please lock the door before leaving. (score: 0.3215)
→ Please open the door before leaving. (score: 0.0652)
```