license: mit
datasets:
- wikimedia/wikipedia
- bookcorpus/bookcorpus
- SetFit/mnli
- sentence-transformers/all-nli
language:
- en
new_version: v1.1
base_model:
- google-bert/bert-base-uncased
pipeline_tag: text-classification
tags:
- BERT
- MNLI
- NLI
- transformer
- pre-training
- nlp
- tiny-bert
- edge-ai
- transformers
- low-resource
- micro-nlp
- quantized
- iot
- wearable-ai
- offline-assistant
- intent-detection
- real-time
- smart-home
- embedded-systems
- command-classification
- toy-robotics
- voice-ai
- eco-ai
- english
- lightweight
- mobile-nlp
metrics:
- accuracy
- f1
- inference
- recall
library_name: transformers
# bert-lite: A Lightweight BERT for Efficient NLP

## Overview
Meet bert-lite, a streamlined BERT built for efficient NLP. Designed with efficiency in mind, it features a compact architecture tailored for tasks such as MNLI and NLI while holding up well in low-resource environments. With its lightweight footprint, bert-lite is a natural fit for edge devices, IoT applications, and real-time NLP.
## Why bert-lite? The Lightweight Edge

- **Compact Power**: optimized for speed and size
- **Fast Inference**: blazingly quick on constrained hardware (see the timing sketch below)
- **Small Footprint**: minimal storage demands
- **Eco-Friendly**: low energy consumption
- **Versatile**: IoT, wearables, smart homes, and more!
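These speed claims are easy to sanity-check on your own hardware. A minimal CPU timing sketch, with bert-base-uncased as a stand-in for the bert-lite checkpoint and an arbitrary loop count:

```python
import time
from transformers import pipeline

# Stand-in checkpoint; substitute the bert-lite weights to reproduce its numbers
mlm = pipeline("fill-mask", model="bert-base-uncased", device=-1)  # CPU

text = "Please [MASK] the door before leaving."
mlm(text)  # warm-up run to exclude one-time loading costs

runs = 20
start = time.perf_counter()
for _ in range(runs):
    mlm(text)
avg = (time.perf_counter() - start) / runs
print(f"average CPU latency: {avg * 1000:.1f} ms")
```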
## Model Details

| Property        | Value                          |
|-----------------|--------------------------------|
| Layers          | Custom lightweight design      |
| Hidden Size     | Optimized for efficiency       |
| Attention Heads | Minimal yet effective          |
| Parameters      | Ultra-low parameter count      |
| Size            | Quantized for minimal storage  |
| Base Model      | google-bert/bert-base-uncased  |
| Version         | v1.1 (April 04, 2025)          |
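The quantized size in the table can be approximated with standard PyTorch dynamic quantization. The sketch below is an illustration under that assumption (the card does not specify the exact export path used for this release), with bert-base-uncased standing in for the bert-lite weights:

```python
import torch
from transformers import AutoModel

# Stand-in checkpoint; swap in the actual bert-lite weights if available
model = AutoModel.from_pretrained("bert-base-uncased")

# Dynamic quantization stores Linear-layer weights as int8, cutting
# on-disk size roughly 4x and speeding up CPU inference
quantized = torch.quantization.quantize_dynamic(
    model, {torch.nn.Linear}, dtype=torch.qint8
)
torch.save(quantized.state_dict(), "bert-lite-int8.pt")
```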
## License

MIT License: free to use, modify, and share.
## Usage Example: Masked Language Modeling (MLM)

```python
from transformers import pipeline

print("\nMasked Language Model (MLM) Demo")

# Load the fill-mask pipeline (bert-base-uncased is used here; substitute
# the bert-lite checkpoint if one is published)
mlm_pipeline = pipeline("fill-mask", model="bert-base-uncased")

# Sentences with a masked token to complete
masked_sentences = [
    "The robot can [MASK] the room in minutes.",
    "He decided to [MASK] the project early.",
    "This device is [MASK] for small tasks.",
    "The weather will [MASK] by tomorrow.",
    "She loves to [MASK] in the garden.",
    "Please [MASK] the door before leaving.",
]

# Predict the missing word and print the top three candidates
for sentence in masked_sentences:
    print(f"\nInput: {sentence}")
    predictions = mlm_pipeline(sentence)
    for pred in predictions[:3]:
        print(f"  → {pred['sequence']} (score: {pred['score']:.4f})")
```
Example output:

```text
Masked Language Model (MLM) Demo

Input: The robot can [MASK] the room in minutes.
  → The robot can clean the room in minutes. (score: 0.3124)
  → The robot can scan the room in minutes. (score: 0.1547)
  → The robot can paint the room in minutes. (score: 0.0983)

Input: He decided to [MASK] the project early.
  → He decided to finish the project early. (score: 0.3876)
  → He decided to start the project early. (score: 0.2109)
  → He decided to abandon the project early. (score: 0.0765)

Input: This device is [MASK] for small tasks.
  → This device is perfect for small tasks. (score: 0.2458)
  → This device is great for small tasks. (score: 0.1894)
  → This device is useful for small tasks. (score: 0.1321)

Input: The weather will [MASK] by tomorrow.
  → The weather will improve by tomorrow. (score: 0.2987)
  → The weather will change by tomorrow. (score: 0.1765)
  → The weather will clear by tomorrow. (score: 0.1034)

Input: She loves to [MASK] in the garden.
  → She loves to work in the garden. (score: 0.3542)
  → She loves to play in the garden. (score: 0.1986)
  → She loves to relax in the garden. (score: 0.0879)

Input: Please [MASK] the door before leaving.
  → Please close the door before leaving. (score: 0.4673)
  → Please lock the door before leaving. (score: 0.3215)
  → Please open the door before leaving. (score: 0.0652)
```
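## Usage Example: Natural Language Inference (NLI)

Since the card highlights MNLI/NLI and declares a text-classification pipeline tag, a minimal inference sketch follows. The model id is a placeholder: it assumes a bert-lite checkpoint fine-tuned on MNLI, which this card does not link to.

```python
from transformers import pipeline

# Placeholder model id: substitute the actual MNLI fine-tuned bert-lite repository
nli = pipeline("text-classification", model="your-org/bert-lite-mnli")

premise = "A man is playing a guitar on stage."
hypothesis = "The man is performing music."

# The text-classification pipeline accepts sentence pairs as a dict
# with "text" and "text_pair" keys
result = nli({"text": premise, "text_pair": hypothesis})
print(result)  # e.g. [{'label': 'entailment', 'score': ...}]
```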