Add extended description, tags, and improved README formatting
This project demonstrates how to fine-tune and deploy a lightweight NLP model for news classification. Although trained on a small sample dataset, the model generalizes to basic categories reasonably well. You can fine-tune it further on larger, richer datasets such as ag_news.
Perfect for:
Students learning NLP
Fine-tuning transformers on custom tasks
Deploying small models with Hugging Face and 🤗 Transformers
README.md
CHANGED

---
license: apache-2.0
language:
- en
metrics:
- accuracy
base_model:
- distilbert-base-uncased
pipeline_tag: text-classification
library_name: transformers
tags:
- text-classification
- news
- distilbert
- huggingface
- ag_news
---

# 📰 DistilBERT News Classifier by Dheeraj

A text classification model fine-tuned on news headlines to classify them into one of four categories:

- `0 → Sports`
- `1 → Business`
- `2 → Tech`
- `3 → Science`

## 🧠 Model Info

This model is built using [DistilBERT](https://huggingface.co/distilbert-base-uncased) and fine-tuned using Hugging Face's `Trainer` API.
It is trained on a small dataset of sample headlines, inspired by AG News, for demonstration and educational purposes.

## 🚀 Example Usage

```python
from transformers import pipeline

classifier = pipeline("text-classification", model="Dheeraj3103/distilbert-news-classifier-dheeraj")
result = classifier("NASA finds evidence of water on Mars.")
print(result)
```