---
license: apache-2.0
base_model:
- TIGER-Lab/VL-Rethinker-7B
base_model_relation: quantized
pipeline_tag: visual-question-answering
tags:
- chat
- mlx
- apple
- 4bit
- multimodal
language:
- en
library_name: mlx
---
|
# VL-Rethinker-7B 4-bit MLX |
|
|
|
This model was converted to MLX format from [`TIGER-Lab/VL-Rethinker-7B`](https://huggingface.co/TIGER-Lab/VL-Rethinker-7B) using mlx-vlm version **0.1.23**. |
|
|
|
Refer to the [original model card](https://huggingface.co/TIGER-Lab/VL-Rethinker-7B) and [**📖Paper**](https://arxiv.org/abs/2504.08837) for more details on the model. |
|
|
|
|
|
## Use with mlx |
|
|
|
Install the `mlx-vlm` package:

```bash
pip install -U mlx-vlm
```
|
|
|
Then generate a response for an image from the command line:

```bash
python -m mlx_vlm.generate --model TheCluster/VL-Rethinker-7B-mlx-4bit --max-tokens 512 --temperature 0.0 --prompt "Describe this image." --image <path_to_image>
```
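You can also call the model from Python instead of the CLI. The sketch below follows the usage pattern from the mlx-vlm README; function signatures may differ between mlx-vlm releases, and the image path is a placeholder you should replace with your own file.

```python
# Sketch of mlx-vlm's Python API (pattern from the mlx-vlm README;
# exact signatures may vary across versions).
from mlx_vlm import load, generate
from mlx_vlm.prompt_utils import apply_chat_template
from mlx_vlm.utils import load_config

model_path = "TheCluster/VL-Rethinker-7B-mlx-4bit"
model, processor = load(model_path)  # downloads the weights on first use
config = load_config(model_path)

images = ["path/to/image.jpg"]  # placeholder: point this at a real image
prompt = "Describe this image."

# Wrap the prompt in the model's chat template before generating.
formatted = apply_chat_template(processor, config, prompt, num_images=len(images))
output = generate(model, processor, formatted, images, max_tokens=512, temperature=0.0)
print(output)
```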