The model is not working with either vLLM or the transformers library.
Running with vLLM throws ZeroDivisionError: integer modulo by zero, and running with the transformers repo throws TypeError: '>=' not supported between instances of 'Tensor' and 'NoneType'.
vLLM detailed error:
Transformers detailed error:
I have exactly the same error with transformers.
Hey @divyanshusingh , thanks for reporting the issue!
vLLM support is still in progress in the public package, but the changes were already merged to main: https://github.com/vllm-project/vllm/pull/17315 . A new version should be out by the end of the week.
For transformers, could you share which version of the library you are using?
Thanks
Hi @betodepaola , thank you so much for the quick response. I'm using version 4.52.0.dev0 of transformers.
Hi @divyanshusingh , for transformers you may install from main, or use the following stable release: https://github.com/huggingface/transformers/releases/tag/v4.51.3-LlamaGuard-preview
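For anyone hitting the same error, the two options above can be installed with pip's standard git URL syntax (a sketch, assuming a normal pip + git environment; the tag name is taken from the release link above):

```shell
# Option 1: install transformers from the main branch (latest development code)
pip install git+https://github.com/huggingface/transformers.git

# Option 2: install the stable LlamaGuard preview release tag
pip install git+https://github.com/huggingface/transformers.git@v4.51.3-LlamaGuard-preview

# Verify which version ended up installed
python -c "import transformers; print(transformers.__version__)"
```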