Error running this model on vLLM 0.10.2

#1
opened by koushd

Running this model on vLLM 0.10.2 fails during startup with:

vllm-distributed | ERROR 09-15 12:28:53 [launch.py:516] Error occurred while running method 'determine_available_memory': CUDA error: no kernel image is available for execution on the device

GPU: RTX 6000 Pro
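
For what it's worth, this error usually means the installed PyTorch/vLLM build does not include kernels compiled for the GPU's compute capability (the RTX 6000 Pro is presumably a Blackwell-generation part). A minimal check of what the current PyTorch install supports, assuming a standard CUDA-enabled wheel:

```python
# Minimal diagnostic sketch: compare the GPU's compute capability with the
# architectures compiled into the installed PyTorch build. If the GPU's
# sm_XY value is absent from the arch list, its kernels cannot run and
# vLLM fails with "no kernel image is available for execution on the device".
import torch

major, minor = torch.cuda.get_device_capability(0)
print(f"Device: {torch.cuda.get_device_name(0)}")
print(f"Compute capability: sm_{major}{minor}")
print(f"Architectures in this PyTorch build: {torch.cuda.get_arch_list()}")
```

If the reported sm_XY value is missing from the printed architecture list, the likely fix is installing PyTorch/vLLM builds compiled with support for that architecture rather than changing vLLM's configuration.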
