Add support for transformers>=4.49

#12
No description provided.


Kaixuanliu changed pull request title from Update modeling_cogvlm.py to Add support for transformers>=4.49

Hi, can you help review?

@zRzRzRzRzRzRzR , could you help review? Thanks. @Kaixuanliu, could you explain a bit more about what you changed? The diff looks larger than expected, which I believe is an artifact of the diff tool.

Yes, the only change I made is to add special handling for transformers>=4.49.0 in the `_update_model_kwargs_for_generation` function; it does not affect any other part. We hit this issue ourselves, and I noticed someone else reported the same problem in this discussion: https://huggingface.co/THUDM/cogvlm2-llama3-caption/discussions/8
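For context, a version-gated branch like the one described can be sketched as below. This is a minimal illustration, not the actual diff from this PR; the helper names (`parse_version`, `needs_new_kwargs_path`) are hypothetical, and the only assumption taken from the thread is that the new code path applies from transformers 4.49.0 onward.

```python
def parse_version(v: str) -> tuple:
    """Hypothetical helper: turn '4.49.0' (or '4.49.0.dev0') into (4, 49, 0)."""
    parts = []
    for p in v.split(".")[:3]:
        num = ""
        for ch in p:
            if ch.isdigit():
                num += ch
            else:
                break  # stop at suffixes such as 'dev0' or 'rc1'
        parts.append(int(num or 0))
    while len(parts) < 3:
        parts.append(0)  # pad short versions like '4.49'
    return tuple(parts)


def needs_new_kwargs_path(transformers_version: str) -> bool:
    """Return True when the transformers>=4.49.0 handling should be used,
    e.g. inside _update_model_kwargs_for_generation:

        if needs_new_kwargs_path(transformers.__version__):
            ...  # new handling for >=4.49.0
        else:
            ...  # original behavior for older releases
    """
    return parse_version(transformers_version) >= (4, 49, 0)
```

Gating on the installed version like this keeps the original behavior intact for older releases, which matches the claim that the change does not affect other parts.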

It would be good to add a line of explanation about this to the README (the minimum version should be 4.51 now). Your changes are fairly large; please modify only the necessary code.

Sorry, I am not sure what you mean. Do you mean adding related comments to the file I changed? Could you merge this one and then add the related doc/comments yourself? Thanks.

