Overriding torch_dtype=None with torch_dtype=torch.float16 due to requirements of bitsandbytes to enable model loading in mixed int8. Either pass torch_dtype=torch.float16 or don't pass this argument at all to remove this warning.
#235 · Open
sanwei111 opened this issue on May 19, 2023 · 0 comments
I'm loading the model in int8 — do I need to change the torch_dtype parameter?
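For context, the warning in the title appears when a model is loaded in 8-bit without an explicit dtype: bitsandbytes forces the non-quantized parts to fp16, so `transformers` overrides `torch_dtype=None` and warns. A minimal sketch of the two ways to silence it (note: this needs `transformers`, `accelerate`, and `bitsandbytes` installed plus a CUDA GPU, and `facebook/opt-350m` is just an example checkpoint, not one from this issue):

```python
import torch
from transformers import AutoModelForCausalLM

# Option 1: pass torch_dtype=torch.float16 explicitly, matching what
# int8 loading forces anyway — the warning goes away.
model = AutoModelForCausalLM.from_pretrained(
    "facebook/opt-350m",
    load_in_8bit=True,          # quantize linear-layer weights to int8
    device_map="auto",
    torch_dtype=torch.float16,  # explicit, so no override warning
)

# Option 2: omit torch_dtype entirely (don't pass torch_dtype=None);
# the library then picks float16 on its own without warning.
model = AutoModelForCausalLM.from_pretrained(
    "facebook/opt-350m",
    load_in_8bit=True,
    device_map="auto",
)
```

So the short answer to the question above: no change to the weights is needed — either pass `torch_dtype=torch.float16` or drop the argument; the int8 path behaves the same either way, as the warning text itself says.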