ModuleNotFoundError: No module named 'transformers_modules.Qwen-7B-Chat' #10017
Comments
Looks like
I tried Qwen-1_8B-Chat and it works. So, which version of transformers should I use?
I changed the version of transformers to 4.32.0 according to the 依赖项 (Dependency) section, but the error is still there: ModuleNotFoundError: No module named 'transformers_modules.Qwen-7B-Chat'
Since the problem is solved, can we close this issue?
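The 4.32.0 pin discussed above can be checked programmatically before loading the model. This is a minimal sketch, not part of the linked example; the `matches_pin` helper is hypothetical, and the exact-match policy is an assumption based on the dependency notes in this thread.

```python
from importlib.metadata import PackageNotFoundError, version

# The 4.32.0 pin comes from this thread's dependency notes; adjust it
# if the model card lists a different required release.
REQUIRED = "4.32.0"

def matches_pin(installed: str, required: str = REQUIRED) -> bool:
    # Exact match is intended: remote-code models often break on both
    # older and newer transformers releases, not just older ones.
    return installed == required

try:
    print(matches_pin(version("transformers")))
except PackageNotFoundError:
    print("transformers is not installed")
```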
Running the Qwen-7B-Chat model fails with the error: ModuleNotFoundError: No module named 'transformers_modules.Qwen-7B-Chat'
Reproduce steps:
Refer to: https://github.com/intel-analytics/BigDL/tree/main/python/llm/example/CPU/HF-Transformers-AutoModels/Model/qwen
Then you will get the error:
No module named 'transformers_modules.Qwen-7B-Chat'
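For context on where the missing module comes from: when a model is loaded with `trust_remote_code=True`, transformers copies the repository's custom model code into a local dynamic-module cache and imports it as `transformers_modules.<repo_name>`, so a stale or empty cache entry can surface exactly this `ModuleNotFoundError`. The sketch below only locates that cache directory; the `remote_code_cache` helper is hypothetical, and the default path is an assumption based on the usual Hugging Face cache layout (overridable via the `HF_MODULES_CACHE` environment variable).

```python
import os
from pathlib import Path

def remote_code_cache() -> Path:
    """Best-guess location of the trust_remote_code dynamic-module cache."""
    # Assumed default layout: ~/.cache/huggingface/modules; the
    # HF_MODULES_CACHE environment variable takes precedence if set.
    base = os.environ.get(
        "HF_MODULES_CACHE",
        str(Path.home() / ".cache" / "huggingface" / "modules"),
    )
    return Path(base) / "transformers_modules"

# Inspect what model repositories have cached remote code, if any.
cache = remote_code_cache()
print(cache)
if cache.is_dir():
    print(sorted(p.name for p in cache.iterdir()))
```

If the cache contains a broken or outdated `Qwen-7B-Chat` entry, deleting that subdirectory and re-downloading the model is a common workaround.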