Open
Description
Running the Qwen-7B-Chat model gives the error: ModuleNotFoundError: No module named 'transformers_modules.Qwen-7B-Chat'
Steps to reproduce:
- Download Qwen-7B-Chat from: https://www.modelscope.cn/models/qwen/Qwen-7B-Chat/files
- Set up the environment as follows:
conda create -n llm python=3.9
conda activate llm
pip install bigdl-llm[all] # install bigdl-llm with 'all' option
pip install tiktoken einops transformers_stream_generator # additional package required for Qwen-7B-Chat to conduct generation
- Run generate.py:
python generate.py --repo-id-or-model-path d:\Qwen\Qwen-7B-Chat
Then you will get the error:
No module named 'transformers_modules.Qwen-7B-Chat'
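For context, when a model is loaded with remote code enabled, transformers derives the dynamic module name from the last component of the model path, so that name is case-sensitive. A simplified sketch of the derivation (the real logic lives in transformers' dynamic_module_utils; the path is the one from the reproduce step):

```python
from pathlib import PureWindowsPath

def dynamic_module_name(model_path: str) -> str:
    # Simplified sketch: the last path component becomes a submodule of
    # 'transformers_modules', so it must match (case-sensitively) the
    # folder cached under ~/.cache/huggingface/modules/transformers_modules.
    return "transformers_modules." + PureWindowsPath(model_path).name

print(dynamic_module_name(r"d:\Qwen\Qwen-7B-Chat"))
# → transformers_modules.Qwen-7B-Chat
```

If the cached folder name differs in case from this derived name, the import of the model's remote code fails with exactly the ModuleNotFoundError above.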
Activity
jason-dai commented on Jan 28, 2024
Looks like transformers version mismatch

ILoveAmy commented on Jan 28, 2024
I tried Qwen-1_8B-Chat, and it works.

So, which version of transformers should I use?
ILoveAmy commented on Jan 28, 2024
I changed the transformers version to 4.32.0 according to the Dependency (依赖项) section.

The error is still there: ModuleNotFoundError: No module named 'transformers_modules.Qwen-7B-Chat'

ILoveAmy commented on Jan 28, 2024
[Problem Solved]
There is a folder xxxx.cache\huggingface\modules\transformers_modules; I renamed its qwen-7b-chat subfolder to Qwen-7B-Chat, and the problem was solved.
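The rename can also be scripted. A minimal sketch, assuming the default Hugging Face cache layout (the helper name and the cache root in the usage comment are hypothetical):

```python
from pathlib import Path

def fix_module_case(cache_root: Path, wrong: str, right: str) -> bool:
    """Rename a transformers_modules subfolder so its name matches the
    case-sensitive module name that the model's remote code imports."""
    src = cache_root / "modules" / "transformers_modules" / wrong
    dst = src.with_name(right)
    # On case-insensitive filesystems (e.g. NTFS) dst may "exist" as src
    # itself; samefile() still allows the case-only rename in that case.
    if src.is_dir() and (not dst.exists() or src.samefile(dst)):
        src.rename(dst)
        return True
    return False

# Hypothetical usage mirroring the manual fix above:
# fix_module_case(Path.home() / ".cache" / "huggingface",
#                 "qwen-7b-chat", "Qwen-7B-Chat")
```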

The running result is:

shane-huang commented on Jan 31, 2024
Since the problem is solved, can we close this issue?