
ModuleNotFoundError: No module named 'transformers_modules.Qwen-7B-Chat' #10017

ILoveAmy opened this issue Jan 28, 2024 · 5 comments

@ILoveAmy

Running the Qwen-7B-Chat model gives the error: ModuleNotFoundError: No module named 'transformers_modules.Qwen-7B-Chat'


Steps to reproduce:

  1. Download Qwen-7B-Chat from: https://www.modelscope.cn/models/qwen/Qwen-7B-Chat/files

git clone https://www.modelscope.cn/qwen/Qwen-7B-Chat.git

  2. Set up the environment as follows:
conda create -n llm python=3.9
conda activate llm

pip install bigdl-llm[all] # install bigdl-llm with 'all' option
pip install tiktoken einops transformers_stream_generator  # additional package required for Qwen-7B-Chat to conduct generation

refer to: https://github.com/intel-analytics/BigDL/tree/main/python/llm/example/CPU/HF-Transformers-AutoModels/Model/qwen

  3. Run generate.py:

python generate.py --repo-id-or-model-path d:\Qwen\Qwen-7B-Chat

Then you will get the error:
ModuleNotFoundError: No module named 'transformers_modules.Qwen-7B-Chat'
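For context, the linked BigDL example loads the model with trust_remote_code=True, which is what pulls in the transformers_modules.* dynamic module named in the error. A minimal sketch of that flow (illustrative only; the prompt and generation settings here are mine, not the exact generate.py):

from bigdl.llm.transformers import AutoModelForCausalLM
from transformers import AutoTokenizer

model_path = r"d:\Qwen\Qwen-7B-Chat"

# trust_remote_code=True tells transformers to import the model's custom
# Python code as a dynamic module named after the model folder
# (transformers_modules.Qwen-7B-Chat), which is the module the error
# says it cannot find.
model = AutoModelForCausalLM.from_pretrained(model_path,
                                             load_in_4bit=True,
                                             trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

prompt = "What is AI?"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids
output = model.generate(input_ids, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))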

@jason-dai
Contributor

Looks like transformers version mismatch
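A quick way to check which transformers version is actually installed (a generic snippet, not from the thread):

import importlib.metadata

# Print the installed transformers version; compare it against what the
# model card lists (Qwen-7B-Chat's Dependencies section asks for 4.32.0).
print(importlib.metadata.version("transformers"))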

@ILoveAmy
Author

Looks like transformers version mismatch
Thank you @jason-dai !
Here is the version of transformers installed by the command: "pip install bigdl-llm[all]"

(qwen) C:\Users\OV>pip list
Package                       Version
----------------------------- ----------
accelerate                    0.21.0
bigdl-llm                     2.4.0
certifi                       2023.11.17
charset-normalizer            3.3.2
colorama                      0.4.6
einops                        0.7.0
filelock                      3.13.1
fsspec                        2023.12.2
huggingface-hub               0.20.3
idna                          3.6
intel-openmp                  2024.0.2
Jinja2                        3.1.3
MarkupSafe                    2.1.4
mpmath                        1.3.0
networkx                      3.2.1
numpy                         1.26.3
packaging                     23.2
pip                           23.3.1
protobuf                      4.25.2
psutil                        5.9.8
py-cpuinfo                    9.0.0
PyYAML                        6.0.1
regex                         2023.12.25
requests                      2.31.0
safetensors                   0.4.2
sentencepiece                 0.1.99
setuptools                    68.2.2
sympy                         1.12
tabulate                      0.9.0
tiktoken                      0.5.2
tokenizers                    0.13.3
torch                         2.1.2
tqdm                          4.66.1
transformers                  4.31.0
transformers-stream-generator 0.0.4
typing_extensions             4.9.0
urllib3                       2.1.0

I tried Qwen-1_8B-Chat, and it works:
(screenshot: Qwen-1_8B-Chat running successfully)

So, which version of transformers should I use?

@ILoveAmy
Author

Looks like transformers version mismatch

I changed the version of transformers to 4.32.0 according to the Dependencies (依赖项) section of the model card.

The error is still there: ModuleNotFoundError: No module named 'transformers_modules.Qwen-7B-Chat'

@ILoveAmy
Author

[Problem Solved]

There is a folder xxxx.cache\huggingface\modules\transformers_modules; inside it I renamed qwen-7b-chat to Qwen-7B-Chat, and the problem was solved.
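For anyone hitting this later: transformers caches a model's remote code under that folder, keyed by the model directory's name, so a leftover entry with different casing breaks the import. A minimal sketch for inspecting (and, if needed, removing) the stale entry; the folder names are the ones from this thread, and the path assumes the default cache location:

from pathlib import Path

# Default location of transformers' dynamic-module cache (the folder
# mentioned above); if HF_HOME or HF_MODULES_CACHE is set, adjust
# accordingly.
modules = Path.home() / ".cache" / "huggingface" / "modules" / "transformers_modules"
for entry in modules.iterdir():
    print(entry.name)  # look for a stale, differently-cased qwen-7b-chat entry

# Removing the stale entry makes transformers regenerate it with the
# current folder name on the next trust_remote_code load:
#   import shutil; shutil.rmtree(modules / "qwen-7b-chat")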

The running result is:
(screenshot: Qwen-7B-Chat generating output successfully)

@shane-huang
Contributor

Since the problem is solved, can we close this issue?
