How do I make ModelScope use a local model? When running Qwen with vLLM, I have already downloaded the model locally, but every run still tries to re-download it from the internet:

2023-12-21 20:43:52,734 - modelscope - INFO - PyTorch version 2.1.2+cu118 Found.
2023-12-21 20:43:52,735 - modelscope - INFO - Loading ast index from /mnt/workspace/.cache/modelscope/ast_indexer
2023-12-21 20:43:52,776 - modelscope - INFO - Loading done! Current index file version is 1.10.0, with md5 a8ea39d711014973295fd151a3c1d37d and a total number of 946 components indexed
2023-12-21 20:43:53,664 - modelscope - ERROR - Authentication token does not exist, failed to access model /mnt/workspace/algteam/xufeng/models/qwen/Qwen-72B-Chat-Int4 which may not exist or may be private. Please login first.
Traceback (most recent call last):
File "/opt/conda/envs/vllm2/lib/python3.9/site-packages/modelscope/hub/errors.py", line 91, in handle_http_response
response.raise_for_status()
File "/opt/conda/envs/vllm2/lib/python3.9/site-packages/requests/models.py", line 1021, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 404 Client Error: Not Found for url:
I am using the official example code with only the model path changed. It clearly insists on fetching the model online, even though I have already downloaded it locally.
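The traceback shows ModelScope's hub client treating the local path as a remote model id (hence the 404 and the "Please login first" error). This typically happens when the `VLLM_USE_MODELSCOPE` environment variable is set, which makes vLLM route the `model` argument through ModelScope's `snapshot_download`. One workaround, sketched below, is to resolve the path yourself before handing it to vLLM: if the argument is an existing local directory, pass it straight through so no hub lookup ever happens; only otherwise fall back to a download. The helper name `resolve_model_path` is my own for illustration, not part of either library.

```python
import os


def resolve_model_path(path_or_id: str) -> str:
    """Return a local model directory as-is; otherwise fetch it from the hub.

    A genuine local directory is passed through untouched, so vLLM loads it
    directly and ModelScope's hub client is never consulted.
    """
    if os.path.isdir(path_or_id):
        return path_or_id
    # Not a local path: fall back to downloading from ModelScope.
    # (Deferred import so this module works without a hub round-trip.)
    from modelscope import snapshot_download
    return snapshot_download(path_or_id)


# Usage sketch (requires vllm and a suitable GPU; shown as comments here):
# from vllm import LLM
# llm = LLM(
#     model=resolve_model_path(
#         "/mnt/workspace/algteam/xufeng/models/qwen/Qwen-72B-Chat-Int4"),
#     trust_remote_code=True,
# )
```

Also check that `VLLM_USE_MODELSCOPE` is not set (`unset VLLM_USE_MODELSCOPE`) when passing a local directory, since with it enabled vLLM may attempt the hub resolution regardless.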