
ModelScope reports an error: what is the problem this time?

ModelScope reports an error: what is the problem this time?

RemoteTraceback                           Traceback (most recent call last)
RemoteTraceback:
"""
Traceback (most recent call last):
  File "/opt/conda/lib/python3.7/multiprocessing/pool.py", line 121, in worker
    result = (True, func(*args, **kwds))
  File "/opt/conda/lib/python3.7/multiprocessing/pool.py", line 44, in mapstar
    return list(map(*args))
  File "/opt/conda/lib/python3.7/site-packages/modelscope/pipelines/nlp/distributed_gpt3_pipeline.py", line 39, in _instantiate_one
    cls.model = DistributedGPT3(model_dir, rank, **kwargs)
  File "/opt/conda/lib/python3.7/site-packages/modelscope/models/nlp/gpt3/distributed_gpt3.py", line 1010, in __init__
    load_model = pre_load(ckpt_rank, model_dir, tag=path_load_tag)
  File "/opt/conda/lib/python3.7/site-packages/modelscope/utils/nlp/load_checkpoint.py", line 68, in pre_load
    load_path, map_location=lambda storage, loc: storage)
  File "/opt/conda/lib/python3.7/site-packages/torch/serialization.py", line 699, in load
    with _open_file_like(f, 'rb') as opened_file:
  File "/opt/conda/lib/python3.7/site-packages/torch/serialization.py", line 231, in _open_file_like
    return _open_file(name_or_buffer, mode)
  File "/opt/conda/lib/python3.7/site-packages/torch/serialization.py", line 212, in __init__
    super(_open_file, self).__init__(open(name, mode))
FileNotFoundError: [Errno 2] No such file or directory: '/mnt/workspace/.cache/modelscope/damo/nlp_gpt3_text-generation_13B/model/mp_rank_00_model_states.pt'
"""

The above exception was the direct cause of the following exception:

FileNotFoundError                         Traceback (most recent call last)
/opt/conda/lib/python3.7/site-packages/modelscope/utils/registry.py in build_from_cfg(cfg, registry, group_key, default_args)
    211             else:
--> 212                 return obj_cls(**args)
    213         except Exception as e:

/opt/conda/lib/python3.7/site-packages/modelscope/pipelines/nlp/distributed_gpt3_pipeline.py in __init__(self, model, preprocessor, **kwargs)
     33             preprocessor = TextGenerationJiebaPreprocessor(model)
---> 34         super().__init__(model, preprocessor=preprocessor, **kwargs)
     35         assert hasattr(preprocessor, 'tokenizer')

/opt/conda/lib/python3.7/site-packages/modelscope/pipelines/base.py in __init__(self, model, preprocessor, auto_collate, **kwargs)
    424                     **self.cfg.model,
--> 425                     **kwargs), ranks)
    426         self.models = []

/opt/conda/lib/python3.7/multiprocessing/pool.py in map(self, func, iterable, chunksize)
    267         '''
--> 268         return self._map_async(func, iterable, mapstar, chunksize).get()
    269

/opt/conda/lib/python3.7/multiprocessing/pool.py in get(self, timeout)
    656         else:
--> 657             raise self._value
    658

FileNotFoundError: [Errno 2] No such file or directory: '/mnt/workspace/.cache/modelscope/damo/nlp_gpt3_text-generation_13B/model/mp_rank_00_model_states.pt'

During handling of the above exception, another exception occurred:

FileNotFoundError                         Traceback (most recent call last)
/tmp/ipykernel_457/4012221511.py in <module>
      4 input = '写一个python的快排代码'
      5 model_id = 'damo/nlp_gpt3_text-generation_13B'
----> 6 pipe = pipeline(Tasks.text_generation, model=model_id)
      7
      8 print(pipe(input))

/opt/conda/lib/python3.7/site-packages/modelscope/pipelines/builder.py in pipeline(task, model, preprocessor, config_file, pipeline_name, framework, device, model_revision, **kwargs)
    350         cfg.preprocessor = preprocessor
    351
--> 352     return build_pipeline(cfg, task_name=task)
    353
    354

/opt/conda/lib/python3.7/site-packages/modelscope/pipelines/builder.py in build_pipeline(cfg, task_name, default_args)
    268     """
    269     return build_from_cfg(
--> 270         cfg, PIPELINES, group_key=task_name, default_args=default_args)
    271
    272

/opt/conda/lib/python3.7/site-packages/modelscope/utils/registry.py in build_from_cfg(cfg, registry, group_key, default_args)
    213         except Exception as e:
    214             # Normal TypeError does not print class name.
--> 215             raise type(e)(f'{obj_cls.__name__}: {e}')

FileNotFoundError: DistributedGPT3Pipeline: [Errno 2] No such file or directory: '/mnt/workspace/.cache/modelscope/damo/nlp_gpt3_text-generation_13B/model/mp_rank_00_model_states.pt'

爱喝咖啡嘿 2022-12-22 12:48:20
1 answer
  • From the log, you are running our 13B model. Because of its large parameter count, that model is not yet open for download. You can try the 1.3B or 2.7B models first, or try the 13B model's generation quality in the demo on the right side of the model page; a minimal sketch of using a smaller model follows. (This answer was compiled from the DingTalk group "魔搭ModelScope开发者联盟群 ①".)
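    A minimal sketch of the same pipeline call from the traceback, but pointed at a smaller checkpoint. The 1.3B model ID below is an assumption; confirm the exact ID on the ModelScope model page.

        # Minimal sketch, not an official example: reuse the failing call from the
        # question, but load the downloadable 1.3B checkpoint instead of the 13B one.
        from modelscope.pipelines import pipeline
        from modelscope.utils.constant import Tasks

        model_id = 'damo/nlp_gpt3_text-generation_1.3B'  # assumed ID; check the model page
        pipe = pipeline(Tasks.text_generation, model=model_id)
        print(pipe('写一个python的快排代码'))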

    2022-12-22 15:35:31