How do I set up a web API for a ModelScope model? Calling the model from Flask raises an error, but running it directly works fine.
In theory Flask shouldn't fail either; I previously wrote a working case with the Baichuan model:

import torch
from flask import Flask, request
from modelscope import snapshot_download, Model

app = Flask(__name__)

model_dir = snapshot_download("baichuan-inc/Baichuan-13B-Chat", revision='v1.0.3')
model = Model.from_pretrained(model_dir, device_map="balanced",
                              trust_remote_code=True, torch_dtype=torch.float16)

@app.route("/")
def hello_world():
    return "Hello, world!"

@app.route("/ask", methods=['POST', 'GET'])
def inference():
    question = request.args["question"]
    messages = []
    messages.append({"role": "user", "content": question})
    response = model(messages)
    print(response)
    return "%s" % response

app.run("0.0.0.0", 8080, True)

For reference. This answer was compiled from the DingTalk group "ModelScope Developer Alliance Group ①".