Many Hugging Face large language models ship as PyTorch (.bin) weights, but MindIE requires the safetensors format. In addition, MindIE Service failed to load the original Baichuan2-13B model; the root cause turned out to be the bfloat16 data type, so this conversion also casts the weights to float16 in one pass.
Here is the code:
import argparse
import os
import torch


def parse_arguments():
    parser = argparse.ArgumentParser()
    parser.add_argument('--model_path',
                        help="model and tokenizer path",
                        default='/data/acltransformer_testdata/weights/llama2/llama-2-70b',
                        )
    return parser.parse_args()


def convert_bin2st_from_pretrained(model_path):
    from transformers import AutoModelForCausalLM
    model = AutoModelForCausalLM.from_pretrained(
        pretrained_model_name_or_path=model_path,
        low_cpu_mem_usage=True,
        trust_remote_code=True,
        torch_dtype=torch.float16)  # cast the weights to float16 here
    # safe_serialization=True saves the model in safetensors format
    model.save_pretrained(model_path, safe_serialization=True)


if __name__ == '__main__':
    args = parse_arguments()
    print(f"convert {args.model_path} into safetensor")
    convert_bin2st_from_pretrained(args.model_path)
Usage:
python convert.py --model_path /home/model/Baichuan2-13B-Chat/
After it finishes:
ls /home/model/Baichuan2-13B-Chat/ -lh
total 52G
-rw-r--r-- 1 root root 252K May 17 14:25 'Baichuan2 '$'\346\250\241\345\236\213\347\244\276\345\214\272\350\256\270\345\217\257\345\215\217\350\256\256''.pdf'
-rw-r--r-- 1 root root 199K May 17 14:25 'Community License for Baichuan2 Model.pdf'
-rw-r--r-- 1 root root 18K May 17 14:25 README.md
-rw-r--r-- 1 root root 797 Jun 24 10:37 config.json
-rw-r--r-- 1 root root 1.6K May 17 14:25 configuration_baichuan.py
-rw-r--r-- 1 root root 302 Jun 24 10:37 generation_config.json
-rw-r--r-- 1 root root 3.0K May 17 14:25 generation_utils.py
-rw-r--r-- 1 root root 1.1K May 17 14:25 handler.py
-rw-r----- 1 root root 9.3G Jun 24 10:38 model-00001-of-00003.safetensors
-rw-r----- 1 root root 9.3G Jun 24 10:38 model-00002-of-00003.safetensors
-rw-r----- 1 root root 7.4G Jun 24 10:38 model-00003-of-00003.safetensors
-rw-r----- 1 root root 23K Jun 24 10:38 model.safetensors.index.json
-rw-r--r-- 1 root root 33K May 17 14:25 modeling_baichuan.py
-rw-r--r-- 1 root root 9.3G May 17 14:28 pytorch_model-00001-of-00003.bin
-rw-r--r-- 1 root root 9.3G May 17 14:31 pytorch_model-00002-of-00003.bin
-rw-r--r-- 1 root root 7.4G May 17 14:33 pytorch_model-00003-of-00003.bin
-rw-r--r-- 1 root root 24K May 17 14:25 pytorch_model.bin.index.json
-rw-r--r-- 1 root root 9.2K May 17 14:25 quantizer.py
drwxr-xr-x 2 root root 4.0K Jun 24 10:25 safetensor_fp16
-rw-r--r-- 1 root root 574 May 17 14:25 special_tokens_map.json
-rw-r--r-- 1 root root 8.9K May 17 14:25 tokenization_baichuan.py
-rw-r--r-- 1 root root 2.0M May 17 14:25 tokenizer.model
-rw-r--r-- 1 root root 954 May 17 14:25 tokenizer_config.json
You can see the new .safetensors model shards; the original .bin files are still there and were not deleted.
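To double-check that the new shards really contain float16 weights, here is a minimal sketch using the safetensors library; the shard path below is just the first file from the listing above:

from safetensors import safe_open

# Open the first shard and inspect the dtype of a few tensors.
path = "/home/model/Baichuan2-13B-Chat/model-00001-of-00003.safetensors"
with safe_open(path, framework="pt", device="cpu") as f:
    for key in list(f.keys())[:3]:
        print(key, f.get_tensor(key).dtype)  # expect torch.float16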
Looking at the config.json in the model directory, torch_dtype has changed to float16; before the conversion it was bfloat16.
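A quick way to read that field, assuming the same model directory as above:

import json

with open("/home/model/Baichuan2-13B-Chat/config.json") as f:
    cfg = json.load(f)
print(cfg["torch_dtype"])  # prints "float16" after the conversion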