Loading and Merging a LoRA Model


1. Import the required packages

from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

2. Load the base model

model = AutoModelForCausalLM.from_pretrained("Langboat/bloom-1b4-zh")
tokenizer = AutoTokenizer.from_pretrained("Langboat/bloom-1b4-zh")

3. Load the LoRA adapter

p_model = PeftModel.from_pretrained(model, model_id="./chatbot/checkpoint-500/")

4. Run inference

# Prompt asks (in Chinese): "What are some exam-taking tips?"
ipt = tokenizer("Human: {}\n{}".format("考试有哪些技巧?", "").strip() + "\n\nAssistant: ", return_tensors="pt")
tokenizer.decode(p_model.generate(**ipt, max_new_tokens=256, do_sample=False)[0], skip_special_tokens=True)
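The inline prompt template above can be factored into a small helper so it stays consistent between training and inference. This is just a sketch; `build_prompt` is a hypothetical name, not part of the original code:

```python
def build_prompt(instruction: str, user_input: str = "") -> str:
    # Reproduces the "Human: ...\n\nAssistant: " template used above
    return "Human: {}\n{}".format(instruction, user_input).strip() + "\n\nAssistant: "

prompt = build_prompt("考试有哪些技巧?")
# prompt == "Human: 考试有哪些技巧?\n\nAssistant: "
```

The tokenizer call then becomes `tokenizer(build_prompt(...), return_tensors="pt")`.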

5. Merge the adapter into the base model

merge_model = p_model.merge_and_unload()
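What `merge_and_unload` does is fold the low-rank update into the frozen weight: for each adapted layer it computes W' = W + (alpha/r)·B·A, so the merged model needs no PEFT wrapper at inference time. A minimal NumPy sketch of that equivalence (toy shapes, not the actual PEFT internals):

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, r, alpha = 8, 8, 2, 16      # toy dims; alpha/r is the LoRA scaling
scaling = alpha / r

W = rng.standard_normal((d, k))   # frozen base weight
A = rng.standard_normal((r, k))   # LoRA down-projection
B = rng.standard_normal((d, r))   # LoRA up-projection (pretend it was trained)
x = rng.standard_normal(k)

# Adapter applied on the fly (what PeftModel does at inference time)
y_adapter = W @ x + scaling * (B @ (A @ x))

# Weights folded together (what merge_and_unload produces)
W_merged = W + scaling * (B @ A)
y_merged = W_merged @ x

assert np.allclose(y_adapter, y_merged)
```

Because the outputs are identical, the inference result below should match step 4 exactly (both use greedy decoding).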

Inference with the merged model

ipt = tokenizer("Human: {}\n{}".format("考试有哪些技巧?", "").strip() + "\n\nAssistant: ", return_tensors="pt")
tokenizer.decode(merge_model.generate(**ipt, max_new_tokens=256, do_sample=False)[0], skip_special_tokens=True)

6. Save the merged model

merge_model.save_pretrained("./chatbot/merge_model")
tokenizer.save_pretrained("./chatbot/merge_model")  # save the tokenizer too, so the directory can be reloaded standalone