贺辉0912/bigmodel · chatglm02.py (1.56 KB)
https://gitee.com/he-hui-0912/bigmodel.git
Committed by andy.he, 2024-03-02 17:18
from transformers import AutoTokenizer, AutoModel
import time
from langchain_community.llms import HuggingFacePipeline
from transformers import pipeline
from langchain.prompts import PromptTemplate
from langchain_community.llms import Tongyi
import ChatGLM
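# The commented-out blocks below are earlier experiments kept for reference:
# the Tongyi (DashScope) LLM, a local ChatGLM wrapper, and loading ChatGLM3-6B
# directly with AutoTokenizer/AutoModel.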
# import os
# os.environ["DASHSCOPE_API_KEY"] = "sk-cc1c8314fdbd43ceaf26ec1824d5dd3b"
# model = Tongyi()
# model= ChatGLM.ChatGLM_LLM()
# tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm3-6b", trust_remote_code=True)
# model = AutoModel.from_pretrained("THUDM/chatglm3-6b", trust_remote_code=True).half().cuda()
# model = model.eval()
pipeline = pipeline("text-generation",
model="THUDM/chatglm3-6b",
# device="cuda:0",
# trust_remote_code=True
)
print(pipeline("我今天考了"))
# model = HuggingFacePipeline(pipeline=pipeline)
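# The pipeline call also accepts standard generation keyword arguments; the
# values below are illustrative assumptions, not settings from this repo.
# print(pipe("我今天考了", max_new_tokens=64, do_sample=True, temperature=0.8))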
# template = """Question: {question}
# Answer: Let's think step by step."""
# prompt = PromptTemplate.from_template(template)
# chain = prompt | model
# question = "What is the result of 1+ 1?"
# print(chain.invoke({"question": question}))
# response, history = model.chat(tokenizer, "你好", history=[])
# print(response)
# response, history = model.chat(tokenizer, "晚上睡不着应该怎么办", history=history)
# print(response)
# Use range to generate the integers 0 through 9, i.e. 10 iterations
# for i in range(10):
#     # Put the code to run on each iteration here
#     a = time.time()
#     response = chain.invoke({"question": "我今天考了一百分"})
#     print(response)
#     print(time.time() - a)
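# A minimal sketch of how the commented-out LangChain pieces above could be
# wired together, assuming the `pipe` object created earlier and the imports
# already at the top of this file; it is an illustration, not the repo's code.
llm = HuggingFacePipeline(pipeline=pipe)
template = """Question: {question}
Answer: Let's think step by step."""
prompt = PromptTemplate.from_template(template)
chain = prompt | llm
# Time a single invocation, mirroring the commented-out timing loop above.
start = time.time()
print(chain.invoke({"question": "What is the result of 1 + 1?"}))
print(time.time() - start)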