Integrating a Python Project with Ollama

Today we'll look at the simplest way to integrate a Python project with Ollama: the official Ollama Python library provides exactly this. Let's dive in.
Prerequisites

Install the Ollama Python library (note that this installs the Python client; Ollama itself must also be installed and running locally):

```shell
pip install ollama
```

Python environment: Python 3.8+
Usage

Here is a simple demo:

```python
import ollama

response = ollama.chat(model='llama2', messages=[
    {
        'role': 'user',
        'content': 'Why is the sky blue?',
    },
])
print(response['message']['content'])
```
Output
The sky appears blue due to Rayleigh scattering. Rayleigh scattering is the scattering of light by particles of a shorter wavelength, such as blue light. This means that blue light is scattered more than other colors, which are scattered equally in all directions by larger particles. The blue light reaches our eyes from all directions in the sky, making it appear blue.
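Each call to ollama.chat is stateless: the model only sees what you pass in messages, so a multi-turn conversation means appending the assistant's reply and the next user turn back into the list yourself. A minimal sketch of that bookkeeping, using a stand-in reply instead of a live call (the add_turn helper is ours, not part of the library):

```python
# Sketch: maintaining conversation history for ollama.chat.
# In real use, assistant_reply would come from
# ollama.chat(model=..., messages=history)['message'].

def add_turn(history, role, content):
    """Append one message in the format ollama.chat expects."""
    history.append({'role': role, 'content': content})
    return history

history = []
add_turn(history, 'user', 'Why is the sky blue?')

# Stand-in for the model's answer (a real call needs a running server).
assistant_reply = {'role': 'assistant', 'content': 'Rayleigh scattering.'}
history.append(assistant_reply)

# The next user question now carries the full context.
add_turn(history, 'user', 'Explain it to a five-year-old.')
print(len(history))  # → 3
```

Passing the whole history on every call is what gives the model "memory" of earlier turns.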
Streaming responses

Streaming can be enabled by setting stream=True. This changes the call to return a Python generator, where each part is one object in the stream.

```python
from ollama import chat

messages = [
    {
        'role': 'user',
        'content': 'hello',
    },
]

for part in chat('gemma:2b', messages=messages, stream=True):
    print(part['message']['content'], end='', flush=True)

# end with a newline
print()
```
Output
Hello! 👋 It's great to hear from you. What can I do for you today?
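Streaming prints the reply chunk by chunk, but you often also want the complete text afterwards. A sketch of accumulating the chunks, using a list of fake parts shaped like the dicts the stream yields, instead of a live call:

```python
# Sketch: collecting streamed chunks into the complete reply.
# fake_stream mimics the dicts yielded by chat(..., stream=True);
# a real run would iterate the generator returned by chat().
fake_stream = [
    {'message': {'role': 'assistant', 'content': 'Hello! '}},
    {'message': {'role': 'assistant', 'content': 'What can I '}},
    {'message': {'role': 'assistant', 'content': 'do for you?'}},
]

chunks = []
for part in fake_stream:
    text = part['message']['content']
    print(text, end='', flush=True)  # live display, chunk by chunk
    chunks.append(text)              # keep each piece for later
print()

full_reply = ''.join(chunks)
print(full_reply)  # → Hello! What can I do for you?
```

The same loop body works unchanged against the real generator; only the source of the parts differs.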
Chat

The non-streaming chat call works the same way with any locally pulled model; here with gemma:2b:
```python
from ollama import chat

messages = [
    {
        'role': 'user',
        'content': 'Why is the sky blue?',
    },
]

response = chat('gemma:2b', messages=messages)
print(response['message']['content'])
```
Errors
An error is raised if the request returns an error status, or if an error is detected during streaming.
```python
import ollama

model = 'does-not-yet-exist'

try:
    ollama.chat(model)
except ollama.ResponseError as e:
    print('Error:', e.error)
    if e.status_code == 404:
        ollama.pull(model)
```