
A problem encountered while using Ollama

Question:

ImportError: cannot import name 'Ollama' from 'llama_index.llms' (unknown location) - installing dependencies does not solve the problem

Python cannot import a class or function named Ollama from the llama_index.llms module.

Background:

I want to learn LLMs. I run Ollama with the following Docker Compose file - it's running:


services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - 11434:11434
    volumes:
      - ollama_data:/root/.ollama
    healthcheck:
      test: ollama list || exit 1
      interval: 10s
      timeout: 30s
      retries: 5
      start_period: 10s
  ollama-models-pull:
    image: curlimages/curl:8.6.0
    command: >-
      http://ollama:11434/api/pull -d '{"name": "mistral"}'
    depends_on:
      ollama:
        condition: service_healthy
volumes:
  ollama_data:

I would like to write a Python app that uses Ollama, and I found this piece of code:


from llama_index.llms import Ollama, ChatMessage

llm = Ollama(model="mistral", base_url="http://127.0.0.1:11434")

messages = [
    ChatMessage(
        role="system", content="you are a multi lingual assistant used for translation and your job is to translate nothing more than that."
    ),
    ChatMessage(
        role="user", content="please translate message in triple tick to french ``` What is standard deviation?```"
    )
]
resp = llm.chat(messages=messages)
print(resp)

I installed all the dependencies:

python3 -m venv venv
source venv/bin/activate
pip install llama-index  
pip install llama-index-llms-ollama
pip install ollama-python

However, when I run the app, I get:

Traceback (most recent call last):
  File "/home/user/test.py", line 1, in <module>
    from llama_index.llms import Ollama, ChatMessage
ImportError: cannot import name 'Ollama' from 'llama_index.llms' (unknown location)

Where could the problem be?

Solution:

In llama-index 0.10 and later, the library was split into separate pip packages (llama-index-core plus one package per integration, such as llama-index-llms-ollama), so Ollama is no longer exported from llama_index.llms. The correct way to import Ollama is:

from llama_index.llms.ollama import Ollama

For ChatMessage it should be:

from llama_index.core.llms import ChatMessage
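Putting the two corrected imports together, the original script becomes the sketch below. It assumes llama-index >= 0.10 with llama-index-llms-ollama installed, and an Ollama server reachable at 127.0.0.1:11434 (for example the Docker Compose stack above) that has already pulled the mistral model:

```python
from llama_index.core.llms import ChatMessage
from llama_index.llms.ollama import Ollama

# Assumes the Compose stack above is up and the "mistral" model is pulled.
llm = Ollama(model="mistral", base_url="http://127.0.0.1:11434")

messages = [
    ChatMessage(
        role="system",
        content="you are a multi lingual assistant used for translation and your job is to translate nothing more than that.",
    ),
    ChatMessage(
        role="user",
        content="please translate message in triple tick to french ``` What is standard deviation?```",
    ),
]
resp = llm.chat(messages=messages)
print(resp)
```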
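As an aside, the "(unknown location)" wording in the traceback is worth decoding: Python did find a module at llama_index.llms, but that module no longer defines the name Ollama. The same failure mode can be reproduced with nothing but the standard library, using a hypothetical fake_pkg module:

```python
import sys
import types

# Register an empty in-memory module, then try to import a name it lacks.
fake_pkg = types.ModuleType("fake_pkg")
sys.modules["fake_pkg"] = fake_pkg

try:
    from fake_pkg import Ollama  # fails: the module exists, the name does not
except ImportError as exc:
    # e.g. "cannot import name 'Ollama' from 'fake_pkg' (unknown location)"
    print(exc)
```

So the error is not about a missing package on disk; it is about the package no longer exposing that name, which is exactly what the 0.10 repackaging of llama-index caused.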


Original source: https://blog.csdn.net/suiusoar/article/details/140163195
