
ollama serve deployment with OpenAI API support

Installation

Official docs: Download Ollama on Linux

Commands:

# Official one-liner
curl -fsSL https://ollama.com/install.sh | sh

# In mainland China the download may fail, so a proxy is needed.
# First, save the script to a file:
curl -fsSL https://ollama.com/install.sh -o install.sh

# Edit install.sh and add the proxy to every curl invocation
# (a sed one-liner for this is sketched after this block), e.g.:
curl -x http://192.168.0.77:18808 xxxx

# Then run it:
chmod +x install.sh
./install.sh


# Note: on Ubuntu, installing via `apt install` does not give the latest
# version, and some commands do not work with it.
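Editing every curl call in install.sh by hand is tedious. A minimal sed sketch, assuming GNU sed and that each download in the script is written literally as "curl ...":

# Prefix every curl invocation in install.sh with the proxy.
sed -i 's|curl |curl -x http://192.168.0.77:18808 |g' install.sh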

Startup

base_dir=/data/work/chatglm3/
cd $base_dir

# Defines the YELLOW / NOCOLOR variables used in the echo lines below.
. colors

pythonPath=${base_dir}venv/bin/python
pythonActivity=${base_dir}venv/bin/activate

# Kill any ollama process that is already running.
jobName=ollama
TAILPID=$(ps aux | grep "$jobName" | grep -v grep | awk '{print $2}')
echo "${YELLOW}check $jobName pid $TAILPID ${NOCOLOR}"
[ "0$TAILPID" != "0" ] && sudo kill -9 $TAILPID


source ${pythonActivity}

export http_proxy=192.168.0.77:18808
export https_proxy=192.168.0.77:18808

# Bind to all interfaces on port 9191 (the default is 127.0.0.1:11434).
export OLLAMA_HOST=0.0.0.0:9191
export OLLAMA_DEBUG=1

cmdrun="ollama serve"

# OLLAMA_HOST=0.0.0.0:9191 OLLAMA_DEBUG=1  ollama serve

echo "${YELLOW}$cmdrun${NOCOLOR}"
nohup ${cmdrun} > logs/ollama.logs 2>&1 &

# ollama pull also goes through this port: if the port is changed,
# set OLLAMA_HOST to match before pulling again.
# cmdpull="OLLAMA_HOST=0.0.0.0:9191 OLLAMA_DEBUG=1 ollama pull qwen2.5:7b-instruct-fp16"
# echo "${YELLOW}$cmdpull${NOCOLOR}"
# nohup ${cmdpull}

# export http_proxy=http://192.168.0.77:18808 && export https_proxy=http://192.168.0.77:18808
# export HTTP_PROXY=http://192.168.0.77:18808 && export HTTPS_PROXY=http://192.168.0.77:18808
# unset http_proxy && unset https_proxy
# unset HTTP_PROXY && unset HTTPS_PROXY
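Once the script has run, a quick way to confirm the server is actually up, assuming the 9191 port configured above:

# The native API lists local models; an HTTP 200 response means the
# server is alive even if no models have been pulled yet.
curl http://127.0.0.1:9191/api/tags

# Follow the log written by the nohup line in the script.
tail -f logs/ollama.logs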

Notes

# Note: every ollama client command is strongly tied to OLLAMA_HOST

# Default port is 11434
ollama list

# With the port changed to 9191
OLLAMA_HOST=0.0.0.0:9191 ollama list
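To avoid prefixing every command, the variable can be exported once in the current shell (or added to ~/.bashrc); a minimal sketch, assuming the same 9191 port as above:

# All subsequent ollama client commands in this shell target port 9191.
export OLLAMA_HOST=0.0.0.0:9191
ollama list
ollama pull qwen2.5:7b-instruct-fp16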

Once the server is started, it exposes an OpenAI-compatible API by default:

https://github.com/ollama/ollama/blob/main/docs/openai.md
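For example, the chat completions endpoint can be called with plain curl; a sketch assuming the 9191 port above and that qwen2.5:7b-instruct-fp16 has already been pulled:

curl http://127.0.0.1:9191/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "qwen2.5:7b-instruct-fp16",
        "messages": [{"role": "user", "content": "Hello"}]
      }'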

References:

https://juejin.cn/post/7340197367515414538

Postman collection for the Ollama API:

https://www.postman.com/postman-student-programs/ollama-api/collection/suc47x8/ollama-rest-api


Original post: https://blog.csdn.net/linzhiji/article/details/145207435
