optimize ollama docs
parent 4e9ea6232f
commit 3dd7c1ac80
@@ -34,7 +34,9 @@ Dify supports integrating LLM and Text Embedding capabilities of large language
- Base URL: `http://<your-ollama-endpoint-domain>:11434`

  Enter the base URL where the Ollama service is accessible.

  If Dify is deployed using docker, consider using the local network IP address, e.g., `http://192.168.1.100:11434`, or the docker host machine IP address, e.g., `http://172.17.0.1:11434`.

  For local source code deployment, use `http://localhost:11434`.

- Model Type: Chat
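The deployment-dependent Base URL choice described above can be sketched as a small helper. This is only an illustration of the rule the docs state; the function name, parameter, and default host IP are assumptions, not part of Dify or Ollama.

```python
# Illustrative helper (not part of Dify or Ollama): maps a Dify deployment
# mode to the Ollama base URL the docs suggest.

def ollama_base_url(deployment: str, host_ip: str = "172.17.0.1") -> str:
    """Return the suggested Ollama base URL for a Dify deployment mode."""
    if deployment == "local":
        # Dify run from source on the same machine as Ollama
        return "http://localhost:11434"
    if deployment == "docker":
        # Dify in docker: use a LAN IP or the docker host machine IP
        return f"http://{host_ip}:11434"
    raise ValueError(f"unknown deployment mode: {deployment}")

print(ollama_base_url("local"))                          # → http://localhost:11434
print(ollama_base_url("docker", "192.168.1.100"))        # → http://192.168.1.100:11434
```

To verify the chosen URL is reachable, a plain GET to the Ollama root endpoint (e.g. `curl http://localhost:11434`) responds with `Ollama is running` when the service is up.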
@@ -34,7 +34,9 @@ Dify supports integrating the LLM inference and embedding capabilities of models deployed with Ollama
- Base URL: `http://<your-ollama-endpoint-domain>:11434`

  Enter the address where the Ollama service is accessible.

  If Dify is deployed with docker, use a local network IP address, e.g., `http://192.168.1.100:11434`, or the docker host machine IP address, e.g., `http://172.17.0.1:11434`.

  For local source code deployment, use `http://localhost:11434`.

- Model Type: Chat