From c19df903d430c9b45caff4e2b04b3aaeef30019e Mon Sep 17 00:00:00 2001
From: crazywoola <100913391+crazywoola@users.noreply.github.com>
Date: Wed, 20 Mar 2024 18:13:18 +0800
Subject: [PATCH] refine docs (#44)

---
 en/tutorials/model-configuration/ollama.md | 21 +++++++++++++++++----
 zh_CN/guides/model-configuration/ollama.md | 22 ++++++++++++++++++++--
 2 files changed, 37 insertions(+), 6 deletions(-)

diff --git a/en/tutorials/model-configuration/ollama.md b/en/tutorials/model-configuration/ollama.md
index 58a7dcd..06a0d48 100644
--- a/en/tutorials/model-configuration/ollama.md
+++ b/en/tutorials/model-configuration/ollama.md
@@ -63,11 +63,17 @@ Dify supports integrating LLM and Text Embedding capabilities of large language
 
 Enter `Prompt Eng.` page of the App that needs to be configured, select the `llava` model under the Ollama provider, and use it after configuring the model parameters.
 
-For more information on Ollama, please refer to: [https://github.com/jmorganca/ollama](https://github.com/jmorganca/ollama)
-
 ## FAQ
 
-For more information on Ollama, please refer to: [Ollama FAQ](https://github.com/ollama/ollama/blob/main/docs/faq.md)
+### ⚠️ If you are using Docker to deploy Dify and Ollama, you may encounter the following error:
+
+```
+HTTPConnectionPool(host='127.0.0.1', port=11434): Max retries exceeded with url: /api/chat (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x...>: Failed to establish a new connection: [Errno 111] Connection refused'))
+
+HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/chat (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x...>: Failed to establish a new connection: [Errno 111] Connection refused'))
+```
+
+This error occurs because the Ollama service is not accessible from the Docker container. Inside a container, `localhost` usually refers to the container itself, not the host machine or other containers. To resolve this issue, you need to expose the Ollama service to the network.
 
 ### Setting environment variables on Mac
 
@@ -120,4 +126,11 @@ On windows, Ollama inherits your user and system environment variables.
 
 ## How can I expose Ollama on my network?
 
-Ollama binds 127.0.0.1 port 11434 by default. Change the bind address with the `OLLAMA_HOST` environment variable.
\ No newline at end of file
+Ollama binds 127.0.0.1 port 11434 by default. Change the bind address with the `OLLAMA_HOST` environment variable.
+
+## More Information
+
+For more information on Ollama, please refer to:
+
+- [Ollama](https://github.com/jmorganca/ollama)
+- [Ollama FAQ](https://github.com/ollama/ollama/blob/main/docs/faq.md)
diff --git a/zh_CN/guides/model-configuration/ollama.md b/zh_CN/guides/model-configuration/ollama.md
index ec28011..81c9af8 100644
--- a/zh_CN/guides/model-configuration/ollama.md
+++ b/zh_CN/guides/model-configuration/ollama.md
@@ -63,7 +63,17 @@ Dify 支持接入 Ollama 部署的大型语言模型推理和 embedding 能力
 
 进入需要配置的 App 提示词编排页面,选择 Ollama 供应商下的 `llava` 模型,配置模型参数后即可使用。
 
-如需获取 Ollama 更多信息,请参考:[https://github.com/jmorganca/ollama](https://github.com/jmorganca/ollama)
+## FAQ
+
+### ⚠️ 如果使用 Docker 部署 Dify 和 Ollama,可能会遇到以下错误:
+
+```
+HTTPConnectionPool(host='127.0.0.1', port=11434): Max retries exceeded with url: /api/chat (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x...>: Failed to establish a new connection: [Errno 111] Connection refused'))
+
+HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/chat (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x...>: Failed to establish a new connection: [Errno 111] Connection refused'))
+```
+
+这个错误是因为 Docker 容器无法访问 Ollama 服务。在容器内部,`localhost` 通常指的是容器本身,而不是主机或其他容器。要解决此问题,您需要将 Ollama 服务暴露给网络。
 
 ### 在Linux上设置环境变量
 
@@ -103,4 +113,12 @@ Dify 支持接入 Ollama 部署的大型语言模型推理和 embedding 能力
 
 ### 如何在我的网络上暴露Ollama?
 
-Ollama默认绑定127.0.0.1端口11434。通过`OLLAMA_HOST`环境变量更改绑定地址。
\ No newline at end of file
+Ollama默认绑定127.0.0.1端口11434。通过`OLLAMA_HOST`环境变量更改绑定地址。
+
+## 更多信息
+
+如需获取 Ollama 更多信息,请参考:
+
+- [Ollama](https://github.com/jmorganca/ollama)
+- [Ollama FAQ](https://github.com/ollama/ollama/blob/main/docs/faq.md)
+
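
The fix the patched FAQ describes can be sketched as a few commands. This is a minimal sketch, not part of the patch itself: the `systemctl edit` and `launchctl setenv` steps follow Ollama's own FAQ, `host.docker.internal` assumes Docker Desktop (on plain Linux Docker you would add `--add-host=host.docker.internal:host-gateway` or use the host's LAN IP), and the example IP is hypothetical.

```shell
# Linux (systemd-managed Ollama): bind to all interfaces instead of 127.0.0.1.
sudo systemctl edit ollama.service
# In the override file that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl daemon-reload
sudo systemctl restart ollama

# macOS: set the variable for the launchd-managed Ollama app, then restart it.
launchctl setenv OLLAMA_HOST "0.0.0.0"

# Then, in Dify's Ollama provider settings, use a base URL reachable from
# inside the Dify container rather than localhost, e.g.:
#   http://host.docker.internal:11434   (Docker Desktop)
#   http://192.168.1.100:11434          (hypothetical host LAN IP)
```

After restarting Ollama, verify reachability from inside the Dify container (e.g. `curl http://host.docker.internal:11434`) before saving the provider settings.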