From 4e9ea6232fcf21856b214d3a35292dffed043dbe Mon Sep 17 00:00:00 2001
From: takatost
Date: Fri, 12 Jan 2024 12:21:12 +0800
Subject: [PATCH] optimize ollama docs

---
 en/advanced/model-configuration/ollama.md  | 1 +
 zh_CN/guides/model-configuration/ollama.md | 1 +
 2 files changed, 2 insertions(+)

diff --git a/en/advanced/model-configuration/ollama.md b/en/advanced/model-configuration/ollama.md
index 20bdece..8f283a0 100644
--- a/en/advanced/model-configuration/ollama.md
+++ b/en/advanced/model-configuration/ollama.md
@@ -20,6 +20,7 @@ Dify supports integrating LLM and Text Embedding capabilities of large language
     ```

     After successful launch, Ollama starts an API service on local port 11434, which can be accessed at `http://localhost:11434`.
+    For other models, visit [Ollama Models](https://ollama.ai/library) for more details.

 3. Integrate Ollama in Dify

diff --git a/zh_CN/guides/model-configuration/ollama.md b/zh_CN/guides/model-configuration/ollama.md
index d153b68..0c531e7 100644
--- a/zh_CN/guides/model-configuration/ollama.md
+++ b/zh_CN/guides/model-configuration/ollama.md
@@ -20,6 +20,7 @@ Dify 支持接入 Ollama 部署的大型语言模型推理和 embedding 能力
     ```

     启动成功后,ollama 在本地 11434 端口启动了一个 API 服务,可通过 `http://localhost:11434` 访问。
+    其他模型可访问 [Ollama Models](https://ollama.ai/library) 了解详情。

 3. 在 Dify 中接入 Ollama