optimize ollama docs

commit 3f6e2e5019 (parent 63149a522f)
@@ -31,6 +31,7 @@ Dify supports integrating LLM and Text Embedding capabilities of large language

- Model Name: `llava`

- Base URL: `http://<your-ollama-endpoint-domain>:11434`

  Enter the base URL where the Ollama service is accessible.

  If Dify is deployed using Docker, consider using the local network IP address, e.g., `http://192.168.1.100:11434`, or the Docker host machine's IP address, e.g., `http://172.17.0.1:11434`.

  For a local source code deployment, use `http://localhost:11434`.
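The base-URL guidance above can be sketched as a small helper. This is an illustration only, not part of Dify: the function name is hypothetical, and `192.168.1.100` stands in for your machine's actual LAN IP.

```python
def ollama_base_url(deployment: str, host_ip: str = "192.168.1.100") -> str:
    """Pick an Ollama base URL for a given Dify deployment mode.

    `host_ip` is a placeholder for your LAN or Docker-host IP.
    """
    if deployment == "docker":
        # Inside a Docker-deployed Dify, "localhost" resolves to the
        # container itself, not the host running Ollama, so a LAN IP
        # or the Docker host IP is needed.
        return f"http://{host_ip}:11434"
    # Local source code deployment: Ollama runs on the same host.
    return "http://localhost:11434"

print(ollama_base_url("docker"))  # → http://192.168.1.100:11434
```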
@@ -38,12 +39,15 @@ Dify supports integrating LLM and Text Embedding capabilities of large language

- Model Type: Chat

- Model Context Length: 4096

  The maximum context length of the model. If unsure, use the default value of 4096.

- Maximum Token Limit: 4096

  The maximum number of tokens the model can return. If the model has no specific requirement, this can match the model context length.
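The relationship between the two limits above can be sketched as follows. The helper name is hypothetical; the point is simply that the output limit should not exceed the context window, which is why the doc suggests keeping the two values consistent.

```python
def effective_max_tokens(requested: int, context_length: int = 4096) -> int:
    # A model cannot emit more tokens than its context window holds,
    # so cap the requested output limit at the context length.
    return min(requested, context_length)

print(effective_max_tokens(8192))  # → 4096
```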
- Support for Vision: Yes

  Check this option if the model supports image understanding (multimodal), like `llava`.

Click "Save"; once the settings pass validation, the model can be used in the application.
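With these settings saved, Dify talks to Ollama over its HTTP API. As a rough illustration (not Dify's actual client code), a multimodal chat request to a vision model like `llava` has the shape below; `/api/chat` and the per-message `images` field come from Ollama's API, while the base64 string is a placeholder.

```python
import json

base_url = "http://localhost:11434"  # the Base URL configured above

# Ollama's /api/chat endpoint; vision models such as llava accept
# base64-encoded images on each message via the "images" field.
payload = {
    "model": "llava",
    "messages": [
        {
            "role": "user",
            "content": "What is in this picture?",
            "images": ["<base64-encoded image data>"],  # placeholder
        }
    ],
    "stream": False,
}

request_url = f"{base_url}/api/chat"
print(request_url)
print(json.dumps(payload)[:40])
```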
@@ -31,6 +31,7 @@ Dify supports integrating the LLM inference and embedding capabilities of models deployed with Ollama

- Model Name: `llava`

- Base URL: `http://<your-ollama-endpoint-domain>:11434`

  Enter the base URL where the Ollama service is accessible.

  If Dify is deployed using Docker, consider using the local network IP address, e.g., `http://192.168.1.100:11434`, or the Docker host machine's IP address, e.g., `http://172.17.0.1:11434`.

  For a local source code deployment, use `http://localhost:11434`.
@@ -38,12 +39,15 @@ Dify supports integrating the LLM inference and embedding capabilities of models deployed with Ollama

- Model Type: Chat

- Model Context Length: 4096

  The maximum context length of the model. If unsure, use the default value of 4096.

- Maximum Token Limit: 4096

  The maximum number of tokens the model can return. If the model has no specific requirement, this can match the model context length.

- Support for Vision: Yes

  Check this option if the model supports image understanding (multimodal), like `llava`.

Click "Save"; once the settings pass validation, the model can be used in the application.