update doc

pull/39/head^2
crazywoola 2024-02-23 20:01:03 +08:00
parent 4732ff16ca
commit cfe3a162c3
2 changed files with 101 additions and 1 deletions


@@ -64,3 +64,60 @@ Dify supports integrating LLM and Text Embedding capabilities of large language
Go to the `Prompt Eng.` page of the App that needs to be configured, select the `llava` model under the Ollama provider, configure the model parameters, and then use it.
For more information on Ollama, please refer to: [https://github.com/jmorganca/ollama](https://github.com/jmorganca/ollama)
## FAQ
For answers to common questions, see the [Ollama FAQ](https://github.com/ollama/ollama/blob/main/docs/faq.md).
### Setting environment variables on Mac
If Ollama is run as a macOS application, environment variables should be set using `launchctl`:
1. For each environment variable, call `launchctl setenv`.
```bash
launchctl setenv OLLAMA_HOST "0.0.0.0"
```
2. Restart the Ollama application.
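These steps can also be scripted from a terminal. A minimal sketch, assuming a standard install where the menu-bar app bundle is named `Ollama` (the `osascript` and `open` lines are assumptions about that setup):

```bash
# Set the variable for the current login session
launchctl setenv OLLAMA_HOST "0.0.0.0"
# Confirm it took effect (prints the value just set)
launchctl getenv OLLAMA_HOST
# Quit and relaunch the app so it picks up the new value
osascript -e 'quit app "Ollama"'
open -a Ollama
```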
### Setting environment variables on Linux
If Ollama is run as a systemd service, environment variables should be set using `systemctl`:
1. Edit the systemd service by calling `systemctl edit ollama.service`. This will open an editor.
2. For each environment variable, add an `Environment` line under the `[Service]` section:
```ini
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
```
3. Save and exit.
4. Reload `systemd` and restart Ollama:
```bash
systemctl daemon-reload
systemctl restart ollama
```
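The interactive `systemctl edit` step can also be done non-interactively by writing the drop-in override file directly (this uses the standard systemd override path; a sketch, run as root):

```bash
# Equivalent of `systemctl edit ollama.service`: create a drop-in override
mkdir -p /etc/systemd/system/ollama.service.d
cat > /etc/systemd/system/ollama.service.d/override.conf <<'EOF'
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
EOF
# Reload unit files and restart the service
systemctl daemon-reload
systemctl restart ollama
```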
### Setting environment variables on Windows
On Windows, Ollama inherits your user and system environment variables.
1. First, quit Ollama by clicking its icon in the taskbar.
2. Open the system environment variables editor from the Control Panel.
3. Edit or create new variable(s) for your user account, such as `OLLAMA_HOST` and `OLLAMA_MODELS`.
4. Click OK/Apply to save.
5. Run `ollama` from a new terminal window.
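The same change can be made from a PowerShell prompt instead of the Control Panel; `setx` persists a per-user variable (shown for `OLLAMA_HOST` as an example; only processes started afterwards see the new value, so restart Ollama):

```powershell
# Persist OLLAMA_HOST for the current user; affects newly started processes only
setx OLLAMA_HOST "0.0.0.0"
```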
### How can I expose Ollama on my network?
Ollama binds to 127.0.0.1 on port 11434 by default. Change the bind address with the `OLLAMA_HOST` environment variable.
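As a small sketch of how the default resolves (the fallback string below simply restates the default above; `your-lan-ip` is a placeholder for the server's address):

```bash
# OLLAMA_HOST overrides the bind address; when unset, Ollama uses 127.0.0.1:11434
addr="${OLLAMA_HOST:-127.0.0.1:11434}"
echo "Ollama will listen on ${addr}"

# To accept connections from other machines for a one-off run:
#   OLLAMA_HOST=0.0.0.0 ollama serve
# then reach it from the LAN at http://your-lan-ip:11434
```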

@@ -63,4 +63,47 @@ Dify supports integrating the LLM inference and embedding capabilities of models deployed with Ollama
Go to the `Prompt Eng.` page of the App that needs to be configured, select the `llava` model under the Ollama provider, configure the model parameters, and then use it.
For more information on Ollama, please refer to: [https://github.com/jmorganca/ollama](https://github.com/jmorganca/ollama)
### Setting environment variables on Linux
If Ollama is run as a systemd service, environment variables should be set using `systemctl`:
1. Edit the systemd service by calling `systemctl edit ollama.service`. This will open an editor.
2. For each environment variable, add an `Environment` line under the `[Service]` section:
```ini
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
```
3. Save and exit.
4. Reload `systemd` and restart Ollama:
```bash
systemctl daemon-reload
systemctl restart ollama
```
### Setting environment variables on Windows
On Windows, Ollama inherits your user and system environment variables.
1. First, quit Ollama by clicking its icon in the taskbar.
2. Open the system environment variables editor from the Control Panel.
3. Edit or create new variable(s) for your user account, such as `OLLAMA_HOST` and `OLLAMA_MODELS`.
4. Click OK/Apply to save.
5. Run `ollama` from a new terminal window.
### How can I expose Ollama on my network?
Ollama binds to 127.0.0.1 on port 11434 by default. Change the bind address with the `OLLAMA_HOST` environment variable.