GITBOOK-67: No subject

parent 84185f413a
commit 529e7c3e26
@@ -17,14 +17,14 @@ In knowledge base retrieval scenarios, the query variable typically represents t

 **Choosing the Knowledge Base for Query**

-Within the knowledge base retrieval node, you can add an existing knowledge base from Dify. For instructions on creating a knowledge base within Dify, please refer to the knowledge base [help documentation](https://docs.dify.ai/v/zh-hans/guides/knowledge-base).
+Within the knowledge base retrieval node, you can add an existing knowledge base from Dify. For instructions on creating a knowledge base within Dify, please refer to the knowledge base [help documentation](https://docs.dify.ai/features/datasets).

 **Configuring the Retrieval Strategy**

-It's possible to modify the indexing strategy and retrieval mode for an individual knowledge base within the node. For a detailed explanation of these settings, refer to the knowledge base [help documentation](https://docs.dify.ai/v/zh-hans/learn-more/extended-reading/retrieval-augment/hybrid-search).
+It's possible to modify the indexing strategy and retrieval mode for an individual knowledge base within the node. For a detailed explanation of these settings, refer to the knowledge base [help documentation](https://docs.dify.ai/features/retrieval-augment/hybrid-search).

 <figure><img src="../../../.gitbook/assets/image (49).png" alt=""><figcaption></figcaption></figure>

-Dify offers two recall strategies for different knowledge base retrieval scenarios: "N-choose-1 Recall" and "Multi-way Recall". In the N-choose-1 mode, knowledge base queries are executed through function calling, requiring the selection of a system reasoning model. In the multi-way recall mode, a Rerank model needs to be configured for result re-ranking. For a detailed explanation of these two recall strategies, refer to the retrieval mode explanation in the [help documentation](https://docs.dify.ai/v/zh-hans/learn-more/extended-reading/retrieval-augment/retrieval).
+Dify offers two recall strategies for different knowledge base retrieval scenarios: "N-choose-1 Recall" and "Multi-way Recall". In the N-choose-1 mode, knowledge base queries are executed through function calling, requiring the selection of a system reasoning model. In the multi-way recall mode, a Rerank model needs to be configured for result re-ranking. For a detailed explanation of these two recall strategies, refer to the retrieval mode explanation in the [help documentation](https://docs.dify.ai/features/retrieval-augment/retrieval).

 <figure><img src="../../../.gitbook/assets/image (51).png" alt=""><figcaption></figcaption></figure>
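The two recall strategies described in this hunk can be sketched roughly as below. All names here (`choose_kb`, `rerank`, the retriever callables) are illustrative stand-ins for the reasoning model and Rerank model, not Dify's actual API.

```python
# Illustrative sketch of the two recall strategies; the function
# names are hypothetical and do not correspond to Dify's real API.

def n_choose_1_recall(query, knowledge_bases, choose_kb):
    """N-choose-1: a reasoning model (stubbed here by `choose_kb`)
    picks a single knowledge base via function calling, and only
    that base is queried."""
    name = choose_kb(query, list(knowledge_bases))
    return knowledge_bases[name](query)

def multi_way_recall(query, knowledge_bases, rerank):
    """Multi-way: every knowledge base is queried, then a Rerank
    model scores and re-orders the merged candidate list."""
    candidates = []
    for retrieve in knowledge_bases.values():
        candidates.extend(retrieve(query))
    return sorted(candidates, key=lambda doc: rerank(query, doc), reverse=True)

# Stub knowledge bases and a toy rerank function (word overlap
# standing in for a real Rerank model's relevance score).
kbs = {
    "faq": lambda q: ["faq: returns policy"],
    "manual": lambda q: ["manual: setup guide"],
}
rerank = lambda q, doc: len(set(q.split()) & set(doc.split()))
```

The key difference is where the model sits: N-choose-1 spends one model call choosing a source up front, while multi-way pays for retrieval everywhere and lets the Rerank model sort it out afterwards.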
@@ -11,7 +11,7 @@ Configuring an LLM node primarily involves two steps:

 **Model Configuration**

-Before selecting a model suitable for your task, you must complete the model configuration in "System Settings—Model Provider". The specific configuration method can be referenced in the [model configuration instructions](https://docs.dify.ai/v/zh-hans/guides/model-configuration). After selecting a model, you can configure its parameters.
+Before selecting a model suitable for your task, you must complete the model configuration in "System Settings—Model Provider". The specific configuration method can be referenced in the [model configuration instructions](https://docs.dify.ai/tutorials/model-configuration#model-integration-settings). After selecting a model, you can configure its parameters.

 <figure><img src="../../../.gitbook/assets/image (10) (1).png" alt=""><figcaption></figcaption></figure>
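The parameters configured after selecting a model are the usual LLM sampling settings. A minimal sketch, assuming common parameter names (`temperature`, `top_p`, `max_tokens`) rather than a specific Dify schema:

```python
# Hypothetical illustration of post-selection model parameters;
# field names mirror common LLM settings, not Dify's exact schema.
from dataclasses import dataclass

@dataclass
class ModelConfig:
    provider: str
    model: str
    temperature: float = 0.7   # randomness of sampling
    top_p: float = 1.0         # nucleus-sampling cutoff
    max_tokens: int = 512      # cap on response length

# Example: a model chosen after the provider was set up in
# "System Settings—Model Provider".
cfg = ModelConfig(provider="openai", model="gpt-3.5-turbo", temperature=0.2)
```

Lower temperatures suit deterministic workflow steps; higher values suit open-ended generation.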
@@ -12,6 +12,6 @@ Configuring the Question Classifier Node involves three main components:

 **Selecting the Input Variable** In conversational customer scenarios, you can use the user input variable from the "Start Node" (sys.query) as the input for the question classifier. In automated/batch processing scenarios, customer feedback or email content can be utilized as input variables.

-**Configuring the Inference Model** The question classifier relies on the natural language processing capabilities of the LLM to categorize text. You will need to configure an inference model for the classifier. Before configuring this model, you might need to complete the model setup in "System Settings - Model Provider". The specific configuration method can be found in the [model configuration instructions](https://docs.dify.ai/v/zh-hans/guides/model-configuration). After selecting a suitable model, you can configure its parameters.
+**Configuring the Inference Model** The question classifier relies on the natural language processing capabilities of the LLM to categorize text. You will need to configure an inference model for the classifier. Before configuring this model, you might need to complete the model setup in "System Settings - Model Provider". The specific configuration method can be found in the [model configuration instructions](https://docs.dify.ai/tutorials/model-configuration#model-integration-settings). After selecting a suitable model, you can configure its parameters.

 **Writing Classification Conditions** You can manually add multiple classifications by composing keywords or descriptive sentences that fit each classification. Based on the descriptions of these conditions, the question classifier can route the dialogue to the appropriate process path according to the semantics of the user's input.
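The routing behavior in this hunk can be sketched as below. A real question classifier delegates the semantic judgment to the configured inference model; here a keyword match is a deliberately crude stand-in for that model, and the condition labels are invented for illustration.

```python
# Minimal sketch of routing by classification conditions. The
# keyword match stands in for the LLM's semantic classification;
# labels and keywords are hypothetical examples.

def classify(text, conditions, default="other"):
    """Return the first class whose keywords appear in the input,
    falling back to a default branch."""
    lowered = text.lower()
    for label, keywords in conditions.items():
        if any(kw in lowered for kw in keywords):
            return label
    return default

# Conditions authored per class, as the node's configuration does.
conditions = {
    "after_sales": ["refund", "return", "broken"],
    "product_question": ["how do i", "feature", "pricing"],
}
```

Each label would correspond to a downstream branch of the workflow; the default branch catches inputs no condition matched.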
@@ -9,4 +9,4 @@ Configuring a tool node generally involves two steps:

 1. **Authorizing the Tool/Creating Custom Tools**
 2. **Configuring Tool Inputs and Parameters**

-For guidance on creating custom tools and configuring them, please refer to the tool configuration instructions.
+For guidance on creating custom tools and configuring them, please refer to the [tool configuration instructions](https://docs.dify.ai/tutorials/quick-tool-integration).
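The two steps above can be sketched as below. The schema shape, tool name, and parameters are invented for illustration and do not reflect Dify's actual custom-tool format.

```python
# Hypothetical shape of a custom tool definition and its input
# parameters; the schema is illustrative, not Dify's actual format.

def make_tool(name, description, parameters, func):
    """Bundle a callable with the metadata a tool node would need."""
    return {
        "name": name,
        "description": description,
        "parameters": parameters,  # declared inputs for the node UI
        "func": func,
    }

weather_tool = make_tool(
    name="get_weather",
    description="Look up current weather for a city",
    parameters={"city": {"type": "string", "required": True}},
    func=lambda city: f"Weather in {city}: sunny",  # stubbed lookup
)

# A tool node would validate the configured inputs against
# `parameters` before invoking the underlying function:
result = weather_tool["func"]("Berlin")
```

Step 1 corresponds to registering (and authorizing) `weather_tool`; step 2 corresponds to mapping workflow variables onto its declared `parameters`.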