  • AI Model

Select the LLM model used in the application.

  • Prompt

Prompts guide the model's text generation. When using an LLM to generate text, you usually provide a prompt, i.e. a piece of text or keywords, and the model generates the subsequent content based on it. A prompt helps the model understand the user's intent and topic, so the generated text better matches the user's needs. A well-designed prompt gives a degree of control over the content and style of the output, making it more accurate and fit for purpose.

  • Temperature

The temperature parameter controls the diversity of the model's output, i.e. the creativity and randomness of the generated text. A higher temperature makes the generated text more diverse and random; a lower temperature makes it more deterministic and consistent. Adjust the temperature to balance diversity against accuracy for your specific application.
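Conceptually, temperature rescales the model's token logits before sampling. The sketch below is illustrative only (the function name and logit values are assumptions, not part of this application's API):

```python
import math
import random

def sample_with_temperature(logits, temperature=1.0):
    """Sample a token index from logits scaled by temperature.

    Lower temperature sharpens the distribution (more deterministic);
    higher temperature flattens it (more random)."""
    if temperature <= 0:
        # Greedy decoding: always pick the most likely token.
        return max(range(len(logits)), key=lambda i: logits[i])
    scaled = [l / temperature for l in logits]
    # Numerically stable softmax over the scaled logits.
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one index according to the resulting probabilities.
    r = random.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1
```

With a very low temperature the most likely token dominates; with a high temperature the choice spreads across many tokens.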

  • Response limit

Sets the upper limit on the reply, i.e. the maximum number of tokens in the text generated by the model. A value of 0 means no limit; in that case the maximum length depends on the model's own output limit. Note that this limits the reply tokens only, not the context tokens.
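The semantics of the limit can be sketched as follows (a minimal illustration, assuming a hypothetical token generator and a stand-in `hard_cap` for the model's own output limit):

```python
def apply_response_limit(generate_tokens, max_tokens=0, hard_cap=4096):
    """Collect reply tokens until the configured limit is reached.

    max_tokens == 0 means "no limit": generation is bounded only by
    the model's own maximum output length (hard_cap stands in for it).
    """
    limit = hard_cap if max_tokens == 0 else min(max_tokens, hard_cap)
    tokens = []
    for tok in generate_tokens():
        tokens.append(tok)
        if len(tokens) >= limit:
            break  # Stop the reply; the context window is unaffected.
    return tokens
```

For example, with `max_tokens=3` a six-token reply is cut off after three tokens, while `max_tokens=0` lets the full reply through.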

  • Citation Content Templates
This configuration takes effect only when reference content is passed in (knowledge base search).
You can customize the structure of the reference content to better suit different scenarios. Some variables can be used for template configuration:
{{q}} - retrieved content, {{a}} - expected content, {{source}} - source, {{sourceId}} - source file name, {{index}} - the index of the reference (the nth reference), {{with}} - the reference score (0-1). All of these are optional. Here are the default values:
<data>
{{q}}
{{a}}    
</data>
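Rendering such a template amounts to substituting the variables above into the template string for each retrieved chunk. A minimal sketch (the renderer function and sample values are illustrative, not this application's actual implementation):

```python
def render_citation(template, chunk):
    """Fill {{...}} placeholders in a citation template.

    `chunk` maps variable names (q, a, source, ...) to their values;
    any placeholder without a value is simply left in place.
    """
    out = template
    for key, value in chunk.items():
        out = out.replace("{{%s}}" % key, str(value))
    return out

default_template = "<data>\n{{q}}\n{{a}}\n</data>"
chunk = {"q": "retrieved text", "a": "expected text"}
print(render_citation(default_template, chunk))
```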
  • Quote content prompts
This configuration takes effect only when the knowledge base is searched.
You can use {{quote}} to insert the reference content template and {{question}} to insert the question. Here are the default values:
Use the contents of the <data></data> tag as your knowledge:

{{quote}}

Answer the request:
- If you are not clear about the answer, you need to clarify.
- Avoid referring to the knowledge you gained from the data.
- Keep your answer as described in the data.
- Use Markdown syntax to optimize the format of your answer.
- Answer in the same language as the question.

question: "{{question}}"
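Putting the two configurations together: the rendered citation chunks are joined into {{quote}} and the user's question fills {{question}}. A sketch under those assumptions (the join separator and function name are illustrative):

```python
def build_quote_prompt(prompt_template, rendered_quotes, question):
    """Assemble the final prompt from the quote prompt template.

    `rendered_quotes` are citation chunks already rendered with the
    citation content template; they are joined to form {{quote}}.
    """
    quote_block = "\n\n".join(rendered_quotes)
    return (prompt_template
            .replace("{{quote}}", quote_block)
            .replace("{{question}}", question))
```

The default prompt above would thus receive all retrieved `<data>` blocks in place of {{quote}} and the raw user query in place of {{question}}.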
  • Choose Dataset

An application can be associated with multiple knowledge bases. Each time it talks to a user, the system first vectorizes the user's query and then performs a similarity comparison against the associated knowledge bases. When the similarity exceeds the set threshold, the relevant content is passed to the LLM model as context. Users can customize the similarity threshold and the maximum length of the query results. A larger similarity value demands a closer match to the data, while a smaller value matches more flexibly.
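The retrieval step described above can be sketched as a cosine-similarity search over embedded chunks. This is a simplified illustration (the function names, threshold default, and data layout are assumptions):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def search_knowledge_base(query_vec, chunks, threshold=0.8, max_chunks=5):
    """Return chunk texts whose similarity to the query exceeds the
    threshold, best matches first. `chunks` maps text -> embedding."""
    scored = [(cosine_similarity(query_vec, vec), text)
              for text, vec in chunks.items()]
    # Keep only matches above the user-configured threshold.
    scored = [(s, t) for s, t in scored if s >= threshold]
    scored.sort(reverse=True)
    return [t for _, t in scored[:max_chunks]]
```

Raising `threshold` returns fewer, closer matches; lowering it admits looser ones, mirroring the trade-off described above.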

  • Variable

Before the dialog begins, the user can be asked to enter something as a specific variable for that round. This module sits after the opening guide. Variables can be inserted as {variable key} into other modules with string-type inputs, e.g. prompts, qualifiers, etc.

  • Opening Guide

Sends an initial message before each conversation starts. Standard Markdown syntax is supported, with additional markup available: [Quick Button]: the user can click the button to send that question directly.

The documentation on this page is being continuously improved. If you have any questions about AI configuration and would like our engineers to assist, please contact us.