Version: 6.18 (draft)

LLM connections

Introduction to LLM connections

The LLM connections page allows you to manage the connections to various language models. You can use a model provided by a company such as OpenAI, or a self-hosted model.

Settings for LLM connections

The following settings are available:

  • Provider type: Select OpenAI, Azure or Ollama to use a pre-supported provider, which comes with a built-in configuration. Select Custom if you need to connect a model that does not have standard support. In this case, you may need to create a script that implements the communication.
  • Model URL: Enter the URL where the model can be accessed. This URL acts as the bridge between your system and the AI model.
  • Model name: Specify the exact name of the model you wish to connect to.
  • API key: Enter your API key for the model. You obtain it from the LLM provider.
  • Token limit: Tokens are units of text processed by the AI. Setting a limit controls how much data you can send in one request. The default value is 500. You can increase this value if responses are cut off, or you notice that the entire input was not considered. Higher limits allow for more comprehensive processing but may increase costs.
  • Use Privacy Purger: A feature to enhance data privacy by removing personal information from texts before they are sent to the AI model. Select True if personal data should be removed from the text before sending it to the AI. The text will first be sent to the Privacy Purger application hosted by ConSol. The Privacy Purger returns text where personal data has been replaced by general tags. This text is then sent to the AI.
  • Privacy Purger URL: Enter the URL of the Privacy Purger. You can obtain the URL from the ConSol CM Support or Sales team.
  • Privacy Purger password: Enter the Privacy Purger password. You can obtain it from the ConSol CM Support or Sales team.
  • Script: You can select an AI integration type script if you want to use your own code to send data to the LLM. This might be necessary when adding Custom models that expect a syntax not covered by the standard implementation.

Basic tasks

Working with LLM connections

You can perform the following actions on LLM connections:

  • Configure the default LLM connection: The default connection is created automatically. Provide the connection settings and click the Update button to save the settings. This action is mandatory if you want to use the prompt wizard, see Using the prompt wizard.
  • Create a new LLM connection: Click the New configuration button in the header and enter the name of the configuration. Afterwards, you can provide the connection settings.
  • Update an LLM connection: Modify the connection settings and click the Update button.
  • Test an LLM connection: Click the Test connection button to check if the connection works correctly.
  • Rename an LLM connection: Click the Rename configuration button in the header to change the name of the connection.
  • Remove an LLM connection: Click the Remove configuration button in the header to delete the connection.

The settings are saved to system properties.

Advanced tasks

Using LLM connections in scripts

Use the methods from AIClientService to send requests to an LLM. You need to pass the name of the configuration, the system message, and the user message as parameters:

aIClientService.send("myConfiguration", "mySystemMessage", "myUserMessage")

If the request should be sent to the Privacy Purger first, use the following method:

aIClientService.send("myConfiguration", "mySystemMessage", "myUserMessage", true)
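The two method variants above can be combined into a small workflow script, for example to summarize a text. The following is a minimal sketch: the configuration name "myConfiguration" and the prompt texts are placeholder values, and it assumes that a logging object (log) is available in the script context:

// Example values; replace the configuration name and messages with your own.
def systemMessage = "Summarize the following text in two sentences."
def userMessage = "Text to be summarized ..."

// Send the request directly to the LLM.
def answer = aIClientService.send("myConfiguration", systemMessage, userMessage)

// Alternatively, pass true as the fourth parameter to route the text
// through the Privacy Purger before it is sent to the LLM.
def purgedAnswer = aIClientService.send("myConfiguration", systemMessage, userMessage, true)

log.info("LLM answer: " + answer)

Whether the second variant is needed depends on your data privacy requirements; it requires the Privacy Purger settings described above to be configured for the connection.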