AI extensions
The Web Admin Suite has been extended with the new AI menu, which contains two pages for AI extensions:
- Prompts
- LLM connections
Prompts (#663049, #663649)
The Prompts page allows you to manage the AI prompts used in ConSol CM. There are two kinds of prompts:
- System prompts are used by the product itself. They are created automatically when a new AI-based feature is introduced and usually do not need to be edited. Two prompts for the prompt wizard are added by default:
- prompt generation assistant DE
- prompt generation assistant EN
- Custom prompts are used in scripts which are part of the custom scene created to adjust the product to a specific use case. They are created and managed by the customer.
This makes it possible to edit prompts separately from the code, which increases flexibility and makes it easy to adopt the latest best practices in prompt engineering.
The following settings exist for prompts:
- Name: The name used to reference the prompt in scripts.
- Type: When you create a prompt, its type is set to Custom prompt. Prompts used by ConSol CM itself have the type System prompt; they cannot be deleted.
- Prompt text: The text of the prompt which is sent to an LLM.
- Description: A description for the prompt (optional).
The prompt wizard helps you create suitable prompts, lowering the entry barrier for AI integration. Its four steps guide you through the information you need to provide to craft a high-quality prompt tailored to your specific use case.
To use the prompt wizard, you first need to provide settings for the default connection on the LLM connections page.
The option AI prompts has been added to the Staging export page, so that prompts can be exported separately. During import, the last modification date is considered, i.e. an existing prompt is only overwritten on the target system if the prompt in the import file has a newer modification date.
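The overwrite rule described above can be sketched as follows. This is a hypothetical helper for illustration only, not the actual import code; the method name `shouldOverwrite` is an assumption.

```java
import java.time.Instant;

// Hypothetical sketch of the overwrite rule applied during staging import:
// an existing prompt is replaced only if the imported copy is newer.
public class PromptImportRule {
    static boolean shouldOverwrite(Instant existingModified, Instant importedModified) {
        return importedModified.isAfter(existingModified);
    }

    public static void main(String[] args) {
        Instant existing = Instant.parse("2024-05-01T10:00:00Z");
        Instant older = Instant.parse("2024-01-01T10:00:00Z");
        Instant newer = Instant.parse("2024-06-01T10:00:00Z");
        System.out.println(shouldOverwrite(existing, newer)); // imported copy is newer
        System.out.println(shouldOverwrite(existing, older)); // imported copy is older
    }
}
```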
The interface AIPromptService has been added to the ConSol CM API to allow using prompts in scripts.
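The actual methods of AIPromptService are not listed here. As a purely hypothetical sketch, the following stub assumes a lookup method returning the prompt text by name and a `{text}` placeholder convention; both are illustrative assumptions, not the real API.

```java
// Hypothetical sketch only: the real AIPromptService interface in the
// ConSol CM API is not reproduced here and its method names may differ.
interface AIPromptService {
    // assumed lookup method: return the prompt text for a prompt name
    String getPromptText(String promptName);
}

public class PromptLookupExample {
    // A script might fetch a prompt by name and fill in a placeholder
    // before sending it to the LLM. "{text}" is an assumed placeholder syntax.
    static String buildPrompt(AIPromptService service, String name, String input) {
        return service.getPromptText(name).replace("{text}", input);
    }

    public static void main(String[] args) {
        // stub implementation standing in for the real service
        AIPromptService service =
                promptName -> "Summarize the following ticket in two sentences: {text}";
        System.out.println(buildPrompt(service, "ticket summary",
                "Customer reports login failures since Monday."));
    }
}
```

Keeping the prompt text behind a name lookup is what allows administrators to tune the wording on the Prompts page without touching the script.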
LLM connections (#663688)
The LLM connections page allows you to manage connections to large language models. You can use a model provided by a company such as OpenAI, or a self-hosted model.
The following settings can be made:
- Provider type: Choose the provider. For OpenAI, Azure and Ollama, you can use the default implementation for the communication. Choose Custom if you want to connect to another provider. In this case, you might need to provide a script in the Script field which implements the communication.
- Model URL: Enter the URL where the model can be reached.
- Model name: Enter the exact name of the model which you are connecting to.
- API key: Enter your API key for the model.
- Token limit: Enter the maximum number of tokens which are sent to the AI. The default value is 500. You can increase the value if the responses are cut off or if you notice that not the whole input was considered. Please note that this also increases the cost of the request.
- Use Privacy Purger: Set to true if personal data should be removed from the text before it is sent to the AI. The text is first sent to the Privacy Purger application hosted by ConSol, which returns the text with the personal data replaced by generic tags. This text is then sent to the AI.
- Privacy Purger URL: Enter the URL of the Privacy Purger. You can obtain it from the ConSol CM support or sales team.
- Privacy Purger password: Enter the password of the Privacy Purger. You can obtain it from the ConSol CM support or sales team.
- Script: You can select a script of the type AI integration if you want to provide custom code for sending data to the LLM. This might be needed if you add models of the type Custom, which expect a syntax not covered by the default implementation.
The settings are saved to system properties in the module cmas-core-server. The property names start with ai.<name of the configuration>.
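As a purely hypothetical illustration of this naming pattern (the actual property suffixes are not listed in this section and may differ), a connection named default could produce property names such as:

```
# hypothetical examples only – the actual suffixes may differ
ai.default.modelUrl
ai.default.apiKey
```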
An empty default configuration is added automatically on setup and update. You need to fill it in with the data of the desired LLM provider in order to use the prompt wizard on the Prompts page.
The interface AIClientService has been added to the ConSol CM API to allow sending requests to the LLMs. If the configuration contains a script, this script is used for processing the request. Otherwise, the request is sent using the default implementation for the selected provider type.
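The dispatch described above can be sketched as follows. The real AIClientService interface and connection class are not reproduced here; `LlmConnection`, `sendRequest` and `defaultImplementation` are illustrative stand-ins.

```java
import java.util.Optional;
import java.util.function.Function;

// Hypothetical sketch: a script configured for the connection takes
// precedence over the built-in provider implementation.
public class ClientDispatchExample {
    static class LlmConnection {
        final String providerType;
        final Optional<Function<String, String>> script; // optional custom handler
        LlmConnection(String providerType, Function<String, String> script) {
            this.providerType = providerType;
            this.script = Optional.ofNullable(script);
        }
    }

    static String sendRequest(LlmConnection conn, String prompt) {
        return conn.script
                .map(s -> s.apply(prompt))
                .orElseGet(() -> defaultImplementation(conn.providerType, prompt));
    }

    // placeholder for the built-in OpenAI/Azure/Ollama handling
    static String defaultImplementation(String provider, String prompt) {
        return "[" + provider + "] " + prompt;
    }

    public static void main(String[] args) {
        LlmConnection openAi = new LlmConnection("OpenAI", null);
        LlmConnection custom = new LlmConnection("Custom", p -> "custom: " + p);
        System.out.println(sendRequest(openAi, "Hello"));
        System.out.println(sendRequest(custom, "Hello"));
    }
}
```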
The new script type AI integration has been added to the Scripts page. The script template contains sample code for sending a request to an LLM and to the Privacy Purger. In both methods, the configuration created on the LLM connections page must be provided as a parameter.
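A hypothetical sketch of what such a script might do — the actual template and the ConSol CM API are not reproduced here. It shows the flow described above: the text is first purged of personal data, then sent to the LLM, with the connection configuration passed to both calls. All names (`Config`, `purge`, `sendToLlm`, `process`) are illustrative assumptions.

```java
// Hypothetical sketch of an AI integration script; not the actual template.
public class IntegrationScriptExample {
    static class Config {
        final String name;
        final boolean usePrivacyPurger;
        Config(String name, boolean usePrivacyPurger) {
            this.name = name;
            this.usePrivacyPurger = usePrivacyPurger;
        }
    }

    // toy purger: replaces email-like tokens with a generic tag, standing in
    // for the call to the Privacy Purger application
    static String purge(Config cfg, String text) {
        return cfg.usePrivacyPurger ? text.replaceAll("\\S+@\\S+", "<EMAIL>") : text;
    }

    // placeholder for the HTTP request to the configured LLM
    static String sendToLlm(Config cfg, String prompt) {
        return "[" + cfg.name + "] " + prompt;
    }

    // the configuration is passed to both the purger and the LLM call
    static String process(Config cfg, String text) {
        return sendToLlm(cfg, purge(cfg, text));
    }

    public static void main(String[] args) {
        Config cfg = new Config("default", true);
        System.out.println(process(cfg, "Contact john.doe@example.com about the login issue"));
    }
}
```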