Version: 6.18

AI features

AI features can be used to facilitate the configuration of ConSol CM in the Web Admin Suite and to implement quality-of-life features for end users in the Web Client. They usually consist of API methods; in some cases, they also have GUI elements.

The following AI features exist:

  • Automatic translation: Generate translations for localized names and descriptions in the Web Admin Suite, and use automatic translations in scripts, e.g. to automatically translate incoming communication to English.
  • Case summary: Generate a summary of the case communication in scripts, e.g. to allow end users to get a quick overview of the case.

Automatic translation

Automatic translation requires an external translation provider, either an LLM (Large Language Model) or DeepL. It simplifies the localization of ConSol CM into multiple languages by eliminating the need to manually enter translations.

warning

The translation provider is not included with ConSol CM. You must supply your own API credentials.

If automatic translation is configured, a Translate icon is shown in the following places:

  • Internal name: Only available if an LLM is configured as the translation provider. Generates a user-friendly label based on the technical name and uses it as the localized name in the default language. This label is then translated into the other languages.
  • Localized name: Translates the field value to the other languages. Existing values in other languages will be overwritten.
  • Files tab of the portal configuration: Opens a modal window where the user can select the source language and one or more target languages. The localization.json and components_localization.json files for the selected target languages are created automatically. The multilingual JSON files (e.g. public.json, welcome.json) are extended with the new language content.
Best practice

It is recommended to use English as the source language for better translation quality. DeepL typically provides faster results than LLMs.

Configuring automatic translation

Automatic translation is configured on the System properties page.

Using DeepL

Please proceed as follows to configure DeepL:

  1. Set the property cmas-core-server, translationService.provider to deepl.
  2. The property cmas-core-server, translationService.deepl.apiUrl is preconfigured with the current DeepL API URL. Change this only if necessary.
  3. Enter your DeepL API key in the property cmas-core-server, translationService.deepl.apiKey.
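The three steps above amount to setting three system properties in the cmas-core-server module. A sketch with illustrative placeholder values (the URL and key shown here are not real values):

```
# Module: cmas-core-server – placeholder values for illustration
translationService.provider      = deepl
translationService.deepl.apiUrl  = <preconfigured DeepL API URL – change only if necessary>
translationService.deepl.apiKey  = <your DeepL API key>
```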
warning

The Privacy Purger is not used when DeepL is the translation provider.

Using an LLM

Please proceed as follows to configure an LLM:

  1. Set the property cmas-core-server, translationService.provider to llm.
  2. Configure the LLM connection on the LLM connections page.
  3. If you do not want to use the default connection, enter the desired connection name in the property cmas-core-server, translationService.llm.configurationName.
  4. The prompt used for translation is called the translation prompt. It is automatically created as a system prompt on the Prompts page, where you can fine-tune it if needed. Alternatively, you can define a custom prompt and assign it using the property cmas-core-server, translationService.llm.promptName.
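The optional steps above correspond to the following system properties in the cmas-core-server module; the values shown are placeholders, and both properties can be omitted if the defaults suit you:

```
# Module: cmas-core-server – placeholder values for illustration
translationService.provider              = llm
translationService.llm.configurationName = <LLM connection name, if not using the default>
translationService.llm.promptName        = <custom prompt name, if not using the translation prompt>
```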
info

The system properties cmas-core-server, translationService.llm.debugEnabled and cmas-core-server, translationService.llm.jsonSchemaEnabled are intended for debugging and do not require configuration in most cases.

info

Even if a Privacy Purger is configured in the LLM connection, it is not used for automatic translation.

Using automatic translations in scripts

You can use the method translate of the class TranslationService to translate a string into one or more target languages. The following example translates a string from English to German:

translationService.translate("my text", "en", "de")
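A slightly longer sketch of the use case mentioned above, translating incoming communication to English. It builds on the translate(text, sourceLanguage, targetLanguage) form shown in the example; the sample German text and the use of the result are illustrative assumptions:

```groovy
// Hedged sketch: translate an incoming message to English before further
// processing. translationService is assumed to be available in the script
// context, as in the example above.
def incomingText = "Guten Tag, ich habe ein Problem mit meinem Drucker."
def englishText = translationService.translate(incomingText, "de", "en")
// Use the translated text, e.g. store it as a comment or pass it to another script.
```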

Case summary

Case summary requires an LLM (Large Language Model). It allows you to generate a summary of the case communication by script. For example, you can add an activity which generates a summary to your workflow, so that users can get a quick overview of the communication which took place in the case.

warning

The LLM is not included with ConSol CM. You must supply your own API credentials.

Configuring case summary

The case summary feature is configured on the System properties page.

Please proceed as follows to configure case summary:

  1. Configure the LLM connection on the LLM connections page.
  2. If you do not want to use the default connection, enter the desired connection name in the property cmas-core-server, summaryService.llm.configurationName.
  3. The prompt used for summarizing is called the summary prompt. It is automatically created as a system prompt on the Prompts page, where you can fine-tune it if needed. Alternatively, you can define a custom prompt and assign it using the property cmas-core-server, summaryService.llm.promptName.
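The optional steps above correspond to the following system properties in the cmas-core-server module; the values shown are placeholders, and both properties can be omitted if the defaults suit you:

```
# Module: cmas-core-server – placeholder values for illustration
summaryService.llm.configurationName = <LLM connection name, if not using the default>
summaryService.llm.promptName        = <custom prompt name, if not using the summary prompt>
```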
info

Even if a Privacy Purger is configured in the LLM connection, it is not used for case summaries.

Using case summary in scripts

You can use the method summarizeTicket of the class TicketService. There are three method signatures available, which allow you to define:

  • whether the generated summary should be written to the case history
  • whether the entire case communication or only specific case entries should be included

The following example shows how to write the summary of the whole case communication to a comment with the text class Summary:

summarizeTicket(ticket, true, contentEntryClassService.getByName("summary"))
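Building on the example above, a hedged sketch of the first choice the method signatures cover, i.e. whether the summary is written to the case history. The boolean parameter position follows the example; the availability of contentEntryClassService and the exact overloads are assumptions:

```groovy
// Sketch only – parameter order follows the example above:
// summarizeTicket(ticket, writeToHistory, textClass)

// Generate the summary without writing it to the case history,
// e.g. to display it elsewhere:
def summary = summarizeTicket(ticket, false, contentEntryClassService.getByName("summary"))

// Generate the summary and write it to the case history as a comment
// with the "summary" text class:
summarizeTicket(ticket, true, contentEntryClassService.getByName("summary"))
```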