We are releasing a small interim update with improvements to multilingual support, the export filters and the term checks, and we are simplifying the use of our API:
- Filter within exports
- Fallback languages for language varieties
- Term checks: Ignore term hits
- Create API tokens more easily
- OpenAPI YAML generator for ChatGPT Actions
Filter within exports
Exports in Lexeri can now be narrowed down by further attributes in addition to the languages.
This is helpful for systematic reuse of the terminology, e.g. if the terminology is to be built up step by step in the missing languages in a third-party system.
In addition to the languages, you can now define the following attributes for exports:
- Term status
- Subject areas
- Tags
- Missing languages
The corresponding filters are then applied in the background, and only terms that match these attributes are included in the export.
The filter function is available for all individual export formats.
Fallback languages for language varieties
If you create language varieties in Lexeri, such as Austrian German or American English, you can now also select a fallback language.
The idea behind this is that in a language variety only a few terms deviate from the standard language, while the majority of the standard vocabulary still applies.
This means you do not have to maintain the complete standard vocabulary in the language variety; you only add the terms that differ to the variety's entries. If an entry contains no variety-specific term, i.e. no deviation from the standard language, the entry is still displayed in the search and in the term checks via the fallback language.
The option to configure fallback languages can be found in the settings of your termbase.
Term checks: Ignore term hits
In future, you will be able to ignore terms found in the term checks.
If a term is incorrectly recognized at a certain point in a text, you will be able to ignore it by clicking on it.
Lexeri remembers this choice so that this term is no longer listed in the hit list for future term checks in this context.
Create API tokens more easily
If you want to use the Lexeri API to access Lexeri functions from external systems, you can now create the API tokens required for authentication directly in the settings of your termbase.
In the "API" tab, you will find a button for creating an API token. When creating the token, you can choose how long it remains valid and whether it is allowed to make changes to the terminology.
OpenAPI YAML generator for ChatGPT Actions
In the new API tab in the termbase settings, you will also find an option to generate an OpenAPI YAML file. We have designed this API documentation so that it can be added as an Action in custom GPTs, for example. This allows you to give ChatGPT or a custom Microsoft Copilot access to the Lexeri API, so that, for example, the terminology generated by these tools can be checked directly for accuracy.
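If you want to review which operations the generated specification would expose to a GPT Action before adding it, a small script can list them. This is a hedged sketch: the file name is an assumption, and it relies only on the standard OpenAPI structure (paths, HTTP methods, summaries) plus the js-yaml package.

```typescript
// Hedged sketch: list the operations described in the generated OpenAPI YAML,
// e.g. to review what a GPT Action configured with it could call.
// The file name "lexeri-openapi.yaml" is an assumption for illustration.
import { readFileSync } from "node:fs";
import { load } from "js-yaml";

interface OpenApiSpec {
  paths?: Record<string, Record<string, { summary?: string } | undefined>>;
}

const spec = load(readFileSync("lexeri-openapi.yaml", "utf8")) as OpenApiSpec;
const httpMethods = ["get", "post", "put", "patch", "delete"] as const;

for (const [path, item] of Object.entries(spec.paths ?? {})) {
  for (const method of httpMethods) {
    const operation = item?.[method];
    if (operation) {
      console.log(`${method.toUpperCase()} ${path} - ${operation.summary ?? ""}`);
    }
  }
}
```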