Lifecycle Management

Manage your AI Assistant from the initial idea to the agile rollout of valuable new features.

A/B Testing

The A/B Testing feature of the Platform allows you to compare two AI Assistants in live operation. This lets you test and evaluate every aspect of an AI Assistant, from small wording adjustments to different dialogue strategies or differently trained NLU models. You can flexibly control which user segments see each variant.
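Segment-based playout of variants is typically implemented by bucketing users deterministically, so the same user always sees the same variant, while specific segments can be pinned to one variant. The following is a minimal sketch of that idea; the function and parameter names (`assign_variant`, `segment_overrides`) are illustrative assumptions, not the Platform's actual API.

```python
import hashlib

def assign_variant(user_id: str, segment: str,
                   segment_overrides: dict, split: float = 0.5) -> str:
    """Assign a user to variant 'A' or 'B'.

    Users in a segment with an explicit override always get that
    variant; everyone else is bucketed deterministically by hashing
    their id, so repeat visits see a consistent experience.
    """
    if segment in segment_overrides:
        return segment_overrides[segment]
    # Hash the user id to a stable number in [0, 1).
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0x100000000
    return "A" if bucket < split else "B"

# Example: users in the (hypothetical) "new_customers" segment always
# see variant B; all other users are split 50/50.
overrides = {"new_customers": "B"}
print(assign_variant("user-123", "returning", overrides))
print(assign_variant("user-456", "new_customers", overrides))  # prints "B"
```

Deterministic bucketing matters here: a random coin flip per request would show the same user both dialogue strategies, contaminating the comparison.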

Deployment Process

The Platform provides a fully managed, fast, and transparent deployment process to activate and test updates to your AI Assistant in the development environment. This covers all functional updates as well as the administration, training, and provisioning of your machine learning model.


Manage the development process of your AI Assistant to the highest professional standards. The Platform offers development, test, and live environments as well as a revision history. This means you never have to extend your AI Assistant's functionality on a running live system, and you can test changes extensively before putting them into operation.

One-click Deployment and Hosting
Deployment Status

Unit Testing

Define unit tests to check the functionality of the entire AI Assistant for errors at each deployment. The unit tests take the form of predefined test dialogues and can be defined at both message and intent level. In addition to the dialogue structure of the AI Assistant, they also test the performance of the NLU model and the dialogue context.
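A predefined test dialogue can be thought of as a scripted conversation with expectations at both levels: which intent the NLU model should recognise and which message the assistant should reply with. The sketch below illustrates this under stated assumptions; `classify_intent`, `reply`, and `run_test_dialogue` are hypothetical stand-ins, not the Platform's real test interface.

```python
def classify_intent(utterance: str) -> str:
    """Toy stand-in for the trained NLU model."""
    if "order" in utterance.lower():
        return "check_order_status"
    return "fallback"

def reply(intent: str, context: dict) -> str:
    """Toy stand-in for the assistant's dialogue logic."""
    if intent == "check_order_status":
        context["topic"] = "orders"  # dialogue context is updated here
        return "Sure, what is your order number?"
    return "Sorry, I did not understand that."

def run_test_dialogue(steps):
    """Replay a predefined dialogue, checking intent and message level."""
    context = {}
    for user_msg, expected_intent, expected_reply in steps:
        intent = classify_intent(user_msg)
        assert intent == expected_intent, f"intent mismatch: {intent}"
        answer = reply(intent, context)
        assert answer == expected_reply, f"message mismatch: {answer}"
    return context

# One test case asserting the recognised intent (intent level) and the
# exact assistant reply (message level).
final_context = run_test_dialogue([
    ("Where is my order?", "check_order_status",
     "Sure, what is your order number?"),
])
assert final_context["topic"] == "orders"  # the context is verified too
```

Running such dialogues at each deployment catches regressions in the dialogue structure, the NLU model, and the carried-over context before a change reaches the live environment.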