Teneo 7.3 at a Glance

Teneo 7.3 comes with a range of new features related to Generative AI, integrated use of Microsoft Conversational Language Understanding (CLU), Teneo Studio Web features, Teneo API versioning, and much more. You can find all the details in the reference documentation on Teneo Developers.

In this article, we are going to highlight some of the new features and options that you get with the 7.3 update:

  • The Teneo Generative QnA template solution that lets you implement a RAG architecture with Teneo in no time.
  • The native CLU integration that enables you to create, train and manage a CLU model from inside Teneo Studio.
  • The new deferred intent classification mechanism, which saves you CLU calls (and costs!) when a call to a machine learning model is not needed.
  • The enhanced options for adding and evaluating test examples in your Class Manager in Teneo Studio.
  • The new Teneo API versioning, which sets you up for friction-free API changes in future releases.

Teneo Generative QnA Template Solution

Teneo 7.3 introduces a new template solution that can be used as a starting point for a new project: Teneo Generative QnA.

The Teneo Generative QnA template solution is a starter kit that gets you off to a quick start when you want to handle business-relevant user queries in an easy, efficient, and controlled manner. The setup uses a Retrieval Augmented Generation (RAG) architecture, in which the use of LLM and GenAI technologies is grounded in your own collection of documents and websites to ensure accuracy and relevance in the responses.
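To make the RAG idea concrete, here is a minimal sketch of the pattern in Python. The document list, the keyword-overlap retrieval and the prompt builder are illustrative assumptions, not the template solution's actual retrieval or LLM integration, which are configured inside Teneo.

```python
# Minimal, illustrative sketch of the RAG pattern: retrieve relevant passages
# from your own content, then ground the LLM prompt in those passages.
# Everything below is a stand-in, not Teneo's implementation.
from typing import List

DOCUMENTS = [
    "Teneo Studio lets you build conversational AI solutions.",
    "The Generative QnA template grounds LLM answers in your own documents.",
    "CLU models can be trained and managed from inside Teneo Studio.",
]

def retrieve(question: str, docs: List[str], top_k: int = 2) -> List[str]:
    """Score documents by naive keyword overlap and return the best matches."""
    q_terms = set(question.lower().split())
    ranked = sorted(docs, key=lambda d: len(q_terms & set(d.lower().split())), reverse=True)
    return ranked[:top_k]

def build_prompt(question: str, passages: List[str]) -> str:
    """Build a grounded prompt: the LLM may only answer from the retrieved passages."""
    context = "\n".join(f"- {p}" for p in passages)
    return (
        "Answer the question using only the context below.\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )

if __name__ == "__main__":
    question = "How does the Generative QnA template keep answers accurate?"
    prompt = build_prompt(question, retrieve(question, DOCUMENTS))
    print(prompt)  # this grounded prompt would then be sent to your LLM of choice
```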

To start using the new template solution, create a new solution in Teneo Studio, choose English as the language, and select the new template solution in the drop-down menu.

The template solution contains all the Teneo content you need to start answering user questions with Generative AI in your project. The settings in the template solution are fully exposed and configurable, giving you full control to adapt, add or remove features: it can be fine-tuned, enhanced further, or just serve as inspiration when developing a new conversational AI project.

Follow the step-by-step instructions in the User Guide included in the resources of the solution, or follow the Teneo Generative QnA guide on Teneo Developers, to adjust your template solution and set up your own data repository.

To learn more about the Teneo Generative QnA template solution, the RAG architecture and how this setup can benefit you in your project, watch the video below:

Native CLU Integration

Another major update that comes with Teneo 7.3 is the native CLU Integration. With the CLU integration you get the option to create, train and manage models with this state-of-the-art intent classifier directly from within Teneo Studio.

Watch the video below to learn how to

  • add your CLU account to Teneo Studio
  • find the CLU manager and train a CLU model
  • assign the CLU model to the solution and use the CLU classes in the triggers
  • use Tryout to see the CLU annotation
  • manage training examples
  • evaluate the performance of the model with Test Data Evaluation
  • include the CLU model in the stable version of your solution

CLU models can be used across solutions on the same environment when these are created in relation to each other using the branch functionality. CLU models are multilingual by default, which means that sharing a model in a multi-language setup is possible out of the box. Multi-brand projects that share the same set of intents but want to work in different solutions can also take advantage of, and contribute to, the same CLU model.

It is worth noting that the ability to add and manage the CLU account itself requires a certain level of user permissions: the user must have either the Teneo Admin permission or the new Modify Account Settings user permission. Also, the CLU credentials are encrypted and become invisible once they have been saved to Teneo.

Keep in mind that using CLU as the intent classifier is optional, and you must bring your own CLU account. Teneo Learn continues to be the out-of-the-box classifier and is always available, for example as a back-up classifier in case of CLU downtime, or during development for on-the-spot classifier updates.

Deferred Intent Classification

With the introduction of the native CLU integration to Teneo Studio, a new approach to intent classification has also been introduced. In previous versions of Teneo, all inputs were sent to the classifier for intent classification, regardless of the intent recognition method expected at the next step of the conversation (TLML or machine learning).

With deferred intent classification, inputs are only sent to the classifier when there is a reason to do so, in other words, when the intent recognition could be based on a machine-learned classification. If the intent recognition is handled by TLML only, the input will not be sent to the classifier.

This new approach addresses the need to avoid unnecessary calls to CLU, as these come with an additional cost. In flows like the one below, the flow design clearly indicates that the second input is expected to fulfil a defined TLML match requirement, so there is no need to call the classifier when one of the TLML match requirements is met.

Should the input not match either of the expected TLML match requirements, and the intent recognition process needs to look for other possible matches, intent classification will kick in to make sure the Teneo Engine has all relevant information available to decide on the appropriate next step in the conversation.
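Conceptually, the deferral works like the sketch below. This is not Teneo Engine code; the match-requirement functions and the classifier call are hypothetical stand-ins, used only to illustrate when the classifier is, and is not, invoked.

```python
# Conceptual sketch of deferred intent classification (not Teneo Engine code):
# the external classifier is only called when no TLML match requirement can
# decide the next step on its own.

def handle_input(user_input, tlml_requirements, call_classifier):
    # 1. Try the TLML match requirements expected at this point in the flow.
    for requirement in tlml_requirements:
        if requirement(user_input):
            return {"matched": requirement.__name__, "classifier_called": False}

    # 2. No TLML requirement matched: intent classification kicks in so the
    #    engine has class annotations available when looking for other matches.
    classes = call_classifier(user_input)
    return {"matched": None, "classes": classes, "classifier_called": True}

# Stand-in match requirement and classifier (assumptions for illustration):
def wants_refund(text):
    return "refund" in text.lower()

def fake_clu(text):
    return [("ORDER_STATUS", 0.82)]

print(handle_input("I want a refund", [wants_refund], fake_clu))     # no classifier call
print(handle_input("Where is my order?", [wants_refund], fake_clu))  # classifier called
```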

Although the feature is particularly valuable when the CLU classifier is used for intent classification, keep in mind that deferred intent classification also applies when the native Teneo Learn classifier is used; this means that there will not always be a class annotation to inspect on an input.

While deferred intent classification is the new standard, Tryout still lets you apply the previous, immediate classification to all inputs: select Forced Intent classification in the Tryout drop-down to simulate the previous behaviour. This can be useful when you want to test which classes an input could potentially be annotated with.

Forced Intent Classification in Tryout

Test Examples and Test Data Evaluation

We have already seen that Teneo 7.3 brings several changes in relation to machine learning. And there is yet another enhancement that will make working with machine learning in Teneo even better: Test Examples and the built-in Test Data Evaluation feature.

When you create your machine learned model in Teneo, whether it is based on CLU or Teneo Learn, you can now specify and/or link test data examples to the class and use those to validate the performance of the classes.

Linked Test Examples will automatically be fetched from the triggers and transitions that make use of the particular class in a class match.

Test Examples can also be added directly to the class in the Class Manager window: individually, by dragging and dropping the examples into the window, or by importing a class that contains both training and test examples.

The Test Data is used by the Class Optimization tool, Test Data Evaluation, to verify the performance of the classes. The tool tests all the examples against the current model and returns information about precision, recall and F1 score, as well as an overview of conflicting classes. The Test Data Evaluation results are insightful: they help you identify improvement points and, since you can compare the results with previous evaluations, keep track of how your model evolves over time.
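For reference, the per-class precision, recall and F1 figures can be computed as in the sketch below. The test-data format and the predict callable are assumptions made for illustration; inside Teneo Studio, Test Data Evaluation produces these numbers for you.

```python
# Illustrative per-class precision / recall / F1 computation; the data format
# and predict() callable are assumptions, not the Test Data Evaluation API.
from collections import Counter

def per_class_metrics(test_examples, predict):
    """test_examples: list of (text, expected_class); predict: text -> predicted class."""
    tp, fp, fn = Counter(), Counter(), Counter()
    for text, expected in test_examples:
        predicted = predict(text)
        if predicted == expected:
            tp[expected] += 1
        else:
            fp[predicted] += 1
            fn[expected] += 1

    metrics = {}
    for cls in set(tp) | set(fp) | set(fn):
        precision = tp[cls] / (tp[cls] + fp[cls]) if tp[cls] + fp[cls] else 0.0
        recall = tp[cls] / (tp[cls] + fn[cls]) if tp[cls] + fn[cls] else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        metrics[cls] = {"precision": precision, "recall": recall, "f1": f1}
    return metrics

# Example usage with a toy rule-based "model":
examples = [("where is my order", "ORDER_STATUS"), ("i want a refund", "REFUND")]
print(per_class_metrics(examples, lambda t: "REFUND" if "refund" in t else "ORDER_STATUS"))
```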

Teneo API Versioning

The last 7.3 update that we want to highlight in this article is the new Teneo API Versioning. Teneo comes with an extensive set of REST API endpoints that you can use to automate different tasks in your project – from running the Auto-test in Teneo Studio to extracting all your conversational logs from Teneo Inquire.

By adding versioning to the API endpoints, we are setting the stage for friction-free platform upgrades in the future, in case breaking changes to the API calls are introduced. Going forward, Teneo will support the current API version (X) and the previous one (X-1), which means that you have time to upgrade to, test, and release your system with the latest version of the API endpoint before the next platform upgrade is rolled out.

Using the version-specific API calls for business-critical tasks is particularly recommended, as you have the certainty that the API call continues to work after an upgrade. It could be that for your project it is critical to do an hourly data extraction from Inquire to populate a dashboard, or that you need to do automated publications from Teneo Studio; for those cases you would want to use the version-specific API endpoints going forward.

Version-agnostic API calls are also still supported, and they are recommended for cases where there will be no impact on the business if an API call stops working after an upgrade.
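To illustrate the difference, the sketch below contrasts a version-specific and a version-agnostic call. The host and endpoint paths are placeholders, not the documented Teneo endpoints; consult the API reference on Teneo Developers for the actual URLs and authentication details.

```python
# Hypothetical example of pinning an API call to a version; the host, paths and
# parameters below are placeholders, not documented Teneo endpoints.
import requests

BASE_URL = "https://teneo.example.com"  # placeholder host

def extract_logs(versioned: bool = True):
    # Version-specific path: pinned, so it keeps working across upgrades (X and X-1).
    # Version-agnostic path: always follows the latest behaviour and may break.
    path = "/api/v1/logs/query" if versioned else "/api/logs/query"
    response = requests.get(BASE_URL + path, params={"since": "1h"}, timeout=30)
    response.raise_for_status()
    return response.json()
```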

Final Thoughts

This article is an introduction to some of the new features and functionalities in Teneo 7.3. We have zoomed in on features related to LLMs and GenAI, with the new Teneo Generative QnA template solution, and on the native integration of CLU in Teneo Studio, both of which enable developers to boost the natural language understanding in their projects with advanced intent recognition methods. Furthermore, in the case of the Teneo Generative QnA setup with GenAI, automatic answer generation based on a custom knowledge base is enabled.

We have also seen how Deferred Intent Classification and Test Data Evaluation allow you to work smarter with your classifiers while keeping the cost of CLU usage under control.

Finally, the introduction of API versioning is a step towards reducing the risk of breaking API calls after platform updates.

The list of updates is longer, and we encourage you to read through the release notes for a full overview and for complete details of how all the new features work.

Let us know in the comments what you think of the new features. We look forward to hearing from you!
