
Read our interview with Mathieu Changeat, co-founder of Dydu, as he discusses the challenges of AI and the strategy adopted by Dydu.
How is generative AI influencing Dydu’s conversational AI solutions in 2024?
Generative AI has been around for several years, but it gained a lot of traction with the emergence of ChatGPT at the end of 2022. The chatbot seemed to understand everything and provide surprisingly realistic answers. Many studies have since revealed biases, errors, and omissions in ChatGPT's responses and in language models in general. But first impressions stick. Today, our clients and prospects expect to spend less time than before setting up their chatbot. We are therefore integrating LLMs into our solution to help our chatbots and callbots understand users' questions even better, with little administrative work, while still guaranteeing reliable answers.
How does Dydu ensure data security?
We designed our integrations to use any LLM: hosted in the cloud (GPT in Microsoft Azure, Mixtral at mistral.ai or vercel.ai, etc.) or hosted on OVH servers maintained by Dydu (Llama2).
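A provider-agnostic integration like the one described above is typically built around a common interface that each LLM back end implements. Dydu's actual integration layer is not public, so the sketch below is purely illustrative: the class names (`LLMProvider`, `EchoProvider`) and the `complete` method are assumptions, and the stand-in provider avoids any network call so the example is self-contained.

```python
from abc import ABC, abstractmethod


class LLMProvider(ABC):
    """Common interface so the bot engine can swap LLM back ends
    (cloud-hosted or self-hosted) without changing the calling code."""

    @abstractmethod
    def complete(self, prompt: str) -> str:
        ...


class EchoProvider(LLMProvider):
    """Stand-in provider used here so the sketch runs offline."""

    def complete(self, prompt: str) -> str:
        return f"[echo] {prompt}"


def answer(provider: LLMProvider, question: str) -> str:
    """The bot engine depends only on the interface, not on a vendor."""
    return provider.complete(question)
```

In a real deployment, one subclass per vendor (Azure-hosted GPT, Mistral, a self-hosted Llama 2, etc.) would wrap that vendor's API behind the same `complete` signature.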
Our minimum requirements include:
- A GDPR-compliant cloud solution (not yet the case for OpenAI)
- Europe-based servers
When Dydu provides the hosting, we ensure GDPR compliance and servers in France, even for health data (HDS). No data leaves the Dydu infrastructure, nor is it used to train hosted models.
Can you skip the knowledge base construction process?
Language models such as GPT-4 are known to “hallucinate,” i.e., provide made-up, and therefore false, answers, even with on the order of a trillion parameters. We recommend a hybrid approach: part of the bot’s knowledge is created manually and managed in the knowledge base of the Bot Management System, while the other part draws on the client’s existing intranet or website documents. These documents also help the language model better interpret questions that match the manually created knowledge, so the bot can still deliver an answer from the knowledge tree.
These documents should not be used to answer all questions, though. We recommend managing the following topics in the Dydu knowledge base:
- Sensitive topics
- Topics requiring escalation to another channel, depending on the theme in question
- Topics requiring a connection to an API (Application Programming Interface) on the client’s information system (IS)
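The routing logic behind such a hybrid approach can be sketched very simply: check for sensitive topics first, then the curated knowledge base, and only fall back to the LLM for everything else. The keyword lists, entries, and return values below are all hypothetical placeholders, not Dydu's actual rules, and real systems would match intents with a language model rather than with substring tests.

```python
# Hypothetical curated knowledge base (normally managed in a BMS).
KNOWLEDGE_BASE = {
    "opening hours": "We are open 9am-6pm, Monday to Friday.",
}

# Hypothetical sensitive topics that must never go to the LLM.
SENSITIVE_KEYWORDS = {"medical", "password", "payment"}


def route(question: str) -> str:
    """Decide how to answer: escalate, curated answer, or LLM fallback."""
    q = question.lower()
    # 1. Sensitive topics are escalated to another channel.
    if any(keyword in q for keyword in SENSITIVE_KEYWORDS):
        return "escalate_to_human"
    # 2. Curated knowledge base answers take priority over the LLM.
    for key, curated_answer in KNOWLEDGE_BASE.items():
        if key in q:
            return curated_answer
    # 3. Everything else may be handled by the (costlier) LLM.
    return "fallback_to_llm"
```

This ordering also addresses the cost point below: the expensive LLM call is the last resort, not the default.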
There are also financial and environmental factors to consider: a request sent to a language model consumes more energy and costs more than one handled by a Dydu chatbot.
LLMs are currently unsuitable for callbots. They take several seconds to provide an answer, which then has to be vocalized, making the final conversational experience unnatural.
Why do LLMs not always answer the same question in the same way?
The model is based on neural networks using stochastic methods, which means it can produce slightly different results each time. What’s more, the answer may depend on the context of the conversation and the exact wording of the question. Answer variability is therefore normal with this type of model.
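One concrete source of this variability is temperature-based sampling: instead of always picking the most likely next token, the model samples from a softmax distribution over its output scores. The sketch below illustrates the mechanism on a toy list of logits; it is a generic illustration of softmax sampling, not the sampler of any particular LLM.

```python
import math
import random


def sample_token(logits, temperature=0.8, rng=None):
    """Sample an index from softmax(logits / temperature).

    A higher temperature flattens the distribution, so repeated
    calls can return different tokens; a temperature near zero
    makes the choice effectively deterministic (the argmax).
    """
    rng = rng or random.Random()
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw from the categorical distribution defined by probs.
    r = rng.random()
    cumulative = 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if r < cumulative:
            return i
    return len(probs) - 1
```

With a very low temperature the same index wins every time; at higher temperatures the same prompt can yield different continuations, which is exactly the answer variability described above.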
What would you recommend to companies looking to adapt to the challenges of AI?
Remain cautious about the use cases you want to implement and start with tests. I think AI can be very useful for various day-to-day tasks, but don’t overlook its biases and imperfections, or the value of human know-how.