LLM-Chatbots: Best Practices

Chatbots and voicebots have undergone a significant change in quality and implementation methods through the integration of Large Language Models (LLMs). In the past, companies had to laboriously predefine every question and answer, which made development lengthy and inflexible. The user experience was usually disappointing as well, since the predefined answers were generic and not very user-centered. Today, LLMs and methods such as Retrieval-Augmented Generation (RAG) make it possible to set up chatbots and voicebots quickly and efficiently so that they can communicate in a specific, audience-oriented way. This simplifies development and implementation, as the bots can now respond dynamically to a wide range of requests without every possible conversation scenario having to be programmed in advance. At the same time, it has taken the customer experience to a new level: the answers have become far more accurate and individual.

This article presents a selection of LLM chatbots. The list is continually being expanded, so it is worth checking back regularly.


A quick recap

What is an LLM chatbot?

LLM chatbots, or Large Language Model chatbots, are advanced AI systems that use Generative AI to understand and generate human language. These intelligent chatbots are based on large language models such as GPT-4 or other open source models that have been trained with enormous amounts of text data to develop a deep understanding of context, syntax and semantics. This advanced language processing allows LLM chatbots to perform a variety of tasks, from answering questions and creating content to automating customer support.

Methods such as Retrieval-Augmented Generation (RAG) play an important role for LLM chatbots. RAG combines a retrieval system, which fetches relevant documents or information from a database, with the generation capability of a large language model. This enables LLM chatbots not only to respond based on the trained model, but also to integrate specific, contextual information from the company’s own sources in order to generate more precise and better-informed answers. RAG therefore significantly extends the functionality of LLM chatbots by allowing companies to customize the chatbot’s knowledge. Companies can even specify that the LLM chatbot may only access the content they provide. This ensures that the bot does not draw on unwanted or incorrect information.
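The RAG pattern described above can be sketched in a few lines. This is a deliberately minimal illustration, not any vendor’s implementation: the keyword-overlap scoring stands in for the embedding-based retrieval real systems use, and all names and documents are made up.

```python
# Minimal sketch of the RAG pattern: retrieve the most relevant company
# documents for a question, then build a prompt that instructs the language
# model to answer ONLY from those documents.

def score(question: str, document: str) -> int:
    """Naive relevance score: count of shared words (real systems use embeddings)."""
    return len(set(question.lower().split()) & set(document.lower().split()))

def retrieve(question: str, documents: list[str], top_k: int = 2) -> list[str]:
    """Return the top_k documents most relevant to the question."""
    ranked = sorted(documents, key=lambda d: score(question, d), reverse=True)
    return ranked[:top_k]

def build_prompt(question: str, documents: list[str]) -> str:
    """Restrict the model to the retrieved company content, as described above."""
    context = "\n".join(f"- {d}" for d in retrieve(question, documents))
    return (
        "Answer using ONLY the sources below. "
        "If they do not contain the answer, say so.\n"
        f"Sources:\n{context}\n\nQuestion: {question}"
    )

docs = [
    "Household insurance covers damage to your furniture and belongings.",
    "Our opening hours are Monday to Friday, 8 am to 5 pm.",
    "Liability insurance covers damage you cause to third parties.",
]
prompt = build_prompt("What does household insurance cover?", docs)
```

The final prompt would then be sent to the language model; because the instructions limit the model to the retrieved sources, the bot stays within the content the company has provided.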


Clara from Helvetia Insurance Switzerland

The first LLM chatbot from Switzerland is called Clara and comes from Helvetia Insurance. It was developed and launched as a pilot project at the beginning of 2023. During this phase, the chatbot drew only on the knowledge of the insurance website and used OpenAI’s GPT-3 language model to answer customers’ and prospects’ questions about insurance. You can find more details about the Helvetia experiment in my experience report.

In further iterations, the insurance company has supplemented the bot’s knowledge with additional internal connections. As of July 2024, the chatbot also uses selected rule-based chat flows that originate from the earlier intent-based chatbot project. These are primarily triggered when certain insurance processes, such as a claims process, need to be run through. In the future, however, these flows are to be replaced by LLMs and RAG.

As can be seen in the pictures, the chatbot occasionally uses emojis despite the rather conservative insurance industry and adapts its tonality to that of its users.

More background information on the LLM chatbot at Helvetia can be found in my interview with Florian Nägele on LLM chatbots in the insurance industry.

Lou from the Luzerner Kantonalbank

Luzerner Kantonalbank originally had a rule-based chatbot, which is now being converted step by step into an LLM chatbot. The bank is making the changeover topic by topic: in a first step, only the rule-based flows relating to the Twint payment service were supplemented by the LLM. As of July 2024, the chatbot has no interfaces to internal systems, and customers are asked not to enter any personal data into the chatbot in order to avoid data protection risks.

The Migros Bank digital assistant

Migros Bank’s LLM-based chatbot can be found via the contact page. Like most LLM chatbots, the bot starts with comprehensive information on data protection and then asks the user how it can help. Interestingly, the LLM chatbot seems to be trained to keep the conversation going and ends most of its replies with a follow-up question to the user. For example, if a user asks about an account product, the chatbot first provides information on account products and then tries to find out more about the user’s needs in order to give even better advice.

The digital assistant of Suva accident insurance

Suva Accident Insurance has offered an LLM chatbot on its website since the end of 2023. In this initial phase, the chatbot can only answer general questions, most of which it obtains from the Suva website. As of July 2024, the LLM chatbot has a rather conservative tone of voice that hardly adapts to the individual user. Suva’s LLM chatbot uses the OpenAI language model.

Klarna’s chatbot speaks over 35 languages

The financial company Klarna announced its partnership with OpenAI in 2023 and published its LLM chatbot at the same time. The chatbot was recognized by the media as a very successful project right from the start: it handled 2.3 million conversations in its first month, accounting for two thirds of all customer service chats. Almost from the outset, it has been able to do the work of 700 full-time employees while keeping customer satisfaction at the same level as human agents. Its answers are more precise, which has led to a 25 percent reduction in follow-up queries, and customers receive answers in less than 2 minutes, compared to 11 minutes previously. The chatbot is available around the clock in 23 markets and can communicate in more than 35 languages. The introduction of the LLM chatbot is expected to increase Klarna’s profits by USD 40 million in 2024.

If customers have authenticated themselves or are logged in, the LLM chatbot can even answer individual questions about account balances, etc.

Jumbot from Jumbo Baumarkt

The Swiss DIY and hobby store Jumbo launched an LLM chatbot in mid-2023 to advise website visitors. The bot acts as a product advisor and is available via the website. Customers can ask questions about product details or request product recommendations, and the chatbot responds based on its own knowledge base, which was compiled by the Jumbo-Digital team and essentially comprises the website content plus further product detail documents. The first goal of the LLM chatbot was to improve the customer experience on the website by letting visitors get advice around the clock and feel more confident in their purchase decisions. This goal has been achieved, and the chatbot is now to be given more functionality in further iterations.

As of July 2024, the LLM bot usually gives very long and extensive answers. LLMs tend to answer in great detail, which can sometimes impair the user experience; this is to be optimized in further iterations. Read more about the JumBot chatbot in my article on LLM bots in retail.

The Shopping Assistant from Zalando

The online store Zalando has been offering an LLM-based shopping assistant since 2024. However, the assistant can only be used with an existing customer account: customers first have to log in and then receive AI-supported shopping advice. Zalando promises that customers can use the Fashion Assistant to search for products in their own words and thus filter the range more precisely. They also receive personalized advice to help them find the right look from the wide range of products. Ultimately, better advice should of course lead to more sales and fewer returns.

Julia, the assistant from Weltbild.de

First of all, it should be mentioned that Weltbild uses a combination of an LLM chatbot and a rule-based bot. The chat starts with a short greeting, and the user can choose between different topics. Only when none of the topics fits does the chatbot announce that it is now using its LLM and working with the GPT language model.

The LLM chatbot then mainly uses the website as a source and gives the user concise yet informative answers. It also always displays its sources.

Digi from Migros Do It Garden

Digi presents itself to customers as a digital assistant for product advice, and as of summer 2024 this matches its actual range of functions very closely. When users ask the LLM chatbot a question about products, they usually receive a fairly comprehensive and precise answer, and the chatbot even understands follow-up questions relating to the suggested products. However, when users ask about delivery conditions and the like, it struggles and quickly refers them to human customer advisors. Users can also select “Video advice” in the chat, which connects them to a human employee.

Code from GS1

GS1 is responsible for issuing barcodes in Switzerland. Although practically everyone uses barcodes, at least as a consumer, there are many customer inquiries about registering and using them. GS1 introduced an LLM-based chatbot on its website at the beginning of 2024. The chatbot was implemented with the Microsoft Bot Framework and uses GPT as its language model.

In the first phase, the bot can only answer questions; it cannot run through any processes. For better traceability, the LLM bot always shows its sources, and users can open the full source directly from the chat if required.

ZüriCityGPT

ZüriCityGPT is, as the name suggests, a kind of ChatGPT trained on all publicly available data from the city of Zurich. Its interface resembles ChatGPT more than a classic chatbot. The bot is currently unable to understand follow-up questions or answer in-depth questions: it only ever answers one question at a time, after which the user has to ask the next one. The bot can be found via this separate landing page.

Hi, the AI helper from Nachhaltigleben.ch

The sustainability magazine nachhaltigleben.ch offers its readers an LLM chatbot. The chatbot uses most of the information on the website and combines it with its language model. The LLM chatbot uses a conspicuous number of emojis and, after each answer, offers users follow-up questions matching their original question.

Below the input field there is a note that users should not enter any personal information in the chat and must assess the accuracy of the answer themselves.

Flurina from the Rhaetian Railway

The Rhaetian Railway launched its first ever chatbot in mid-2023. The chatbot has used LLM and RAG methods from the start and mainly accesses the content of the website. The language model used is OpenAI’s.

The chatbot is trilingual from the outset and asks users which language they prefer at the start of the conversation. In principle, querying the language would not be necessary with LLMs, which are largely language-independent. It becomes challenging, however, when the website is also trilingual and its content is not identical in every language. In this case, it is worth asking users for their preferred language at the beginning so that the LLM bot can select the correct data sources as the basis for its answers.

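The idea of selecting data sources by language can be sketched as follows. The source lists, file names, and language codes are purely illustrative, not the Rhaetian Railway’s actual setup; the point is only that the user’s stated preference decides which retrieval sources the bot uses.

```python
# Sketch: the user's chosen language selects the set of website sources
# the bot retrieves from. Not every page exists in every language, so a
# fallback to the primary site language is needed.

SOURCES_BY_LANGUAGE = {
    "de": ["timetable_de.html", "tickets_de.html"],
    "it": ["timetable_it.html"],  # example of a language with fewer pages
    "en": ["timetable_en.html", "tickets_en.html"],
}

def sources_for(language: str) -> list[str]:
    """Pick retrieval sources for the preferred language; fall back to
    German (assumed here to be the primary site language)."""
    return SOURCES_BY_LANGUAGE.get(language, SOURCES_BY_LANGUAGE["de"])
```

Asking for the language once at the start of the conversation keeps this lookup simple and avoids mixing answers from differently maintained language versions of the site.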

The LLM chatbot from BLT Baselland Transport AG

Baselland Transport AG has also had an LLM chatbot on its website since mid-2024. As of mid-2024, the chatbot works both rule-based and with the help of language models and generative AI. At the beginning of the dialog, the chatbot asks the user whether they would like to select a fixed topic or ask their own question. If the user chooses their own question, the language model and the stored knowledge database are used to generate the answer; otherwise, the bot runs through a rule-based process. To avoid confusing questions that refer back to the previous context with entirely new ones, the chatbot asks after each answer whether the user has a new question or a follow-up question.
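The new-question-versus-follow-up distinction boils down to whether the previous dialog turns are passed to the language model. A minimal sketch of this routing, with illustrative names rather than BLT’s actual implementation:

```python
# Sketch: the user's explicit choice ("new question" vs "follow-up")
# decides whether the earlier dialog history is sent to the model.

def build_context(history: list[str], message: str, is_follow_up: bool) -> list[str]:
    """Return the messages passed to the model: with history for follow-ups,
    without it for new questions, so old context cannot distort the answer."""
    if is_follow_up:
        return history + [message]
    return [message]  # fresh start: earlier turns are deliberately dropped
```

Asking the user explicitly sidesteps the harder problem of automatically detecting whether a short message like “And on Sundays?” refers to the previous answer or starts a new topic.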

Sophia the digital assistant from Pro Senectute beider Basel

Pro Senectute beider Basel published its first LLM chatbot in mid-2024. For the launch, the NGO did not integrate the chatbot directly into its classic website, but created a separate landing page on a subdomain for the LLM chatbot. Interested parties can use the chatbot directly via the chat window on the right-hand side of the screen. The page also explains exactly what the chatbot can and cannot do and provides further information on data protection and other contact options. The subpage with the LLM chatbot of Pro Senectute beider Basel can be visited here.

Are you also interested in LLM bots? I advise many companies on the use of LLM chatbots and voicebots. I would be happy to accompany your LLM bot project and help you find the right technology. Of course, I can also take a look at your current chatbot and give you structured feedback.

Send me a message via WhatsApp or e-mail.

Book now
Your personal consultation

Do you need support or have questions? Then simply make an appointment with me and get a personal consultation. I look forward to hearing from you!

> Concept & Strategy

> Keynotes, workshops and expert contributions

> Chatbots, Voicebots, ChatGPT
