Configuring Base LLM
watsonx Assistant includes a range of pre-built AI models and algorithms that can be used to generate text, images, and other types of content. These models can be customized and combined in various ways to create more sophisticated and powerful generative AI applications.
The Base large language model (LLM) section on the Generative AI page helps you configure large language models for your assistants. LLMs enable your customers to interact with your assistants seamlessly, without custom-built conversational steps. You can enable Base LLM features for the existing actions in your assistants to improve their conversation capability.
You can perform the following tasks in the Base LLM configuration:
Selecting a large language model for your assistant
To select the LLM that suits your enterprise ecosystem, do the following steps:
1. Go to Home > Generative AI.
2. In the Base large language model (LLM) section, select the large language model from the Select a model dropdown.
LLM description table
The following table shows the list of LLMs supported by watsonx Assistant.
| LLM model | Description |
|---|---|
| ibm/granite-13b-chat-v2 (Withdrawn) | The Granite 13 Billion Chat V2 (granite-13b-chat-v2) model is the chat-focused variant initialized from the pre-trained Granite 13 Billion Base V2 (granite-13b-base-v2) model. granite-13b-base-v2 is trained on over 2.5T tokens. IBM Generative AI Large Language Foundation Models are enterprise-level English-language models trained with a large volume of data that is subjected to intensive pre-processing and careful analysis. |
| ibm/granite-13b-instruct-v2 | The Granite 13 Billion Instruct V2 (granite-13b-instruct-v2) model is the instruction-tuned variant initialized from the pre-trained Granite 13 Billion Base V2 (granite-13b-base-v2) model. granite-13b-base-v2 is trained on over 2.5T tokens. IBM Generative AI Large Language Foundation Models are enterprise-level English-language models trained with a large volume of data that is subjected to intensive pre-processing and careful analysis. |
| ibm/granite-3-8b-instruct | Granite-3.0-8B-Instruct is an 8B parameter model fine-tuned from Granite-3.0-8B-Base using a combination of open-source instruction datasets with permissive licenses and internally collected synthetic datasets. The model is designed to respond to general instructions and can be used to build assistants for multiple domains, including business applications. |
| ibm/granite-3-2b-instruct | Granite-3.0-2B-Instruct is a lightweight, open-source 2B parameter model fine-tuned from Granite-3.0-2B-Base on a combination of open-source and proprietary instruction data with permissive licenses. The model is designed to respond to general instructions and can be used to build assistants for multiple domains, including business applications. |
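As an illustrative sketch only, the model IDs from the table above could be validated before being applied to an assistant's configuration. The helper and the configuration shape below are hypothetical, not part of any watsonx Assistant SDK:

```python
# Hypothetical helper: the model IDs come from the table above, but the
# function name and config shape are assumptions for illustration only.
SUPPORTED_BASE_LLMS = {
    "ibm/granite-13b-chat-v2",      # withdrawn
    "ibm/granite-13b-instruct-v2",
    "ibm/granite-3-8b-instruct",
    "ibm/granite-3-2b-instruct",
}

def select_base_llm(model_id: str) -> dict:
    """Return a minimal base-LLM config fragment, rejecting unknown models."""
    if model_id not in SUPPORTED_BASE_LLMS:
        raise ValueError(f"Unsupported base LLM: {model_id}")
    return {"base_llm": {"model": model_id}}
```

Validating the model ID up front mirrors the dropdown in the UI, which only offers the supported models.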
Adding prompt instructions
You can instruct the LLM in your assistant to give refined responses by adding prompt instructions. Prompt instructions help the LLM guide conversations with clarity and specificity to achieve the end goal of an action. Follow these steps to add a prompt instruction:
1. Go to Home > Generative AI.
2. In the Add prompt instructions section, click Add instructions to see the Prompt instruction field.
3. Enter the prompt instructions in the Prompt instruction field.
   The maximum number of characters that you can enter in the Prompt instruction field is 1,000.
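The 1,000-character limit can be checked before submitting instructions. This is a minimal sketch; the function name and config shape are hypothetical, assuming only the limit stated above:

```python
MAX_PROMPT_INSTRUCTION_CHARS = 1000  # limit stated in the docs above

def add_prompt_instructions(config: dict, instructions: str) -> dict:
    """Hypothetical helper: attach prompt instructions, enforcing the limit."""
    if len(instructions) > MAX_PROMPT_INSTRUCTION_CHARS:
        raise ValueError(
            f"Prompt instructions exceed {MAX_PROMPT_INSTRUCTION_CHARS} characters"
        )
    updated = dict(config)  # leave the original config untouched
    updated["prompt_instructions"] = instructions.strip()
    return updated
```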
Selecting the answering behavior of your assistant
You can configure the answering behavior of your assistant to provide responses that are based on preloaded content. The answering behavior that you can configure is:
- Conversational search
Only Plus or Enterprise plans support Conversational search. Starting from June 1, 2024, add-on charges apply for using the Conversational search feature in addition to your Plus or Enterprise plan. For more information about the pricing plans, see Pricing plans. For more information about terms, see Terms.
In conversational search, the LLM uses content that is preloaded during the search integration to respond to customer queries.
To use conversational search, you must configure a search integration and enable conversational search.
To use conversational search in a language other than English, change from the default model (Granite) to another model. Test to verify that you get reasonable results in your language, which can include Spanish, Brazilian Portuguese, French, or German.
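The requirements above can be sketched as a configuration check. Everything here is hypothetical (the field names and defaults are assumptions, not a watsonx Assistant API); it only encodes the rules stated in this section:

```python
def conversational_search_config(language: str = "en",
                                 model: str = "ibm/granite-13b-instruct-v2",
                                 search_integration_configured: bool = True) -> dict:
    """Hypothetical sketch of a conversational-search configuration.

    Encodes the rules above: a search integration must be configured, and
    for non-English languages the default Granite model should be swapped
    and the results tested.
    """
    if not search_integration_configured:
        raise ValueError("Conversational search requires a configured search integration")
    config = {
        "answering_behavior": "conversational_search",
        "model": model,
        "language": language,
    }
    if language != "en":
        # Docs recommend changing from the default Granite model and
        # verifying result quality in the target language.
        config["verify_results_in_language"] = True
    return config
```

For example, a Spanish deployment would pass `language="es"` together with a non-default model, then test the assistant's answers in Spanish.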