Groq
- Supported service: `llm`
- Key: `groq`
- Integrated: No. See BYO Keys for more details.
Service options
`model`
The model that will complete your prompt. Available models can be found here.
Configuration options
`model`
The model that will complete your prompt. Available models can be found here.
`extra`
A dictionary that can contain any additional parameters supported by Groq that you want to pass to the API. Refer to the Groq docs for more information on each of these configuration options.
Function calling
Many of Groq’s models, including their versions of Llama 3 and 3.1, support function calling using the OpenAI interface. For examples of how to use that approach, see the OpenAI service page.
Alternatively, Groq also supports Llama 3.1's native function calling interface. Llama 3.1's function calling documentation is located here. Llama doesn't use OpenAI-style function calling; instead, you add instructions to your system prompt that teach the LLM how to call functions. If you use the following recommended format, Daily Bots will be able to detect function call responses from the LLM, and you can follow the directions in the function calling tutorial.
First, define a `weatherTool` object, along with any other functions you want to call:
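A minimal sketch of such a definition, using a JSON-schema-style shape for the tool. The function name, description, and parameters here are illustrative assumptions; substitute the functions your bot actually exposes.

```javascript
// Hypothetical tool definition. The name, description, and parameter
// schema are placeholders, not values required by Groq or Llama 3.1.
const weatherTool = {
  name: "get_current_weather",
  description: "Get the current weather for a given location",
  parameters: {
    type: "object",
    properties: {
      location: {
        type: "string",
        description: "The city and state, e.g. San Francisco, CA",
      },
      format: {
        type: "string",
        enum: ["celsius", "fahrenheit"],
        description: "The temperature unit to return",
      },
    },
    required: ["location", "format"],
  },
};
```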
Then, reference that `weatherTool` object in your system prompt:
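A sketch of how the tool schema might be embedded in the prompt. The exact wording is an assumption loosely modeled on Llama 3.1's prompt-based tool calling format; consult Llama 3.1's function calling documentation for the canonical phrasing. The `weatherTool` object is repeated here so the snippet is self-contained.

```javascript
// Hypothetical weatherTool definition (same illustrative shape as above).
const weatherTool = {
  name: "get_current_weather",
  description: "Get the current weather for a given location",
  parameters: {
    type: "object",
    properties: {
      location: { type: "string", description: "The city and state" },
    },
    required: ["location"],
  },
};

// Embed the serialized tool schema in the system prompt and tell the
// model exactly how to format a function call so it can be parsed.
const systemPrompt = `You are a helpful assistant.

You have access to the following functions:

Use the function '${weatherTool.name}' to '${weatherTool.description}':
${JSON.stringify(weatherTool)}

If you choose to call a function, ONLY reply in the following format with no
prefix or suffix:

<function=example_function_name>{"example_name": "example_value"}</function>`;
```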