Daily Bots makes it easy to build function calling (also known as ‘tool calling’) into your app.

Understanding function calling

It’s helpful to start with an understanding of what function calling is. OpenAI, Anthropic, and Meta (for Llama 3.1) all publish good documentation on the topic, but here’s a quick overview.

Say you want to give your bot the ability to tell your users about the current weather. Current conditions obviously can’t be baked into the LLM’s training data, so you’ll need to get them directly from a weather API. But you can use the LLM to help you with this. The workflow looks like this:

  1. When your app starts, it provides the LLM with information about functions, or tools, the LLM can choose to use. This is typically sent with the initial system prompt, but it can be updated at any point in the bot session.
  2. The user asks a question that the LLM decides it needs to use a function to answer. Instead of responding with normal text, the LLM returns a function call describing how to use a function to supplement its knowledge.
  3. The LLM can’t actually call the function itself, so your app takes the information from the bot’s function call response, and, well, actually calls the function. Your app saves the result as a function result.
  4. Your app appends both the function call and function result to the LLM’s message history, and prompts the LLM to run another completion. The LLM sees the function call and result in the message history, and it realizes it now has the info it needs to answer the user’s question, so it generates a text response.
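
To make step 4 concrete, here’s roughly what the two appended messages could look like in an OpenAI-style message array. This is a sketch; Anthropic and Llama 3.1 use different field names, but the shape is similar:

// What your app appends to the message history in step 4
const appendedMessages = [
  {
    // The function call the LLM returned in step 2
    role: "assistant",
    tool_calls: [
      {
        id: "call_abc123",
        type: "function",
        function: {
          name: "get_weather",
          arguments: '{"location": "San Francisco, CA", "format": "fahrenheit"}',
        },
      },
    ],
  },
  {
    // The function result your app produced in step 3
    role: "tool",
    tool_call_id: "call_abc123",
    content: '{"conditions": "Partly cloudy", "temperature": 62}',
  },
];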

Adding it to your bot

Daily Bots manages most of this for you. You’ll need to do two things.

First, define the function(s) you want the bot to be able to call. The specific format of this object varies depending on which LLM you’re using. Here’s an example of a get_weather function in Anthropic’s format (an OpenAI-style sketch follows below):

rtvi.config.ts
export const defaultConfig = [
  {
    service: "llm",
    options: [
      { name: "model", value: "claude-3-5-sonnet-20240620" },
      {
        name: "initial_messages",
        value: [
          {
            role: "system",
            content: "You are a TV weatherman named Wally (...)",
          },
        ],
      },
      {
        name: "tools",
        value: [
          {
            type: "function",
            function: {
              name: "get_current_weather",
              description:
                "Get the current weather for a location. This includes the conditions as well as the temperature.",
              input_schema: {
                type: "object",
                properties: {
                  location: {
                    type: "string",
                    description: "The city and state, e.g. San Francisco, CA",
                  },
                  format: {
                    type: "string",
                    enum: ["celsius", "fahrenheit"],
                    description:
                      "The temperature unit to use. Infer this from the users location.",
                  },
                },
                required: ["location", "format"],
              },
            },
          },
        ],
      },
    ],
  },
];
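
If you’re using an OpenAI model (or an OpenAI-compatible host for Llama 3.1), the same function is typically described with OpenAI’s tool schema, which puts the JSON Schema under parameters instead of input_schema. Here’s a sketch of what the tools option in the config above might look like instead; check the Daily Bots reference docs for the exact shape each provider expects:

{
  name: "tools",
  value: [
    {
      type: "function",
      function: {
        name: "get_weather",
        description:
          "Get the current weather for a location. This includes the conditions as well as the temperature.",
        parameters: {
          type: "object",
          properties: {
            location: {
              type: "string",
              description: "The city and state, e.g. San Francisco, CA",
            },
            format: {
              type: "string",
              enum: ["celsius", "fahrenheit"],
              description:
                "The temperature unit to use. Infer this from the user's location.",
            },
          },
          required: ["location", "format"],
        },
      },
    },
  ],
}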

Daily Bots detects when the LLM returns a function call and passes it to your app, so you’ll need to register a handler with handleFunctionCall. The best place to do that is in app/page.tsx, right after the llmHelper is created. If you haven’t created an llmHelper yet, registering one looks roughly like this (a sketch assuming the realtime-ai SDK; the exact imports and client setup depend on the SDK version you’re using):
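
app/page.tsx
import { LLMHelper } from "realtime-ai";

// `voiceClient` is whatever Daily Bots / RTVI client instance your app already
// creates. Register an LLM helper on it so we can intercept function calls.
const llmHelper = voiceClient.registerHelper(
  "llm",
  new LLMHelper({ callbacks: {} })
) as LLMHelper;

With the helper registered, attach the function call handler right after: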

app/page.tsx
llmHelper.handleFunctionCall(async (fn: FunctionCallParams) => {
  const args = fn.arguments as any;
  if (fn.functionName === "get_weather" && args.location) {
    // Call our own API route, which in turn calls a weather service.
    const response = await fetch(
      `/api/weather?location=${encodeURIComponent(args.location)}`
    );
    const json = await response.json();
    // Clear the fetching state used elsewhere in the app's UI.
    setFetchingWeather(false);
    return json;
  } else {
    setFetchingWeather(false);
    return { error: "couldn't fetch weather" };
  }
});
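
The handler above fetches a local /api/weather route. If your project doesn’t already have one, here’s a minimal stub (assuming a Next.js App Router project) that returns canned data so you can test the flow end to end before wiring up a real weather provider:

app/api/weather/route.ts
import { NextResponse } from "next/server";

export async function GET(request: Request) {
  const { searchParams } = new URL(request.url);
  const location = searchParams.get("location");

  if (!location) {
    return NextResponse.json({ error: "missing location" }, { status: 400 });
  }

  // Placeholder data; replace with a call to a real weather API.
  return NextResponse.json({
    location,
    conditions: "Partly cloudy",
    temperature: 62,
    unit: "fahrenheit",
  });
}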

If your handler returns anything other than null, the return value is treated as the function result. Daily Bots adds both the function call and the result to the bot’s message history (the tool call and tool result messages sketched earlier), and then re-prompts the LLM so the bot can generate a voice response.

If you return null from your handler, Daily Bots essentially ignores the function call. This can be useful for letting the LLM send ‘signals’ to your app, for example to indicate that part of a conversation is complete. More documentation on this behavior is coming soon.
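
For instance, you could define a hypothetical conversation_complete function in your tools list and have the LLM call it when the user is done. The handler just updates app state and returns null (conversation_complete and setConversationComplete are illustrative, not part of Daily Bots):

llmHelper.handleFunctionCall(async (fn: FunctionCallParams) => {
  if (fn.functionName === "conversation_complete") {
    // A 'signal' function: react in the app, then return null so Daily Bots
    // ignores the call instead of treating it as a function result.
    setConversationComplete(true);
    return null;
  }
  // ...handle other functions, like get_weather above
});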