• Supported service: llm
  • Key: anthropic
  • Integrated: Yes

Service options

model
string
default: "claude-3-5-sonnet-20241022"

The model that will complete your prompt. Supported models are:

  • claude-3-5-sonnet-20241022
  • claude-3-5-sonnet-20240620
  • claude-3-5-sonnet-latest
  • claude-3-5-haiku-20241022
  • claude-3-5-haiku-latest

{
  "service_options": {
    "anthropic": {
      "model": "claude-3-5-sonnet-latest"
    }
  }
}
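As a sketch, the service_options payload above can be assembled programmatically. The helper below is illustrative, not part of the Daily Bots API; the exact request body your deployment expects may differ, so treat this as an assumption to verify against your setup.

```typescript
// Hypothetical helper that builds the service_options block shown above.
// Only the JSON shape comes from this page; the helper name is illustrative.
type AnthropicServiceOptions = {
  model?: string;
};

function buildServiceOptions(model: string): { anthropic: AnthropicServiceOptions } {
  return { anthropic: { model } };
}

const serviceOptions = buildServiceOptions("claude-3-5-sonnet-latest");
// Produces: { "service_options": { "anthropic": { "model": "claude-3-5-sonnet-latest" } } }
console.log(JSON.stringify({ service_options: serviceOptions }));
```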

Configuration options

model
string
default: "claude-3-5-sonnet-20241022"

The model that will complete your prompt. Supported models are:

  • claude-3-5-sonnet-20241022
  • claude-3-5-sonnet-20240620
  • claude-3-5-sonnet-latest
  • claude-3-5-haiku-20241022
  • claude-3-5-haiku-latest

{
  "name": "model",
  "value": "claude-3-5-sonnet-latest"
}

max_tokens
integer
default: 4096

The maximum number of tokens to generate before stopping.

{
  "name": "max_tokens",
  "value": 2048
}

temperature
float
default: 1.0

Amount of randomness injected into the response.

Ranges from 0.0 to 1.0. Use a temperature closer to 0.0 for analytical or multiple-choice tasks, and closer to 1.0 for creative and generative tasks.

Note that even with a temperature of 0.0, the results will not be fully deterministic.

{
  "name": "temperature",
  "value": 0.9
}

top_k
integer

Only sample from the top K options for each subsequent token.

Used to remove “long tail” low probability responses. Learn more technical details here.

Recommended for advanced use cases only. You usually only need to use temperature.

{
  "name": "top_k",
  "value": 42
}

top_p
float

Use nucleus sampling.

In nucleus sampling, we compute the cumulative distribution over all the options for each subsequent token in decreasing probability order and cut it off once it reaches a particular probability specified by top_p. You should either alter temperature or top_p, but not both.

Recommended for advanced use cases only. You usually only need to use temperature.

{
  "name": "top_p",
  "value": 0.5
}

extra
object

An object that can contain any additional parameters supported by Anthropic that you want to pass through to the API. Refer to the Anthropic docs for more information on each of these configuration options.

{
  "name": "extra",
  "value": {
    "metadata": { "user_id": "user-123" },
    "stop_sequences": ["Stop", "Goodbye", "See you later"]
  }
}
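Taken together, the configuration options above fit into the same { service, options } shape used by the Function Calling example on this page. The sketch below combines the documented options into one LLM config entry; the specific values are the examples from this page, and the comment about temperature vs. top_p restates the guidance above.

```typescript
// Sketch: combine the configuration options documented above into a single
// config entry, in the { service, options } shape this page already uses.
type ConfigOption = { name: string; value: unknown };

const llmConfig: { service: string; options: ConfigOption[] } = {
  service: "llm",
  options: [
    { name: "model", value: "claude-3-5-sonnet-latest" },
    { name: "max_tokens", value: 2048 },
    // Alter either temperature or top_p, but not both.
    { name: "temperature", value: 0.9 },
    {
      name: "extra",
      value: { stop_sequences: ["Stop", "Goodbye", "See you later"] },
    },
  ],
};
```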

Function Calling

Anthropic’s function calling documentation is located here.

export const defaultConfig = [
  {
    service: "llm",
    options: [
      {
        name: "initial_messages",
        value: [
          {
            role: "user",
            content:
              "You are a TV weatherman named Dallas Storms. Your job is to present the weather to me. You can call the 'get_weather' function to get weather information. Start by asking me for my location. Then, use 'get_weather' to give me a forecast. Then, answer any questions I have about the weather. Keep your introduction and responses very brief. You don't need to tell me if you're going to call a function; just do it directly. Keep your words to a minimum. When you're delivering the forecast, you can use more words and personality.",
          },
        ],
      },
      { name: "run_on_config", value: true },
      {
        name: "tools",
        value: [
          {
            name: "get_weather",
            description:
              "Get the current weather for a location. This includes the conditions as well as the temperature.",
            input_schema: {
              type: "object",
              properties: {
                location: {
                  type: "string",
                  description:
                    "The user's location in the form 'city,state,country'. For example, if the user is in Austin, TX, use 'austin,tx,us'.",
                },
                format: {
                  type: "string",
                  enum: ["celsius", "fahrenheit"],
                  description:
                    "The temperature unit to use. Infer this from the user's location.",
                },
              },
              required: ["location", "format"],
            },
          },
        ],
      },
    ],
  },
];
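When the model decides to call get_weather, your application has to resolve the call and return a result. The dispatcher below is a sketch only: fetchWeather is a hypothetical stand-in for a real weather lookup, and the mechanism for returning the result to the bot depends on the client SDK you are using.

```typescript
// Sketch of resolving a 'get_weather' tool call from the config above.
// 'fetchWeather' is a hypothetical placeholder; replace it with a real
// weather API call in your application.
type WeatherArgs = {
  location: string;
  format: "celsius" | "fahrenheit";
};

async function fetchWeather(
  args: WeatherArgs
): Promise<{ conditions: string; temperature: number }> {
  // Placeholder result; a real implementation would query a weather service.
  return {
    conditions: "sunny",
    temperature: args.format === "celsius" ? 22 : 72,
  };
}

// Dispatch a tool call by name, matching the tool names declared in the config.
async function handleToolCall(name: string, args: unknown): Promise<unknown> {
  switch (name) {
    case "get_weather":
      return fetchWeather(args as WeatherArgs);
    default:
      throw new Error(`Unknown tool: ${name}`);
  }
}
```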

For more info on how to use function calling in Daily Bots, take a look at the tutorial page.