• Supported service: llm
  • Key: gemini
  • Integrated: No. See BYO Keys for more details.

Service options

model
string
default: "gemini-1.5-flash-latest"

The model that will complete your prompt. See the available Gemini models here.

{
  "service_options": {
    "gemini": {
      "model": "gemini-1.5-pro-latest"
    }
  }
}

Configuration options

model
string
default: "gemini-1.5-flash-latest"

The model that will complete your prompt. See the available Gemini models here.

{
  "name": "model",
  "value": "gemini-1.5-flash-latest"
}
max_tokens
integer
default: "4096"

The maximum number of tokens to generate before stopping. See the Gemini docs for more information.

{
  "name": "max_tokens",
  "value": 2048
}
temperature
float
default: "1.0"

Amount of randomness injected into the response.

Use a temperature closer to the lower end of the range for analytical / multiple-choice tasks, and closer to the higher end for creative and generative tasks.

Note that even with a temperature of 0.0, the results will not be fully deterministic.

See the Gemini docs for range information for each model.

{
  "name": "temperature",
  "value": 0.9
}
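
The effect of temperature can be illustrated with a small sketch (not Gemini's actual implementation): temperature rescales the model's token logits before they are turned into sampling probabilities, so lower values sharpen the distribution toward the top token and higher values flatten it.

```javascript
// Sketch only: temperature divides the logits before the softmax,
// so low temperatures sharpen the distribution and high ones flatten it.
function softmaxWithTemperature(logits, temperature) {
  const scaled = logits.map((l) => l / temperature);
  const max = Math.max(...scaled); // subtract the max for numerical stability
  const exps = scaled.map((s) => Math.exp(s - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}

const logits = [2.0, 1.0, 0.1];
console.log(softmaxWithTemperature(logits, 0.5)); // peaked: top token dominates
console.log(softmaxWithTemperature(logits, 2.0)); // flatter: more randomness
```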
top_k
integer

Only sample from the top K options for each subsequent token.

Used to remove “long tail” low-probability responses. Learn more technical details here.

Recommended for advanced use cases only. You usually only need to use temperature.

{
  "name": "top_k",
  "value": 42
}
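
The idea behind top_k can be sketched as follows (an illustration, not Gemini's implementation): keep only the K most probable tokens, renormalize their probabilities, and sample from that reduced set.

```javascript
// Sketch only: top_k keeps the K most probable tokens and renormalizes
// before sampling, discarding the low-probability "long tail".
function topKFilter(probs, k) {
  const ranked = probs
    .map((p, i) => ({ token: i, p }))
    .sort((a, b) => b.p - a.p)
    .slice(0, k); // keep only the K most probable tokens
  const sum = ranked.reduce((acc, t) => acc + t.p, 0);
  return ranked.map((t) => ({ token: t.token, p: t.p / sum }));
}

// With k = 2, the tail tokens are removed entirely.
console.log(topKFilter([0.5, 0.3, 0.15, 0.05], 2));
// → [ { token: 0, p: 0.625 }, { token: 1, p: 0.375 } ]
```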
top_p
float

Use nucleus sampling.

In nucleus sampling, the model computes the cumulative distribution over all the options for each subsequent token in decreasing probability order, and cuts it off once it reaches the cumulative probability specified by top_p. You should alter either temperature or top_p, but not both. See the Gemini docs for more information.

Recommended for advanced use cases only. You usually only need to use temperature.

{
  "name": "top_p",
  "value": 0.5
}
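
The nucleus-sampling cutoff described above can be sketched like this (an illustration, not Gemini's implementation): walk the tokens in decreasing probability order, keep them until their cumulative probability reaches top_p, then renormalize over the kept set.

```javascript
// Sketch only: nucleus (top_p) sampling keeps the smallest set of tokens
// whose cumulative probability reaches top_p, then renormalizes over it.
function topPFilter(probs, topP) {
  const ranked = probs
    .map((p, i) => ({ token: i, p }))
    .sort((a, b) => b.p - a.p);
  const kept = [];
  let cumulative = 0;
  for (const t of ranked) {
    kept.push(t);
    cumulative += t.p;
    if (cumulative >= topP) break; // nucleus reached: cut off the tail
  }
  const sum = kept.reduce((acc, t) => acc + t.p, 0);
  return kept.map((t) => ({ token: t.token, p: t.p / sum }));
}

// With top_p = 0.5, only the first token (p = 0.5) survives.
console.log(topPFilter([0.5, 0.3, 0.15, 0.05], 0.5));
// → [ { token: 0, p: 1 } ]
```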
extra
object

A dictionary that can contain any additional parameters supported by Gemini that you want to pass to the API. Refer to the Gemini reference docs for more information on each of these configuration options.

{
  "name": "extra",
  "value": {
    "temperature": 1.2,
    "max_tokens": 4000
  }
}

Function Calling

Gemini’s function calling documentation is located here.

export const defaultConfig = [
  {
    service: "llm",
    options: [
      {
        name: "initial_messages",
        value: [
          {
            role: "system",
            content:
              "You are a TV weatherman named Dallas Storms. Your job is to present the weather to me. Start by asking me for my location. Then, use 'get_weather_current' to give me the current conditions. If I ask for a forecast, use 'get_weather_forecast' to give me a forecast. Then, answer any questions I have about the weather. Keep your introduction and responses very brief. You don't need to tell me if you're going to call a function; just do it directly. Keep your words to a minimum. When you're delivering the forecast, you can use more words and personality. Your responses will be converted to audio.",
          },
        ],
      },
      {
        name: "run_on_config",
        value: true,
      },
      {
        name: "tools",
        value: {
          function_declarations: [
            {
              name: "get_weather_current",
              description:
                "Get the current weather for a location. This includes the conditions as well as the temperature.",
              parameters: {
                type: "object",
                properties: {
                  location: {
                    type: "string",
                    description:
                      "The user's location in the form 'city,state,country'. For example, if the user is in Austin, TX, use 'austin,tx,us'.",
                  },
                  format: {
                    type: "string",
                    enum: ["celsius", "fahrenheit"],
                    description:
                      "The temperature unit to use. Infer this from the user's location.",
                  },
                },
                required: ["location", "format"],
              },
            },
            {
              name: "get_weather_forecast",
              description:
                "Get the weather forecast for a location. This includes the conditions as well as the temperature.",
              parameters: {
                type: "object",
                properties: {
                  location: {
                    type: "string",
                    description:
                      "The user's location in the form 'city,state,country'. For example, if the user is in Austin, TX, use 'austin,tx,us'.",
                  },
                  format: {
                    type: "string",
                    enum: ["celsius", "fahrenheit"],
                    description:
                      "The temperature unit to use. Infer this from the user's location.",
                  },
                },
                required: ["location", "format"],
              },
            },
          ],
        },
      },
    ],
  },
];

For more info on how to use function calling in Daily Bots, take a look at the tutorial page.