Function Calling
Introduction
Some models support function calling (also known as tool calling), which lets them return structured function names and arguments in response to a prompt. This is ideal for integrating dynamic behaviors into your application.

To use this feature, define your available functions and pass them as an array in the `tools` parameter. When appropriate, the model will include a `tool_calls` array in its response, detailing which functions to call and with what arguments.

You can then parse the `tool_calls` data, invoke the corresponding functions in your code, and either return the results to the user or feed them into a follow-up LLM call for further processing.
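As a minimal sketch, here is how one entry of a `tool_calls` array can be parsed. The `raw_call` dict below is illustrative sample data (not a real API response); the field names follow the OpenAI-compatible shape used throughout this page. Note that `arguments` arrives as a JSON-encoded string, not a dict.

```python
import json

# Illustrative sample of one tool_calls entry; field names follow the
# OpenAI-compatible schema, but the values here are made up.
raw_call = {
    "id": "call_abc123",
    "type": "function",
    "function": {
        "name": "get_weather",
        # Arguments are delivered as a JSON string and must be decoded.
        "arguments": '{"latitude": 48.85, "longitude": 2.35}',
    },
}

function_name = raw_call["function"]["name"]
args = json.loads(raw_call["function"]["arguments"])
print(function_name, args["latitude"], args["longitude"])
```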
Supported Models
The following models currently support Function Calling:
llama3.1:8b-instruct-fp16
Full Example: A Multi-Turn Weather Bot
Let's walk through a complete example of how a model can use a real `get_weather` function. The process involves three main steps:

- First API Call: The user asks for the weather. The model receives the user's prompt and the available tool definitions, and responds with a request to call the `get_weather` function with the correct arguments.
- Execute Tool: Your code parses the model's response, extracts the function name and arguments, and executes your local `get_weather` function.
- Second API Call: Your code sends the result of the `get_weather` function back to the model. The model then uses this information to formulate a natural, human-readable answer.
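The steps above can be sketched as the conversation history they produce. The role values (`user`, `assistant`, `tool`) follow the OpenAI-compatible chat format; the IDs and values are illustrative only.

```python
# Step 1 result: the conversation starts with the user's question.
messages = [
    {"role": "user", "content": "What's the weather like in Paris, France today?"},
]

# After the first API call: the assistant's message carries the tool call.
messages.append({
    "role": "assistant",
    "content": None,
    "tool_calls": [{
        "id": "call_abc123",
        "type": "function",
        "function": {"name": "get_weather",
                     "arguments": '{"latitude": 48.85, "longitude": 2.35}'},
    }],
})

# After executing the tool: its result, linked back via tool_call_id,
# is appended before the second API call.
messages.append({
    "role": "tool",
    "tool_call_id": "call_abc123",
    "name": "get_weather",
    "content": "18.2",
})

roles = [m["role"] for m in messages]
print(roles)  # ['user', 'assistant', 'tool']
```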
Complete Runnable Scripts
Here are the complete, runnable scripts for Python and JavaScript that demonstrate this entire flow.
- Python
- JavaScript
# Filename: weather_bot.py
# To run: python weather_bot.py
# Prerequisites: pip install openai requests
# Ensure API key is set: export GRAVIXLAYER_API_KEY='your-api-key'

import os
import json
import requests
from openai import OpenAI

# Step 1: Define the actual Python function that gets the weather
def get_weather(latitude, longitude):
    """Fetches the current temperature for given coordinates from an external API."""
    print(f"--- Calling get_weather function for Lat: {latitude}, Lon: {longitude} ---")
    url = f"https://api.open-meteo.com/v1/forecast?latitude={latitude}&longitude={longitude}&current=temperature_2m"
    response = requests.get(url)
    response.raise_for_status()  # Raise an exception for bad status codes
    data = response.json()
    return data['current']['temperature_2m']

def run_conversation():
    api_key = os.environ.get("GRAVIXLAYER_API_KEY")
    if not api_key:
        raise ValueError("GRAVIXLAYER_API_KEY environment variable not set.")
    client = OpenAI(base_url="https://api.gravixlayer.com/v1/inference", api_key=api_key)

    # Step 2: Define the function schema for the model
    tools = [{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current temperature in Celsius for a given latitude and longitude.",
            "parameters": {
                "type": "object",
                "properties": {
                    "latitude": {"type": "number", "description": "The latitude of the location."},
                    "longitude": {"type": "number", "description": "The longitude of the location."}
                },
                "required": ["latitude", "longitude"]
            }
        }
    }]

    # Start the conversation with the user's question
    messages = [{"role": "user", "content": "What's the weather like in Paris, France today?"}]
    print(f"User: {messages[-1]['content']}\n")

    # Step 3: First API call to get the function call from the model
    print("--- Making first API call to model ---")
    completion = client.chat.completions.create(
        model="llama3.1:8b-instruct-fp16",
        messages=messages,
        tools=tools,
        tool_choice="auto"
    )
    response_message = completion.choices[0].message
    messages.append(response_message)  # Append the assistant's message

    # Check if the model wants to call a tool
    if response_message.tool_calls:
        tool_call = response_message.tool_calls[0]
        function_name = tool_call.function.name
        if function_name == "get_weather":
            # Step 4: Execute the local function with arguments from the model
            args = json.loads(tool_call.function.arguments)
            temperature = get_weather(latitude=args["latitude"], longitude=args["longitude"])

            # Step 5: Append the tool's result to the conversation history
            messages.append({
                "role": "tool",
                "tool_call_id": tool_call.id,
                "name": function_name,
                "content": str(temperature)
            })
            print(f"Tool Result (Temperature): {temperature}°C\n")

            # Step 6: Second API call to get a natural language response
            print("--- Making second API call to model with tool result ---")
            final_completion = client.chat.completions.create(
                model="llama3.1:8b-instruct-fp16",
                messages=messages,
                tools=tools,
            )
            final_response = final_completion.choices[0].message.content
            print(f"\nFinal Assistant Response: {final_response}")
        else:
            print(f"Model requested unknown function: {function_name}")
    else:
        # If the model replies directly without a tool call
        print(f"Final Assistant Response: {response_message.content}")

if __name__ == "__main__":
    run_conversation()
// Filename: weather_bot.js
// To run: node weather_bot.js
// Prerequisites: npm install node-fetch openai
// Ensure API key is set: export GRAVIXLAYER_API_KEY='your-api-key'

import fetch from 'node-fetch';
import { OpenAI } from 'openai';

async function get_weather(latitude, longitude) {
  console.log(`--- Calling get_weather function for Lat: ${latitude}, Lon: ${longitude} ---`);
  const url = `https://api.open-meteo.com/v1/forecast?latitude=${latitude}&longitude=${longitude}&current=temperature_2m`;
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(`HTTP error! status: ${response.status}`);
  }
  const data = await response.json();
  return data.current.temperature_2m;
}

async function runConversation() {
  const apiKey = process.env.GRAVIXLAYER_API_KEY;
  if (!apiKey) {
    throw new Error("GRAVIXLAYER_API_KEY environment variable not set.");
  }
  const client = new OpenAI({ baseURL: "https://api.gravixlayer.com/v1/inference", apiKey: apiKey });

  const tools = [{
    type: "function",
    function: {
      name: "get_weather",
      description: "Get the current temperature in Celsius for a given latitude and longitude.",
      parameters: {
        type: "object",
        properties: {
          latitude: { type: "number" },
          longitude: { type: "number" }
        },
        required: ["latitude", "longitude"]
      }
    }
  }];

  const messages = [{ role: "user", content: "What's the weather like in Paris, France today?" }];
  console.log(`User: ${messages[messages.length - 1].content}\n`);

  console.log("--- Making first API call to model ---");
  const completion = await client.chat.completions.create({
    model: "llama3.1:8b-instruct-fp16",
    messages: messages,
    tools: tools,
    tool_choice: "auto"
  });
  const responseMessage = completion.choices[0].message;
  messages.push(responseMessage);

  if (responseMessage.tool_calls) {
    const toolCall = responseMessage.tool_calls[0];
    const functionName = toolCall.function.name;
    if (functionName === "get_weather") {
      const args = JSON.parse(toolCall.function.arguments);
      const temperature = await get_weather(args.latitude, args.longitude);
      messages.push({
        role: "tool",
        tool_call_id: toolCall.id,
        name: functionName,
        content: String(temperature)
      });
      console.log(`Tool Result (Temperature): ${temperature}°C\n`);

      console.log("--- Making second API call to model with tool result ---");
      const finalCompletion = await client.chat.completions.create({
        model: "llama3.1:8b-instruct-fp16",
        messages: messages,
        tools: tools
      });
      console.log(`\nFinal Assistant Response: ${finalCompletion.choices[0].message.content}`);
    }
  } else {
    console.log(`Final Assistant Response: ${responseMessage.content}`);
  }
}

runConversation();
Final Response
After the second API call, the model will produce a natural language response like this:
The current temperature in Paris is approximately 18.2 degrees Celsius.
Function Schema Reference
Functions are defined in the `tools` parameter. Each function schema tells the model what the tool does and what arguments it expects.

| Field | Type | Description |
|---|---|---|
| name | string | The exact name of the function to be called in your code (e.g., `get_weather`). Must contain only a-z, A-Z, 0-9, or _. |
| description | string | A clear, detailed description of what the function does. This is crucial for helping the model decide when to use it. |
| parameters | object | A standard JSON Schema object defining the function's input arguments. |
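Because the model's arguments arrive as free-form JSON, it is worth checking them against the `parameters` schema before executing your function. The sketch below is a hand-rolled check (not a full JSON Schema validator); the `check_args` helper is hypothetical, written for this page's `get_weather` schema.

```python
# A minimal, hand-rolled check of decoded arguments against the
# "parameters" object of a function schema. It only verifies required
# keys and "number" types; a real validator (e.g. the jsonschema
# package) covers far more of the JSON Schema spec.
def check_args(schema, args):
    missing = [k for k in schema.get("required", []) if k not in args]
    wrong_type = [
        k for k, v in args.items()
        if k in schema["properties"]
        and schema["properties"][k]["type"] == "number"
        and not isinstance(v, (int, float))
    ]
    return missing, wrong_type

params = {
    "type": "object",
    "properties": {
        "latitude": {"type": "number"},
        "longitude": {"type": "number"},
    },
    "required": ["latitude", "longitude"],
}

print(check_args(params, {"latitude": 48.85, "longitude": 2.35}))  # ([], [])
print(check_args(params, {"latitude": "48.85"}))
```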