tools parameter. When appropriate, the model will include a tool_calls array in its response, detailing which function to call and with what arguments. Your application can then parse this data, execute the corresponding function, and return the result to the model for further processing.
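For orientation, a `tool_calls` entry in the response has roughly the following shape (the `id` and argument values here are illustrative, not fixed by the API):

```json
{
  "role": "assistant",
  "content": null,
  "tool_calls": [
    {
      "id": "call_abc123",
      "type": "function",
      "function": {
        "name": "get_weather",
        "arguments": "{\"latitude\": 48.8566, \"longitude\": 2.3522}"
      }
    }
  ]
}
```

Note that `arguments` is a JSON-encoded string rather than a nested object, which is why the examples below parse it with `json.loads` / `JSON.parse` before calling the local function.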
Example: A Multi-Turn Weather Bot
This example demonstrates a complete workflow where a model uses a get_weather function to retrieve real-time data. The process involves three main steps:
- First API Call: The user asks for the weather. The model receives the prompt and tool definitions and responds with a request to call the get_weather function with the appropriate arguments.
- Tool Execution: Your application parses the model’s response, extracts the function name and arguments, and executes your local get_weather function.
- Second API Call: Your application sends the result of the get_weather function back to the model, which then formulates a natural language response.
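The three steps can be sketched as a small dispatch helper that turns one entry of the `tool_calls` array into a `tool` message ready to send back. The registry, helper name, and the stubbed `get_weather` return value are all illustrative, not part of the API:

```python
import json

# Stub for illustration; a real implementation would call a weather API.
def get_weather(latitude, longitude):
    return 18.2

# Hypothetical dispatch table mapping tool names to local functions.
TOOL_REGISTRY = {"get_weather": get_weather}

def execute_tool_call(tool_call):
    """Turn one entry of the tool_calls array into a tool message.

    tool_call mirrors the wire shape:
    {"id": ..., "function": {"name": ..., "arguments": "<json string>"}}
    """
    name = tool_call["function"]["name"]
    if name not in TOOL_REGISTRY:
        raise ValueError(f"Model requested unknown function: {name}")
    # arguments arrives as a JSON-encoded string, so parse it first.
    args = json.loads(tool_call["function"]["arguments"])
    result = TOOL_REGISTRY[name](**args)
    return {
        "role": "tool",
        "tool_call_id": tool_call["id"],
        "name": name,
        "content": str(result),
    }
```

A registry like this keeps the dispatch step generic; the full scripts below inline the same logic with an explicit `if function_name == "get_weather"` check.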
Complete Runnable Scripts
The following scripts demonstrate the entire flow in cURL, Python (OpenAI-compatible SDK and Gravix SDK), and JavaScript (OpenAI-compatible SDK and Gravix SDK).
# cURL Example for the Weather Bot
# This example demonstrates the API calls without actual function execution.
# First API Call - Simulate user asking for the weather
curl -X POST https://api.gravixlayer.com/v1/inference \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your_api_key_here" \
  -d '{
    "model": "meta-llama/llama-3.1-8b-instruct",
    "messages": [{"role": "user", "content": "What'\''s the weather like in Paris, France today?"}],
    "tools": [{
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get the current temperature in Celsius for a given latitude and longitude.",
        "parameters": {
          "type": "object",
          "properties": {
            "latitude": {"type": "number"},
            "longitude": {"type": "number"}
          },
          "required": ["latitude", "longitude"]
        }
      }
    }],
    "tool_choice": "auto"
  }'
# Second API Call - Send the tool result back to the model
# The tool_call id, coordinates, and temperature below are illustrative;
# in practice, echo back the values from the first call's response.
curl -X POST https://api.gravixlayer.com/v1/inference \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your_api_key_here" \
  -d '{
    "model": "meta-llama/llama-3.1-8b-instruct",
    "messages": [
      {"role": "user", "content": "What'\''s the weather like in Paris, France today?"},
      {"role": "assistant", "content": null, "tool_calls": [{"id": "call_abc123", "type": "function", "function": {"name": "get_weather", "arguments": "{\"latitude\": 48.8566, \"longitude\": 2.3522}"}}]},
      {"role": "tool", "tool_call_id": "call_abc123", "name": "get_weather", "content": "18.2"}
    ]
  }'
# Filename: weather_bot.py
# To run: python weather_bot.py
# Prerequisites: pip install openai requests
# Ensure API key is set: export GRAVIXLAYER_API_KEY='your-api-key'
import os
import json
import requests
from openai import OpenAI
# Step 1: Define the actual Python function that gets the weather
def get_weather(latitude, longitude):
    """Fetches the current temperature for given coordinates from an external API."""
    print(f"--- Calling get_weather function for Lat: {latitude}, Lon: {longitude} ---")
    url = f"https://api.open-meteo.com/v1/forecast?latitude={latitude}&longitude={longitude}&current=temperature_2m"
    response = requests.get(url)
    response.raise_for_status()
    data = response.json()
    return data['current']['temperature_2m']
# Step 2: Set up the OpenAI client
api_key = os.environ.get("GRAVIXLAYER_API_KEY")
if not api_key:
    raise ValueError("GRAVIXLAYER_API_KEY environment variable not set.")
client = OpenAI(
    base_url="https://api.gravixlayer.com/v1/inference",
    api_key=api_key
)
# Step 3: Define the tool schema for the model
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current temperature in Celsius for a given latitude and longitude.",
        "parameters": {
            "type": "object",
            "properties": {
                "latitude": {"type": "number", "description": "The latitude of the location."},
                "longitude": {"type": "number", "description": "The longitude of the location."}
            },
            "required": ["latitude", "longitude"]
        }
    }
}]
# Step 4: Start the conversation
messages = [{"role": "user", "content": "What's the weather like in Paris, France today?"}]
print(f"User: {messages[-1]['content']}\n")
# Step 5: First API call to get the function call from the model
completion = client.chat.completions.create(
    model="meta-llama/llama-3.1-8b-instruct",
    messages=messages,
    tools=tools,
    tool_choice="auto"
)
response_message = completion.choices[0].message
messages.append(response_message)
# Step 6: Check if the model wants to call a tool
if hasattr(response_message, "tool_calls") and response_message.tool_calls:
    tool_call = response_message.tool_calls[0]
    function_name = tool_call.function.name
    if function_name == "get_weather":
        args = json.loads(tool_call.function.arguments)
        temperature = get_weather(latitude=args["latitude"], longitude=args["longitude"])
        # Add the tool result to the conversation
        messages.append({
            "role": "tool",
            "tool_call_id": tool_call.id,
            "name": function_name,
            "content": str(temperature)
        })
        print(f"Tool Result (Temperature): {temperature}°C\n")
        # Second API call to get a natural language response
        final_completion = client.chat.completions.create(
            model="meta-llama/llama-3.1-8b-instruct",
            messages=messages,
            tools=tools
        )
        final_response = final_completion.choices[0].message.content
        print(f"\nFinal Assistant Response: {final_response}")
    else:
        print(f"Model requested unknown function: {function_name}")
else:
    print(f"Final Assistant Response: {response_message.content}")
import os
import json
import requests
from gravixlayer import GravixLayer
def get_weather(latitude, longitude):
    """Fetches the current temperature for given coordinates from an external API."""
    print(f"--- Calling get_weather function for Lat: {latitude}, Lon: {longitude} ---")
    url = f"https://api.open-meteo.com/v1/forecast?latitude={latitude}&longitude={longitude}&current=temperature_2m"
    response = requests.get(url)
    response.raise_for_status()
    data = response.json()
    return data['current']['temperature_2m']
# Make sure to export your API key in the environment
# export GRAVIXLAYER_API_KEY=your_api_key_here
client = GravixLayer()
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current temperature in Celsius for a given latitude and longitude.",
        "parameters": {
            "type": "object",
            "properties": {
                "latitude": {"type": "number", "description": "The latitude of the location."},
                "longitude": {"type": "number", "description": "The longitude of the location."}
            },
            "required": ["latitude", "longitude"]
        }
    }
}]
messages = [{"role": "user", "content": "What's the weather like in Paris, France today?"}]
print(f"User: {messages[-1]['content']}\n")
# First API call to get the function call from the model
completion = client.chat.completions.create(
    model="meta-llama/llama-3.1-8b-instruct",
    messages=messages,
    tools=tools,
    tool_choice="auto"
)
response_message = completion.choices[0].message
messages.append(response_message)
# Check if the model wants to call a tool
if hasattr(response_message, "tool_calls") and response_message.tool_calls:
    tool_call = response_message.tool_calls[0]
    function_name = tool_call.function.name
    if function_name == "get_weather":
        args = json.loads(tool_call.function.arguments)
        temperature = get_weather(latitude=args["latitude"], longitude=args["longitude"])
        messages.append({
            "role": "tool",
            "tool_call_id": tool_call.id,
            "name": function_name,
            "content": str(temperature)
        })
        print(f"Tool Result (Temperature): {temperature}°C\n")
        # Second API call to get a natural language response
        final_completion = client.chat.completions.create(
            model="meta-llama/llama-3.1-8b-instruct",
            messages=messages,
            tools=tools
        )
        final_response = final_completion.choices[0].message.content
        print(f"\nFinal Assistant Response: {final_response}")
    else:
        print(f"Model requested unknown function: {function_name}")
else:
    print(f"Final Assistant Response: {response_message.content}")
// Filename: weather_bot.js
// To run: node weather_bot.js
// Prerequisites: npm install node-fetch openai
// Ensure API key is set: export GRAVIXLAYER_API_KEY='your-api-key'
import fetch from 'node-fetch';
import { OpenAI } from 'openai';
async function get_weather(latitude, longitude) {
  console.log(`--- Calling get_weather function for Lat: ${latitude}, Lon: ${longitude} ---`);
  const url = `https://api.open-meteo.com/v1/forecast?latitude=${latitude}&longitude=${longitude}&current=temperature_2m`;
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(`HTTP error! status: ${response.status}`);
  }
  const data = await response.json();
  return data.current.temperature_2m;
}
async function runConversation() {
  const apiKey = process.env.GRAVIXLAYER_API_KEY;
  if (!apiKey) {
    throw new Error("GRAVIXLAYER_API_KEY environment variable not set.");
  }
  const client = new OpenAI({
    baseURL: "https://api.gravixlayer.com/v1/inference",
    apiKey: apiKey
  });
  const tools = [{
    type: "function",
    function: {
      name: "get_weather",
      description: "Get the current temperature in Celsius for a given latitude and longitude.",
      parameters: {
        type: "object",
        properties: {
          latitude: { type: "number" },
          longitude: { type: "number" }
        },
        required: ["latitude", "longitude"]
      }
    }
  }];
  const messages = [{ role: "user", content: "What's the weather like in Paris, France today?" }];
  console.log(`User: ${messages[messages.length - 1].content}\n`);
  console.log("--- Making first API call to model ---");
  const completion = await client.chat.completions.create({
    model: "meta-llama/llama-3.1-8b-instruct",
    messages: messages,
    tools: tools,
    tool_choice: "auto"
  });
  const responseMessage = completion.choices[0].message;
  messages.push(responseMessage);
  if (responseMessage.tool_calls) {
    const toolCall = responseMessage.tool_calls[0];
    const functionName = toolCall.function.name;
    if (functionName === "get_weather") {
      const args = JSON.parse(toolCall.function.arguments);
      const temperature = await get_weather(args.latitude, args.longitude);
      messages.push({
        role: "tool",
        tool_call_id: toolCall.id,
        name: functionName,
        content: String(temperature)
      });
      console.log(`Tool Result (Temperature): ${temperature}°C\n`);
      console.log("--- Making second API call to model with tool result ---");
      const finalCompletion = await client.chat.completions.create({
        model: "meta-llama/llama-3.1-8b-instruct",
        messages: messages,
        tools: tools
      });
      console.log(`\nFinal Assistant Response: ${finalCompletion.choices[0].message.content}`);
    }
  } else {
    console.log(`Final Assistant Response: ${responseMessage.content}`);
  }
}
runConversation();
import fetch from "node-fetch";
import GravixLayer from "gravixlayer";
/**
* Fetch current temperature for given latitude & longitude
*/
async function get_weather(latitude, longitude) {
  console.log(`--- Calling get_weather function for Lat: ${latitude}, Lon: ${longitude} ---`);
  const url = `https://api.open-meteo.com/v1/forecast?latitude=${latitude}&longitude=${longitude}&current=temperature_2m`;
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(`HTTP error! status: ${response.status}`);
  }
  const data = await response.json();
  return data.current.temperature_2m;
}
async function runConversation() {
  const apiKey = process.env.GRAVIXLAYER_API_KEY;
  const client = new GravixLayer({ apiKey });
  const tools = [
    {
      type: "function",
      function: {
        name: "get_weather",
        description: "Get the current temperature in Celsius for a given latitude and longitude.",
        parameters: {
          type: "object",
          properties: {
            latitude: { type: "number" },
            longitude: { type: "number" },
          },
          required: ["latitude", "longitude"],
        },
      },
    },
  ];
  const messages = [
    { role: "user", content: "What's the weather like in Paris, France today?" },
  ];
  console.log(`User: ${messages[0].content}\n`);
  // First call to model (may trigger tool call)
  console.log("--- Making first API call to model ---");
  const completion = await client.chat.completions.create({
    model: "meta-llama/llama-3.1-8b-instruct",
    messages,
    tools,
    tool_choice: "auto",
  });
  const responseMessage = completion.choices[0].message;
  // Ensure content is not null before pushing
  messages.push({
    role: "assistant",
    content: responseMessage.content || "",
    tool_calls: responseMessage.tool_calls,
  });
  if (responseMessage.tool_calls?.length > 0) {
    const toolCall = responseMessage.tool_calls[0];
    const functionName = toolCall.function.name;
    if (functionName === "get_weather") {
      const args = JSON.parse(toolCall.function.arguments);
      const temperature = await get_weather(args.latitude, args.longitude);
      console.log(`Tool Result (Temperature): ${temperature}°C\n`);
      // Push tool output with the name property
      messages.push({
        role: "tool",
        tool_call_id: toolCall.id,
        name: functionName,
        content: String(temperature),
      });
      console.log("--- Making second API call to model with tool result ---");
      const finalCompletion = await client.chat.completions.create({
        model: "meta-llama/llama-3.1-8b-instruct",
        messages,
        tools,
      });
      const finalMessage = finalCompletion.choices[0].message;
      console.log(`\nFinal Assistant Response: ${finalMessage.content}`);
    }
  } else {
    console.log(`Final Assistant Response: ${responseMessage.content}`);
  }
}
runConversation().catch((err) => {
  console.error("Error:", err);
});
Final Response
After the second API call, the model will produce a natural language response similar to this:
The current temperature in Paris is approximately 18.2 degrees Celsius.
Function Schema Reference
Functions are defined in the tools parameter. Each function schema informs the model about the tool’s purpose and expected arguments.
| Field | Type | Description |
|---|---|---|
| name | string | The exact name of the function to be called in your code (e.g., get_weather). Must consist of a-z, A-Z, 0-9, or _. |
| description | string | A clear, detailed description of what the function does. This is crucial for helping the model decide when to use the tool. |
| parameters | object | A standard JSON Schema object defining the function’s input arguments. |
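Because parameters is standard JSON Schema, you can use features like per-property descriptions and enum to constrain what the model passes. For instance, a variant of the weather schema might add an optional unit argument (this variant is illustrative, not part of the examples above):

```json
{
  "type": "function",
  "function": {
    "name": "get_weather",
    "description": "Get the current temperature for a given latitude and longitude.",
    "parameters": {
      "type": "object",
      "properties": {
        "latitude": {"type": "number", "description": "Latitude in decimal degrees."},
        "longitude": {"type": "number", "description": "Longitude in decimal degrees."},
        "unit": {"type": "string", "enum": ["celsius", "fahrenheit"], "description": "Temperature unit to return."}
      },
      "required": ["latitude", "longitude"]
    }
  }
}
```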