The GravixLayer SDK provides comprehensive chat completion capabilities for building conversational AI applications, with simple, practical examples for the CLI, Python, and JavaScript interfaces. Chat completions allow you to:
  • Generate Responses: Create intelligent responses to user messages
  • Multi-turn Conversations: Maintain context across conversation turns (see the message-format sketch after this list)
  • Model Selection: Choose from various language models for different use cases
  • Streaming Support: Get real-time responses as they’re generated
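
Multi-turn context is typically carried by resending the prior messages with each request. The sketch below assumes the common role/content message format used by OpenAI-compatible chat APIs; confirm the exact field names in the SDK reference:

# A conversation is a list of messages; each message has a role and content.
# The roles shown here (system/user/assistant) follow the common
# OpenAI-compatible convention and are an assumption, not a confirmed schema.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello, world!"},
    {"role": "assistant", "content": "Hello! How can I assist you today?"},
    # To continue the conversation, append the next user turn and resend
    # the full list so the model sees the earlier context.
    {"role": "user", "content": "Can you say that in French?"},
]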

Prerequisites

Before using chat completions, you need to set up your environment.
Install the GravixLayer SDK:
pip install gravixlayer
Set your API key as an environment variable. On macOS or Linux:
export GRAVIXLAYER_API_KEY="your_api_key_here"
On Windows (Command Prompt):
set GRAVIXLAYER_API_KEY=your_api_key_here
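You can quickly confirm that the variable is visible to your programs; this check uses only the Python standard library:

import os

# The SDK is expected to read GRAVIXLAYER_API_KEY from the environment.
api_key = os.environ.get("GRAVIXLAYER_API_KEY")
if api_key:
    print("GRAVIXLAYER_API_KEY is set.")
else:
    print("GRAVIXLAYER_API_KEY is not set - set it before making requests.")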

Your First Request

gravixlayer chat --model "meta-llama/llama-3.1-8b-instruct" --user "Hello, world!"
Example Output:
Hello! How can I assist you today?
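
The same request can be made from Python. The sketch below assumes the package exposes a GravixLayer client class with an OpenAI-style chat.completions.create method; treat the exact class, method, and response field names as assumptions and verify them against the SDK reference:

# Minimal Python sketch of the CLI request above.
# Assumption: the package exposes a GravixLayer client with an
# OpenAI-style chat.completions.create method; verify against the SDK docs.
from gravixlayer import GravixLayer

client = GravixLayer()  # reads GRAVIXLAYER_API_KEY from the environment

response = client.chat.completions.create(
    model="meta-llama/llama-3.1-8b-instruct",
    messages=[{"role": "user", "content": "Hello, world!"}],
)

print(response.choices[0].message.content)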