Gravix Layer x LangGraph Integration
Integrate Gravix Layer's LLMs with LangGraph to build dynamic, graph-based AI workflows for advanced reasoning and automation.
What You'll Learn
- How to connect LangGraph nodes to Gravix Layer's API
- How to use OpenAI-compatible endpoints for graph-based workflows
- Example: Using LangGraph for multi-step document analysis
1. Install Required Packages
```bash
pip install langgraph openai python-dotenv
```
2. Configure Your API Key
Add your API key to a `.env` file:

```
GRAVIXLAYER_API_KEY=your_api_key_here
```
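Since a missing key only surfaces later as an authentication error, it can help to fail fast at startup. A minimal sketch, assuming you load the environment with `python-dotenv` as shown below (the `require_api_key` helper is illustrative, not part of any SDK):

```python
import os

def require_api_key(name: str = "GRAVIXLAYER_API_KEY") -> str:
    """Return the API key from the environment, raising early if it is absent."""
    key = os.environ.get(name)
    if not key:
        raise RuntimeError(f"{name} is not set; add it to your .env file")
    return key
```

Call `require_api_key()` once at startup so configuration problems surface before any graph nodes run.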
3. Using LangGraph with Gravix Layer
```python
from openai import OpenAI
import os
from dotenv import load_dotenv

# Load GRAVIXLAYER_API_KEY from the .env file
load_dotenv()
api_key = os.environ.get("GRAVIXLAYER_API_KEY")

# Point the standard OpenAI client at Gravix Layer's
# OpenAI-compatible inference endpoint
llm = OpenAI(
    api_key=api_key,
    base_url="https://api.gravixlayer.com/v1/inference"
)

response = llm.chat.completions.create(
    model="meta-llama/llama-3.1-8b-instruct",
    messages=[{"role": "user", "content": "Analyze this document: ..."}]
)
print(response.choices[0].message.content)
```
Expected Output:
This document discusses ... (sample output will depend on the actual document content provided)
Note: The LangGraph API has changed since this guide was first written, so the original LangGraph example may no longer run as-is. The code above instead calls the OpenAI client directly for document analysis, which remains compatible with Gravix Layer's OpenAI-compatible endpoint regardless of LangGraph version.
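The multi-step document analysis mentioned earlier can still be expressed as a small graph of steps without depending on a particular LangGraph version. The sketch below chains summarize, extract, and report nodes over a shared state dict, mirroring how LangGraph nodes pass state along edges; `fake_llm` is a stand-in for the Gravix Layer chat-completion call shown above:

```python
# fake_llm stands in for llm.chat.completions.create(...); swap in
# the real Gravix Layer call from the example above in production.
def fake_llm(prompt: str) -> str:
    return f"[model output for: {prompt[:30]}...]"

def summarize(state: dict) -> dict:
    # Node 1: condense the raw document
    state["summary"] = fake_llm("Summarize: " + state["document"])
    return state

def extract_entities(state: dict) -> dict:
    # Node 2: pull entities from the summary produced upstream
    state["entities"] = fake_llm("List entities in: " + state["summary"])
    return state

def report(state: dict) -> dict:
    # Node 3: assemble the final analysis from prior node outputs
    state["report"] = f"Summary: {state['summary']}\nEntities: {state['entities']}"
    return state

def run_pipeline(document: str) -> dict:
    # A linear "graph": each node reads and updates the shared state
    state = {"document": document}
    for node in (summarize, extract_entities, report):
        state = node(state)
    return state
```

This is a sketch of the pattern, not LangGraph itself; once you pin a LangGraph version, the same three functions can be registered as nodes on a `StateGraph` and wired with edges.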