GravixLayer x LangChain Integration

Welcome! This guide will walk you through integrating the GravixLayer API with LangChain.

What You'll Learn

  • How to configure your environment and securely manage your API key
  • How to use LangChain's ChatOpenAI interface with GravixLayer's OpenAI-compatible endpoints
  • How to build advanced prompt workflows and conversational AI agents

Who Should Use This Guide?

This resource is ideal for developers, data scientists, and AI practitioners seeking to combine the flexibility of LangChain with the performance of GravixLayer's large language models for applications such as question answering, chatbots, and more.


1. Install Required Packages

Make sure you have the following Python packages installed:

pip install openai langchain langchain-openai python-dotenv

2. Configure Your API Key

Store your GravixLayer API key in a .env file:

GRAVIXLAYER_API_KEY=your_api_key_here

Then, load it in your Python code:

import os
from dotenv import load_dotenv

load_dotenv() # Loads variables from .env into environment

api_key = os.environ.get("GRAVIXLAYER_API_KEY")
if not api_key:
    raise ValueError("GRAVIXLAYER_API_KEY not found. Please add it to your .env file.")

print(f"API Key configured: {'Yes' if api_key else 'No'}")

3. Using LangChain with GravixLayer

You can use LangChain's ChatOpenAI interface with GravixLayer's OpenAI-compatible endpoints:

from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

# Initialize LangChain model
llm = ChatOpenAI(
    model="llama3.1:8b",
    openai_api_key=os.environ.get("GRAVIXLAYER_API_KEY"),
    openai_api_base="https://api.gravixlayer.com/v1/inference",
    temperature=0.7,
)

# Create prompt template and chain
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are an expert in {topic}. Provide clear, concise explanations."),
    ("human", "{question}"),
])

chain = prompt | llm | StrOutputParser()

# Use the chain
result = chain.invoke({
    "topic": "artificial intelligence",
    "question": "What are neural networks?",
})

print("LangChain Response:")
print(result)

4. Building a Simple Chatbot with Conversation Memory

You can build a conversational agent with memory using LangChain. (Note: ConversationBufferMemory and ConversationChain are deprecated in recent LangChain releases in favor of RunnableWithMessageHistory, but they still work and keep this example compact.)

from langchain.memory import ConversationBufferMemory
from langchain.chains import ConversationChain

class SimpleChatbot:
    def __init__(self):
        self.llm = ChatOpenAI(
            model="llama3.1:8b",
            openai_api_key=os.environ.get("GRAVIXLAYER_API_KEY"),
            openai_api_base="https://api.gravixlayer.com/v1/inference",
            temperature=0.7,
        )
        self.memory = ConversationBufferMemory()
        self.conversation = ConversationChain(llm=self.llm, memory=self.memory)

    def chat(self, message):
        return self.conversation.predict(input=message)

# Test the chatbot
bot = SimpleChatbot()

print("Chatbot Test:")
response1 = bot.chat("Hi, I'm learning about Python programming.")
print("User: Hi, I'm learning about Python programming.")
print(f"Bot: {response1}")

response2 = bot.chat("What should I learn first?")
print("\nUser: What should I learn first?")
print(f"Bot: {response2}")

Summary

This guide demonstrates how to use the GravixLayer API, focusing on LangChain integration for advanced prompt engineering and conversational AI.

Key highlights:

  1. Setup: Installing packages and loading your API key from a .env file.
  2. LangChain Integration:
    • Using ChatOpenAI with custom prompt templates
    • Chaining prompts and outputs
    • Building a chatbot with conversation memory

Why LangChain?

LangChain enables:

  • Flexible prompt templates
  • Output parsing and chaining
  • Conversation memory and advanced workflows
  • Seamless integration with OpenAI-compatible endpoints like GravixLayer

Key Configuration

  • Base URL: https://api.gravixlayer.com/v1/inference
  • API Key: Loaded from GRAVIXLAYER_API_KEY in your .env file

This guide is a practical resource for integrating LangChain with GravixLayer’s OpenAI-compatible API for your next AI project.