Gravix Layer x LangChain Integration

Welcome! This guide will walk you through integrating the Gravix Layer API with LangChain.

What You'll Learn

  • How to configure your environment and securely manage your API key
  • How to use LangChain's ChatOpenAI interface with Gravix Layer's OpenAI-compatible endpoints
  • How to build advanced prompt workflows and conversational AI agents

Who Should Use This Guide?

This resource is ideal for developers, data scientists, and AI practitioners seeking to combine the flexibility of LangChain with the performance of Gravix Layer's large language models for applications such as question answering, chatbots, and more.


1. Install Required Packages

Make sure you have the following Python packages installed:

pip install openai langchain langchain-openai python-dotenv

2. Configure Your API Key

Store your Gravix Layer API key in a .env file:

GRAVIXLAYER_API_KEY=your_api_key_here

Then, load it in your Python code:

import os
from dotenv import load_dotenv

load_dotenv() # Loads variables from .env into environment

api_key = os.environ.get("GRAVIXLAYER_API_KEY")
if not api_key:
    raise ValueError("GRAVIXLAYER_API_KEY not found. Please add it to your .env file.")
else:
    print("API Key configured: Yes")

Expected Output:

API Key configured: Yes

3. Using LangChain with Gravix Layer

You can use LangChain's ChatOpenAI interface with Gravix Layer's OpenAI-compatible endpoints:

from langchain_openai import ChatOpenAI
from langchain.prompts import ChatPromptTemplate
from langchain.schema.output_parser import StrOutputParser

# Initialize LangChain model
llm = ChatOpenAI(
    model="meta-llama/llama-3.1-8b-instruct",
    openai_api_key=os.environ.get("GRAVIXLAYER_API_KEY"),
    openai_api_base="https://api.gravixlayer.com/v1/inference",
    temperature=0.7
)

# Create prompt template and chain
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are an expert in {topic}. Provide clear, concise explanations."),
    ("human", "{question}")
])

chain = prompt | llm | StrOutputParser()

# Use the chain
result = chain.invoke({
    "topic": "artificial intelligence",
    "question": "What are neural networks?"
})

print("LangChain Response:")
print(result)

Expected Output:

LangChain Response:
Neural networks are computational models inspired by the human brain, consisting of interconnected nodes (neurons) that process information in layers. They are widely used in machine learning for tasks such as image recognition, natural language processing, and more, due to their ability to learn complex patterns from data.
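
The chain above also supports streaming through LangChain's standard Runnable interface, which is useful when you want tokens to appear as they are generated. A minimal sketch, assuming the chain object from the previous example is still in scope:

# Stream the response chunk by chunk instead of waiting for the full answer
for chunk in chain.stream({
    "topic": "artificial intelligence",
    "question": "What are neural networks?"
}):
    print(chunk, end="", flush=True)
print()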

4. Building a Simple Chatbot with Conversation Memory

You can build a conversational agent with memory using LangChain:

from langchain.memory import ConversationBufferMemory
from langchain.chains import ConversationChain

class SimpleChatbot:
    def __init__(self):
        self.llm = ChatOpenAI(
            model="meta-llama/llama-3.1-8b-instruct",
            openai_api_key=os.environ.get("GRAVIXLAYER_API_KEY"),
            openai_api_base="https://api.gravixlayer.com/v1/inference",
            temperature=0.7
        )
        self.memory = ConversationBufferMemory()
        self.conversation = ConversationChain(llm=self.llm, memory=self.memory)

    def chat(self, message):
        return self.conversation.predict(input=message)

# Test the chatbot
bot = SimpleChatbot()

print("Chatbot Test:")
response1 = bot.chat("Hi, I'm learning about Python programming.")
print("User: Hi, I'm learning about Python programming.")
print(f"Bot: {response1}")

response2 = bot.chat("What should I learn first?")
print("\nUser: What should I learn first?")
print(f"Bot: {response2}")

Expected Output:

Chatbot Test:
User: Hi, I'm learning about Python programming.
Bot: That's great! Python is a versatile and beginner-friendly language. How can I assist you in your learning journey?

User: What should I learn first?
Bot: You should start by learning Python basics such as variables, data types, control structures (if statements, loops), and functions. Once comfortable, you can explore modules, file handling, and object-oriented programming.
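
Because ConversationBufferMemory keeps the full exchange, you can also inspect the history that will be sent as context on the next turn. A minimal sketch, assuming the bot instance created above:

# Print the conversation history accumulated in memory so far
history = bot.memory.load_memory_variables({})
print(history["history"])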

Summary

This guide demonstrates how to use the Gravix Layer API, focusing on LangChain integration for advanced prompt engineering and conversational AI.

Key highlights:

  1. Setup: Installing packages and loading your API key from a .env file.
  2. LangChain Integration:
    • Using ChatOpenAI with custom prompt templates
    • Chaining prompts and outputs
    • Building a chatbot with conversation memory

Why LangChain?

LangChain enables:

  • Flexible prompt templates
  • Output parsing and chaining (see the sketch after this list)
  • Conversation memory and advanced workflows
  • Seamless integration with OpenAI-compatible endpoints like Gravix Layer
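
To illustrate how templates and chaining compose beyond the single-prompt example in section 3, the output of one chain can feed the prompt of another. This is a sketch only; it assumes the llm object configured in section 3 is still in scope:

from langchain.prompts import ChatPromptTemplate
from langchain.schema.output_parser import StrOutputParser

# First chain: produce a paragraph-length explanation
explain = (
    ChatPromptTemplate.from_template("Explain {concept} in one short paragraph.")
    | llm
    | StrOutputParser()
)

# Second chain: condense that explanation into a single sentence
summarize = (
    ChatPromptTemplate.from_template("Summarize in one sentence: {text}")
    | llm
    | StrOutputParser()
)

# Pipe them together; the lambda maps the first output into the second prompt's input
pipeline = explain | (lambda text: {"text": text}) | summarize

print(pipeline.invoke({"concept": "vector embeddings"}))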

Key Configuration

  • Base URL: https://api.gravixlayer.com/v1/inference
  • API Key: Loaded from GRAVIXLAYER_API_KEY in your .env file
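
Because the endpoint is OpenAI-compatible, these same two values also work with the plain openai client installed in step 1. A minimal sketch, independent of LangChain:

import os
from openai import OpenAI

# Point the standard OpenAI client at the Gravix Layer endpoint
client = OpenAI(
    api_key=os.environ.get("GRAVIXLAYER_API_KEY"),
    base_url="https://api.gravixlayer.com/v1/inference",
)

response = client.chat.completions.create(
    model="meta-llama/llama-3.1-8b-instruct",
    messages=[{"role": "user", "content": "What are neural networks?"}],
)
print(response.choices[0].message.content)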

This guide is a practical resource for integrating LangChain with Gravix Layer’s OpenAI-compatible API for your next AI project.