Gravix Layer x MongoDB Integration
Integrate Gravix Layer's LLMs with MongoDB to store, query, and analyze AI-generated data in a scalable NoSQL database.
What You'll Learn
- How to save LLM outputs to MongoDB
- How to query and analyze AI-generated data
- Example: Storing chat completions in MongoDB
1. Install Required Packages
pip install pymongo openai python-dotenv
2. Configure Your API Key
Add your API key to a .env file:
GRAVIXLAYER_API_KEY=your_api_key_here
3. Using MongoDB with Gravix Layer
from pymongo import MongoClient
from openai import OpenAI
import os
from dotenv import load_dotenv

# Load the API key from .env
load_dotenv()
api_key = os.environ.get("GRAVIXLAYER_API_KEY")

# Gravix Layer exposes an OpenAI-compatible endpoint
llm = OpenAI(
    api_key=api_key,
    base_url="https://api.gravixlayer.com/v1/inference"
)

# Connect to a local MongoDB instance and pick a database/collection
client = MongoClient("mongodb://localhost:27017/")
db = client["ai_data"]
collection = db["completions"]

# Request a chat completion
prompt = "Summarize this: AI is changing the world."
response = llm.chat.completions.create(
    model="meta-llama/llama-3.1-8b-instruct",
    messages=[{"role": "user", "content": prompt}]
)

# Store the prompt/response pair as one document
collection.insert_one({
    "prompt": prompt,
    "response": response.choices[0].message.content,
})
print("Saved to MongoDB!")
Expected Output:
Saved to MongoDB!
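In practice you will usually want to store more than the raw prompt and response, so later queries can filter by model or time. Here is a minimal sketch of a richer document shape; the field names (`model`, `created_at`) and the helper function are illustrative choices, not anything required by Gravix Layer or MongoDB:

```python
from datetime import datetime, timezone

def build_completion_doc(prompt, response_text, model):
    # Illustrative document shape: the extra fields make the
    # collection queryable by model and by time.
    return {
        "prompt": prompt,
        "response": response_text,
        "model": model,
        "created_at": datetime.now(timezone.utc),
    }

doc = build_completion_doc(
    "Summarize this: AI is changing the world.",
    "AI is transforming many industries.",  # placeholder response text
    "meta-llama/llama-3.1-8b-instruct",
)
# With a live server, store it exactly like before:
# collection.insert_one(doc)
```

Because MongoDB documents are schemaless, adding fields like these requires no migration; existing documents without them are unaffected.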
If MongoDB is not running, you may see:
pymongo.errors.ServerSelectionTimeoutError: localhost:27017: [Errno 61] Connection refused ...
Note: You must have a MongoDB server running on localhost:27017 for this code to work.
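Once completions are stored, querying and analyzing them uses ordinary PyMongo filters and aggregation pipelines, which are plain Python dicts and lists. The sketch below assumes documents carry `prompt` and `model` fields (as in the insert above); the helper names are hypothetical:

```python
def prompt_filter(keyword):
    # Case-insensitive substring match on the stored prompt
    return {"prompt": {"$regex": keyword, "$options": "i"}}

def per_model_pipeline():
    # Count stored completions per model, most frequent first
    return [
        {"$group": {"_id": "$model", "count": {"$sum": 1}}},
        {"$sort": {"count": -1}},
    ]

# With a live server and the collection from above:
# for doc in collection.find(prompt_filter("summarize")):
#     print(doc["response"])
# stats = list(collection.aggregate(per_model_pipeline()))
```

Building filters and pipelines as data first makes them easy to inspect and test before running them against a live collection.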
MongoDB lets you store and analyze AI-generated data at scale, while Gravix Layer supplies the LLM capabilities behind any data-driven application.