The GravixLayer Memory System lets you store and search information for AI applications. Think of it as a smart database that remembers user preferences and conversations.
Public Preview: GravixLayer is currently in public preview. Features are experimental and may change or break as API endpoints and models continue to be updated.
What is Memory?
Memory stores information about users and conversations:
- User preferences (“I love pizza”)
- Conversation history
- Facts and settings
- Insights extracted automatically from conversations
AI Processing with the infer Parameter
Simple Text: Stored exactly as provided
memory.add("I love pizza", user_id="alice") # Stored as: "I love pizza"
Conversations: AI extracts meaningful information from messages when infer=True (default)
messages = [
    {"role": "user", "content": "I'm looking for a restaurant"},
    {"role": "assistant", "content": "What cuisine do you prefer?"},
    {"role": "user", "content": "I love spicy Thai food"}
]
# AI analyzes the conversation and extracts: "User prefers spicy Thai cuisine"
memory.add(messages, user_id="alice") # infer=True by default
Raw Storage (infer=False):
- Stores messages without extracting meaningful information
- No AI analysis, faster processing
- Preserves exact conversation content
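The contrast can be illustrated with a toy sketch in plain Python (not the real pipeline, and the variable names are illustrative, not part of the GravixLayer SDK): with infer=False every message body is kept verbatim, while infer=True would instead hand the messages to a model for distillation.

```python
# Toy illustration of raw storage (infer=False): each message's content
# is preserved exactly, with no AI analysis step.
messages = [
    {"role": "user", "content": "I'm looking for a restaurant"},
    {"role": "user", "content": "I love spicy Thai food"},
]

# infer=False keeps the exact content of every message
raw_records = [m["content"] for m in messages]
print(raw_records)
```

With infer=True the same input would instead yield something like a single distilled fact ("User prefers spicy Thai cuisine"), as described above.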
QuickStart
Basic Example (Python SDK)
from gravixlayer import GravixLayer
client = GravixLayer()
memory = client.memory
# Simple text - stored as-is
result = memory.add("I love pizza", user_id="alice")
print(f"Stored: {result['results'][0]['memory']}")
# Conversation - AI extracts insights
conversation = [
    {"role": "user", "content": "I'm looking for a restaurant"},
    {"role": "user", "content": "I love spicy Thai food"}
]
conv_result = memory.add(conversation, user_id="alice") # infer=True by default
print(f"AI extracted: {conv_result['results'][0]['memory']}")
# Search finds both
results = memory.search("food", user_id="alice")
print(f"Found {len(results['results'])} memories")
Expected Output:
Stored: I love pizza
AI extracted: User prefers spicy Thai cuisine
Found 2 memories
Custom Configuration (Optional)
The memory system works well with the defaults, but you can customize it for specific needs.
Configuration Parameters
The memory system has four main settings:
embedding_model - Converts text to searchable format
inference_model - AI that extracts memories from conversations
index_name - Where memories are stored (like folders)
cloud_provider - Where data is hosted
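As a sketch, a configuration can be thought of as a plain dictionary of these four settings. The helper below is hypothetical (not part of the SDK); it simply checks the chosen values against the model and provider tables later on this page.

```python
# Hypothetical validation helper -- illustrative only, not a GravixLayer API.
# The allowed values mirror the tables in this document.
SUPPORTED_EMBEDDING_MODELS = {
    "baai/bge-large-en-v1.5",
    "microsoft/multilingual-e5-large",
    "nomic-ai/nomic-embed-text-v1.5",
}
SUPPORTED_PROVIDERS = {"AWS", "GCP", "Gravix"}

def validate_config(config: dict) -> list:
    """Return a list of problems found in a configuration dict."""
    problems = []
    if config.get("embedding_model") not in SUPPORTED_EMBEDDING_MODELS:
        problems.append(f"unknown embedding_model: {config.get('embedding_model')}")
    if config.get("cloud_provider") not in SUPPORTED_PROVIDERS:
        problems.append(f"unknown cloud_provider: {config.get('cloud_provider')}")
    return problems

config = {
    "embedding_model": "microsoft/multilingual-e5-large",
    "index_name": "user_preferences",
    "cloud_provider": "GCP",
}
print(validate_config(config))  # → []
```

Catching a typo in a model name locally like this is cheaper than a failed switch_configuration call.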
Configuration Example (Python SDK)
from gravixlayer import GravixLayer
client = GravixLayer()
memory = client.memory
# Configure for multilingual app
memory.switch_configuration(
    embedding_model="microsoft/multilingual-e5-large",
    index_name="user_preferences",
    cloud_provider="GCP",
    region="us-east1"
)
# Add multilingual memories
memory.add("El usuario prefiere pizza", user_id="alice")
memory.add("L'utilisateur aime le café", user_id="alice")
# Search works across languages
results = memory.search("food preferences", user_id="alice")
print(f"Found {len(results['results'])} memories")
Expected Output:
🔄 Switched embedding model to: microsoft/multilingual-e5-large
🔄 Switched to database: user_preferences
🔄 Switched cloud provider to: GCP
🔄 Switched region to: us-east1
✅ Configuration updated successfully
Found 2 memories
Available Embedding Models
| Model | Languages |
|---|---|
| baai/bge-large-en-v1.5 | English |
| microsoft/multilingual-e5-large | 100+ |
| nomic-ai/nomic-embed-text-v1.5 | English |
Available Cloud Providers
| Provider | Regions | Best For |
|---|---|---|
| AWS | us-east-1, | |
| GCP | us-east1, | |
| Gravix | gl-eu-west1, | |
Setup
Set your API key first:
Windows (CMD):
set GRAVIXLAYER_API_KEY=your_api_key_here

Windows (PowerShell):
$env:GRAVIXLAYER_API_KEY="your_api_key_here"

Linux/macOS:
export GRAVIXLAYER_API_KEY=your_api_key_here
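In Python, the SDK presumably picks the key up from the environment, so a quick sanity check before constructing a client can save a confusing failure later. The variable name comes from the commands above; the check itself is just an illustration, not part of the SDK.

```python
import os

# Demo only: seed the variable so the check below passes when run standalone.
# In real use, export the key in your shell as shown above instead.
os.environ.setdefault("GRAVIXLAYER_API_KEY", "your_api_key_here")

api_key = os.environ.get("GRAVIXLAYER_API_KEY")
if not api_key:
    raise RuntimeError("Set GRAVIXLAYER_API_KEY before creating a client")
print("API key configured")
```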
Async Support
GravixLayer supports async/await for Python applications:
Basic Async Example
import asyncio
from gravixlayer import AsyncGravixLayer

async def main():
    client = AsyncGravixLayer()
    memory = client.memory

    # Add a memory
    result = await memory.add("I love pizza", user_id="alice")
    print(f"Added: {result['results'][0]['memory']}")

    # Search memories
    results = await memory.search("food", user_id="alice")
    print(f"Found: {results['results'][0]['memory']}")

asyncio.run(main())
Async with Configuration
import asyncio
from gravixlayer import AsyncGravixLayer

async def async_config_example():
    client = AsyncGravixLayer()
    memory = client.memory

    # Switch configuration
    await memory.switch_configuration(
        embedding_model="microsoft/multilingual-e5-large",
        index_name="user_preferences"
    )

    # Add a multilingual memory, then search it
    await memory.add("El usuario prefiere pizza", user_id="alice")
    results = await memory.search("comida", user_id="alice")
    print(f"Found {len(results['results'])} memories")

asyncio.run(async_config_example())
Async Conversation Processing
import asyncio
from gravixlayer import AsyncGravixLayer

async def process_conversation():
    client = AsyncGravixLayer()
    memory = client.memory

    conversation = [
        {"role": "user", "content": "I love sci-fi movies"},
        {"role": "assistant", "content": "Great! I'll remember that."}
    ]

    # AI extracts memories from the conversation
    result = await memory.add(conversation, user_id="alice", infer=True)
    print(f"Extracted {len(result['results'])} memories")

asyncio.run(process_conversation())
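A practical payoff of the async client is concurrency. The sketch below uses a stub coroutine in place of memory.add (so it runs without the SDK or an API key, and the stub's return shape mirrors the results seen above) to show the asyncio.gather pattern for adding several memories at once:

```python
import asyncio

# Stub standing in for memory.add, so the concurrency pattern itself is
# runnable here; with the real AsyncGravixLayer client you would await
# memory.add(...) instead.
async def add_memory(text: str, user_id: str) -> dict:
    await asyncio.sleep(0)  # stands in for the network round-trip
    return {"results": [{"memory": text}]}

async def main():
    texts = ["I love pizza", "I prefer window seats", "Vegetarian meals only"]
    # gather() runs the add calls concurrently instead of one at a time
    results = await asyncio.gather(*(add_memory(t, "alice") for t in texts))
    return [r["results"][0]["memory"] for r in results]

print(asyncio.run(main()))
```

With real network calls, the total latency of the batch approaches that of the slowest single call rather than the sum of all of them.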