# Local Deployment
This guide covers setting up Reservoir on your local machine, both for development and for day-to-day production use.
## Prerequisites
Before deploying Reservoir locally, ensure you have the following installed:
- Rust (latest stable version)
- Docker (for Neo4j database)
- Git for version control
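You can quickly confirm the tools are on your `PATH` by checking their versions:

```bash
# Verify the prerequisite tools are installed
rustc --version
cargo --version
docker --version
git --version
```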
## Quick Setup
### Step 1: Clone the Repository
```bash
git clone https://github.com/divanvisagie/reservoir.git
cd reservoir
```
### Step 2: Start Neo4j Database
You have several options for running Neo4j locally:
#### Option A: Docker Compose (Recommended)
```bash
docker-compose up -d
```
This starts Neo4j on the default `bolt://localhost:7687` with the credentials defined in the docker-compose file.
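If you want to see roughly what that file sets up, a minimal Neo4j service looks something like the sketch below; the repository's `docker-compose.yml` is the authoritative version, and the image tag, ports, and credentials here simply mirror the manual command in Option B.

```yaml
# Sketch only - see the repository's docker-compose.yml for the real definition
services:
  neo4j:
    image: neo4j:latest
    ports:
      - "7474:7474"   # HTTP web interface
      - "7687:7687"   # Bolt protocol
    environment:
      - NEO4J_AUTH=neo4j/password
    volumes:
      - ./neo4j/data:/data
```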
#### Option B: Docker Manual Setup
```bash
docker run \
  --name neo4j \
  -p 7474:7474 -p 7687:7687 \
  -d \
  -v $HOME/neo4j/data:/data \
  -v $HOME/neo4j/logs:/logs \
  -v $HOME/neo4j/import:/var/lib/neo4j/import \
  -v $HOME/neo4j/plugins:/plugins \
  --env NEO4J_AUTH=neo4j/password \
  neo4j:latest
```
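Neo4j can take a little while to accept connections on first start. A simple way to wait for it before starting Reservoir:

```bash
# Poll the HTTP interface until Neo4j responds (Ctrl+C to abort)
until curl -sf http://localhost:7474 > /dev/null; do
  echo "Waiting for Neo4j..."
  sleep 2
done
echo "Neo4j is up"
```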
#### Option C: Homebrew (macOS Service)
If you prefer to run Neo4j as a permanent background service:
```bash
brew install neo4j
brew services start neo4j
```
This starts Neo4j on `bolt://localhost:7687` and ensures it launches automatically when you log in.
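If you installed via Homebrew, `cypher-shell` is included, so you can confirm the database answers over Bolt. Note that a fresh Neo4j install asks you to change the default password on first login; substitute your own credentials below.

```bash
# Run a trivial query over Bolt to confirm connectivity
cypher-shell -a bolt://localhost:7687 -u neo4j -p password "RETURN 1;"
```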
### Step 3: Configure Environment Variables
Create a `.env` file in the project root, or export the following environment variables:
```bash
# Server Configuration
RESERVOIR_PORT=3017
RESERVOIR_HOST=127.0.0.1

# Database Configuration
NEO4J_URI=bolt://localhost:7687
NEO4J_USERNAME=neo4j
NEO4J_PASSWORD=password

# API Keys (required for respective providers)
OPENAI_API_KEY=sk-your-openai-key-here
MISTRAL_API_KEY=your-mistral-key-here
GEMINI_API_KEY=your-gemini-key-here

# Custom Provider URLs (optional)
RSV_OPENAI_BASE_URL=https://api.openai.com/v1/chat/completions
RSV_OLLAMA_BASE_URL=http://localhost:11434/v1/chat/completions
RSV_MISTRAL_BASE_URL=https://api.mistral.ai/v1/chat/completions
```
Note: Most environment variables have sensible defaults. Only the API keys for your chosen providers are required.
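If you prefer not to keep a `.env` file around, exporting just the variables you need for the current shell session also works, for example:

```bash
# Minimal one-off setup for an OpenAI-backed session; everything else uses defaults
export OPENAI_API_KEY=sk-your-openai-key-here
export RESERVOIR_PORT=3017
```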
### Step 4: Build and Run
#### Manual Execution
```bash
# Build the project
cargo build --release

# Run Reservoir
cargo run -- start
```
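Once built, you can also launch the release binary directly; the binary name `reservoir` matches the path used by the macOS service setup later in this guide.

```bash
# Run the optimized binary produced by cargo build --release
./target/release/reservoir start
```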
#### Using Make Commands
```bash
# Build the release binary
make main

# Run for development (with auto-reload)
make dev

# Run normally
make run
```
Reservoir will now be available at `http://localhost:3017`.
## Service Installation (macOS)
For a more permanent setup, you can install Reservoir as a macOS LaunchAgent service.
### Install the Service
```bash
make install-service
```
This command:
- Copies the LaunchAgent plist to `~/Library/LaunchAgents/`
- Loads the service using `launchctl`
- Starts Reservoir automatically in the background
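The plist shipped in `scripts/com.sectorflabs.reservoir.plist` is the authoritative definition; as a rough sketch, a LaunchAgent for Reservoir looks something like this (exact keys and paths may differ in the repository's version):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.sectorflabs.reservoir</string>
    <key>ProgramArguments</key>
    <array>
        <!-- launchd does not expand ~, so the binary path must be absolute -->
        <string>/Users/yourname/.cargo/bin/reservoir</string>
        <string>start</string>
    </array>
    <key>RunAtLoad</key>
    <true/>
    <key>StandardOutPath</key>
    <string>/tmp/reservoir.log</string>
    <key>StandardErrorPath</key>
    <string>/tmp/reservoir.err</string>
</dict>
</plist>
```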
### Service Management
Check service status:
```bash
launchctl list | grep reservoir
```
View service logs:
```bash
tail -f /tmp/reservoir.log
tail -f /tmp/reservoir.err
```
Manually start/stop the service:
```bash
# Start
launchctl start com.sectorflabs.reservoir

# Stop
launchctl stop com.sectorflabs.reservoir
```
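After editing the plist (for example to change environment variables or the binary path), unload and reload it so launchd picks up the changes. The filename below assumes the plist keeps its repository name once copied to `~/Library/LaunchAgents/`.

```bash
# Reload the LaunchAgent after changing its configuration
launchctl unload ~/Library/LaunchAgents/com.sectorflabs.reservoir.plist
launchctl load ~/Library/LaunchAgents/com.sectorflabs.reservoir.plist
```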
### Uninstall the Service
```bash
make uninstall-service
```
This removes the service and cleans up all related files.
## Verification

### Test the Installation
1. Check if Reservoir is running:

   ```bash
   curl http://localhost:3017/health
   ```

2. Test with a simple API call:

   ```bash
   curl "http://127.0.0.1:3017/partition/$USER/instance/test/v1/chat/completions" \
     -H "Content-Type: application/json" \
     -H "Authorization: Bearer $OPENAI_API_KEY" \
     -d '{
       "model": "gpt-4",
       "messages": [
         { "role": "user", "content": "Hello, Reservoir!" }
       ]
     }'
   ```

3. Run the test suite:

   ```bash
   ./hurl/test.sh
   ```
### Check Neo4j Connection
Verify that Neo4j is accessible:
```bash
# Check Neo4j web interface
open http://localhost:7474

# Test the HTTP discovery endpoint from the command line
curl -u neo4j:password http://localhost:7474
```
## Configuration Options

### Database Configuration
| Variable | Default | Description |
|---|---|---|
| `NEO4J_URI` | `bolt://localhost:7687` | Neo4j connection URI |
| `NEO4J_USERNAME` | `neo4j` | Database username |
| `NEO4J_PASSWORD` | `password` | Database password |
### Server Configuration
| Variable | Default | Description |
|---|---|---|
| `RESERVOIR_PORT` | `3017` | HTTP server port |
| `RESERVOIR_HOST` | `127.0.0.1` | HTTP server host |
### Provider Configuration
| Variable | Default | Description |
|---|---|---|
| `RSV_OPENAI_BASE_URL` | `https://api.openai.com/v1/chat/completions` | OpenAI API endpoint |
| `RSV_OLLAMA_BASE_URL` | `http://localhost:11434/v1/chat/completions` | Ollama API endpoint |
| `RSV_MISTRAL_BASE_URL` | `https://api.mistral.ai/v1/chat/completions` | Mistral API endpoint |
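The URL overrides are mainly useful when a provider is reachable somewhere other than its default address, for example an OpenAI-compatible proxy or an Ollama server on another host. The values below are placeholders, not defaults:

```bash
# Example overrides in .env (hypothetical endpoints)
RSV_OPENAI_BASE_URL=http://localhost:8080/v1/chat/completions
RSV_OLLAMA_BASE_URL=http://192.168.1.50:11434/v1/chat/completions
```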
## Troubleshooting

### Common Issues
**Port Already in Use:**

```bash
# Check what's using port 3017
lsof -i :3017

# Use a different port
export RESERVOIR_PORT=3018
```

**Neo4j Connection Failed:**

```bash
# Check if Neo4j is running
docker ps | grep neo4j

# Check Neo4j logs
docker logs neo4j
```

**Permission Issues (macOS Service):**

```bash
# Ensure the binary path is correct in the plist
ls -la ~/.cargo/bin/reservoir

# Update the path in scripts/com.sectorflabs.reservoir.plist if needed
```

**API Key Issues:**

```bash
# Verify your API key is set
echo $OPENAI_API_KEY

# Test the key directly with OpenAI
curl https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"
```
## Performance Tuning
For better performance in local deployment:
1. Increase Neo4j memory allocation (see the `docker run` sketch after this list if you aren't using Docker Compose):

   ```bash
   # In docker-compose.yml, add:
   NEO4J_dbms_memory_heap_initial__size=512m
   NEO4J_dbms_memory_heap_max__size=2G
   ```

2. Use SSD storage for Neo4j data:

   ```bash
   # Mount Neo4j data on fast storage
   -v /path/to/fast/storage:/data
   ```

3. Optimize connection pooling:

   ```bash
   # Add to .env
   NEO4J_MAX_CONNECTIONS=20
   NEO4J_CONNECTION_TIMEOUT=30s
   ```
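If you started Neo4j with the manual `docker run` command from Option B rather than Docker Compose, the same heap settings can be passed as `--env` flags. This is only a sketch; tune the values to your machine and keep any volume mounts you need:

```bash
# docker run with larger heap settings (abbreviated volume list)
docker run --name neo4j -d \
  -p 7474:7474 -p 7687:7687 \
  -v $HOME/neo4j/data:/data \
  --env NEO4J_AUTH=neo4j/password \
  --env NEO4J_dbms_memory_heap_initial__size=512m \
  --env NEO4J_dbms_memory_heap_max__size=2G \
  neo4j:latest
```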
## Next Steps
After successful local deployment:
- Configure Environment Variables
- Set up Production Deployment
- Learn about API Usage
- Explore Chat Gipitty Integration
Your Reservoir instance is now ready for local development and testing!