Getting Started
This guide will walk you through deploying Mulberry, creating an account, generating API keys, and making your first API call.
1. Deploy Mulberry
Mulberry is designed to run on your own infrastructure. The fastest way to get started is with Docker Compose.
Prerequisites
- A server or VM with Docker and Docker Compose installed
- At least 2 vCPUs and 4 GB of RAM recommended
- A domain name (optional, for SSL)
Quick Install
git clone https://github.com/goodway/mulberry
cd mulberry
docker compose up -d

This starts Mulberry with PostgreSQL and all required services. By default, the application runs on port 4000.
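To verify the stack came up cleanly, you can check the container status and hit the port directly. A minimal sketch, assuming the default port of 4000; the service name used for logs depends on your docker-compose.yml:

# List running services and their state
docker compose ps

# Tail application logs if something looks wrong (the service name
# "app" is an assumption; check docker-compose.yml for the real one)
docker compose logs -f app

# The app should respond on the default port once it has booted
curl -i http://localhost:4000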
Environment Configuration
Copy the example environment file and configure your settings:
cp .env.example .env

Key settings to configure:
- SECRET_KEY_BASE - Generate with mix phx.gen.secret
- DATABASE_URL - PostgreSQL connection string
- PHX_HOST - Your domain name
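For reference, a minimal .env sketch with placeholder values. The DATABASE_URL follows the standard PostgreSQL connection-string format, and the db hostname assumes the Compose file names its PostgreSQL service db; substitute your own credentials:

# Generate with: mix phx.gen.secret (requires Elixir locally)
SECRET_KEY_BASE=replace-with-generated-secret

# Standard PostgreSQL URL; "db" assumes that's the Compose service name
DATABASE_URL=postgres://mulberry:change-me@db:5432/mulberry_prod

# The domain the app is served from
PHX_HOST=mulberry.example.com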
2. Create Your Account
Once Mulberry is running, navigate to your server's URL and sign up for an account.
- Visit https://your-server/
- Click "Sign Up" or "Get Started"
- Enter your email address
- Check your email for a magic link (or set a password if password auth is enabled)
- Click the link to complete registration
3. Generate API Keys
API keys authenticate your requests to the Mulberry API. You can create keys from the dashboard.
- Navigate to Settings → API Keys
- Click "Create API Key"
- Choose key type:
  - Private key (sk_) - Full read/write access
  - Public key (pk_) - Read-only access
- Give your key a descriptive name
- Optionally set an expiration date
- Copy the key immediately - it won't be shown again
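One way to avoid repeating the raw key in every command is to keep it in a shell variable for the session. A minimal sketch; the variable name MULBERRY_API_KEY is just a local convention, not something Mulberry defines:

# Store the private key for the current shell session
export MULBERRY_API_KEY="sk_your_api_key"

# Reference it in requests instead of pasting the literal key
curl -H "Authorization: Bearer $MULBERRY_API_KEY" \
  https://your-server/api/crawls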
4. Make Your First API Call
Test your setup by listing your crawls (which will be empty initially):
curl -H "Authorization: Bearer sk_your_api_key" \
https://your-server/api/crawls You should receive a response like:
{
  "data": [],
  "meta": {
    "total": 0,
    "page": 1,
    "per_page": 20
  }
}

Now create your first crawl:
curl -X POST \
  -H "Authorization: Bearer sk_your_api_key" \
  -H "Content-Type: application/json" \
  -d '{"url": "https://example.com", "depth": 1}' \
  https://your-server/api/crawls

The response includes your new crawl's ID and status:
{
  "data": {
    "id": "crawl_abc123",
    "url": "https://example.com",
    "status": "pending",
    "depth": 1,
    "created_at": "2024-01-15T10:30:00Z"
  }
}
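The crawl is created in the pending status. Because the list endpoint returns the status of each crawl, you can poll it to watch your crawl progress. A rough sketch using jq for JSON parsing (jq must be installed; MULBERRY_API_KEY is the variable suggested earlier, and the 5-second interval is arbitrary):

# Print each crawl's ID and current status every 5 seconds
while true; do
  curl -s -H "Authorization: Bearer $MULBERRY_API_KEY" \
    https://your-server/api/crawls \
    | jq -r '.data[] | "\(.id)\t\(.status)"'
  sleep 5
done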