- Deno, latest version: For the server, SDK, and CLI.
- Node.js, v20 or higher: For the console and documentation.
- Git: For version control.
## Set up the console
The console orchestrates all other resources. In local mode, it uses a mock identity provider file.

### Configure environment
Copy the example environment file. Skip the WorkOS variables to trigger local development mode.
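A minimal sketch of this step, assuming the example file is named `.env.example` and the console reads `.env` (check the repository for the actual filenames). The heredoc only creates a stand-in example file so the snippet is self-contained:

```shell
# Assumed filenames: .env.example and .env may differ in the actual repo.
# Create a stand-in example file so this snippet runs on its own:
cat > .env.example <<'EOF'
# WorkOS variables intentionally omitted to trigger local development mode
LOCAL_USER_EMAIL=dev@example.com
LIBSQL_URL=file:./worlds.db
EOF

# Copy it to the environment file the console reads:
cp .env.example .env
```

Because the WorkOS variables are absent from the copied file, the console falls back to local development mode on startup.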
### Run the console
## Booting world instances
Once the console runs, it automatically manages local Worlds API processes.

- Automatic boot: On startup, the console checks the local identity database and spawns a server process for every local organization.
- Dynamic provisioning: When you create a new organization in the console UI:
  - The system generates a local API key.
  - The system creates a local organization database.
  - The system spawns a new Worlds API server on an available local port.
## Customizing the local user
In local development mode, you can customize the identity of the mock user session by setting environment variables in the console configuration file.

## Configuration reference
| Variable | Description |
|---|---|
| `LIBSQL_URL` | Optional. SQLite URL. Defaults to `file:./worlds.db`. |
| `LOCAL_USER_EMAIL` | Optional. Email for the mock local user. |
| `LOCAL_USER_FIRST_NAME` | Optional. First name for the mock local user. |
| `LOCAL_USER_ID` | Optional. User ID for the mock local user. |
| `LOCAL_USER_LAST_NAME` | Optional. Last name for the mock local user. |
| `OLLAMA_BASE_URL` | Optional. Ollama API URL. Defaults to `http://localhost:11434`. |
| `OLLAMA_EMBEDDINGS_MODEL` | Optional. Ollama embeddings model. Defaults to `nomic-embed-text`. |
| `OPENROUTER_API_KEY` | Optional. API key for cloud-based embeddings via OpenRouter. |
| `OPENROUTER_EMBEDDINGS_MODEL` | Optional. OpenRouter embeddings model. Defaults to `openai/text-embedding-3-small`. |
| `WORLDS_BASE_DIR` | Optional. Base directory for world data. Defaults to `./worlds`. |
| `WORLDS_EMBEDDINGS_DIMENSIONS` | Optional. Unified vector dimensions for the database and AI models. Defaults to `768`. |
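The reference above can be turned into a concrete environment snippet. The variable names and defaults come from the table; the user-identity values are illustrative placeholders, and every line is optional:

```shell
# Illustrative values only; all of these variables are optional.
export LIBSQL_URL="file:./worlds.db"
export LOCAL_USER_EMAIL="dev@example.com"        # placeholder identity
export LOCAL_USER_FIRST_NAME="Dev"               # placeholder identity
export LOCAL_USER_LAST_NAME="User"               # placeholder identity
export OLLAMA_BASE_URL="http://localhost:11434"  # table default
export OLLAMA_EMBEDDINGS_MODEL="nomic-embed-text"
export WORLDS_BASE_DIR="./worlds"
export WORLDS_EMBEDDINGS_DIMENSIONS="768"        # must match DB and model
```

Keeping `WORLDS_EMBEDDINGS_DIMENSIONS` in sync with the embeddings model matters: the database schema and the AI provider must agree on vector width.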
## Local embeddings with Ollama
By default, the Worlds Platform uses Ollama for local embeddings.

### Install Ollama
Download and install Ollama from ollama.com.
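After installing, you can verify that the local Ollama server is reachable on its default port (the `OLLAMA_BASE_URL` default from the table above). This probe uses Ollama's `/api/tags` endpoint, which lists installed models, and degrades gracefully if the server is not running:

```shell
# Probe the Ollama API; /api/tags lists the locally installed models.
# Honors OLLAMA_BASE_URL if set, otherwise uses the documented default.
if curl -fsS "${OLLAMA_BASE_URL:-http://localhost:11434}/api/tags" >/dev/null 2>&1; then
  OLLAMA_UP=yes
else
  OLLAMA_UP=no
fi
echo "Ollama reachable: $OLLAMA_UP"
```

Once the server is up, `ollama pull nomic-embed-text` downloads the default embeddings model used by the platform.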