The Worlds Platform utilizes a managed neuro-symbolic infrastructure designed for edge-distributed, agentic memory. Built on the Deno runtime, it separates the Worlds Console for management from the Worlds API for high-performance execution.

The connection model

The platform connection model relies on three core pillars:
  • Your AI Agent (The Client): Use the @wazoo/worlds-sdk or standard HTTP requests inside your agent (e.g., Claude, Gemini, or a custom script) to connect to the platform.
  • The World (The Memory): A “World” is an isolated, secure sandbox for your agent’s memories. Every request securely targets a specific World using an API key and the World’s unique ID.
  • The Console (The Dashboard): Use the Worlds Console to generate API keys, visually explore stored data, and manage billing.
In short: Your agent uses an API key to read and write facts to a specific World.
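The connection model above can be sketched as a small helper. This is a hypothetical illustration: the base URL, the `worldRequest` helper, and the Bearer-token header shape are assumptions, not the SDK's actual surface; only the `/v1/worlds/{id}` path shape comes from the platform docs.

```typescript
// Hypothetical helper illustrating the connection model: an API key plus a
// World id addresses exactly one isolated World. WORLDS_API_BASE and the
// Authorization header format are assumptions for illustration.
const WORLDS_API_BASE = "https://api.example.com";

function worldRequest(worldId: string, apiKey: string, path = ""): Request {
  // Every request targets a specific World under /v1/worlds/{id}.
  return new Request(`${WORLDS_API_BASE}/v1/worlds/${worldId}${path}`, {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
}
```

The same pattern applies whether the agent uses the `@wazoo/worlds-sdk` or raw HTTP: credentials plus a World id scope every read and write.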

High-level overview

The diagram illustrates the relationship between the management layer, reasoning engine, and isolated persistent state.

Worlds Console vs. Worlds API

The platform splits operations into two primary layers:

Worlds Console

The Worlds Console acts as the system’s control plane. It manages identity through WorkOS, handles organization-level provisioning, and orchestrates Worlds API instances.

Worlds API

The Worlds API Server handles RDF graph management, SPARQL execution, and hybrid search. This is the API layer where your information lives.
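To make the RDF/SPARQL layer concrete, here is a generic SPARQL query an agent might run against a World's graph. The prefix and predicate IRIs are illustrative placeholders, not the platform's actual vocabulary or request payload format.

```typescript
// A generic SPARQL SELECT over RDF triples. The ex: vocabulary is a
// placeholder; the real graph schema is defined by your World's data.
const query = `
  PREFIX ex: <http://example.org/>
  SELECT ?fact WHERE {
    ?memory ex:about "project-roadmap" ;
            ex:content ?fact .
  }
`;
```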

Worlds API deep dive

The World Engine follows a multi-tiered data transformation journey, fusing diverse input sources into a verifiable neuro-symbolic knowledge state.

Resource hierarchy

Organizations host Worlds, providing strict data isolation between contexts and allowing each to scale independently.

Worlds

Each World is a specific context or knowledge graph managed by the server.
  • Dedicated storage: Each World maintains its own secondary SQLite database for triples, chunks, and embeddings.
  • Isolation: Worlds are accessed via /v1/worlds/{id}, ensuring zero cross-contamination between contexts.
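One way to picture the dedicated-storage guarantee: each World id resolves to its own SQLite file, so no query can touch another World's triples. The directory layout and the `worldDbPath` helper below are assumptions for illustration, not the server's actual file scheme.

```typescript
// Sketch of per-World storage isolation: one SQLite database file per World,
// holding that World's triples, chunks, and embeddings. The ./data layout is
// a hypothetical convention.
function worldDbPath(worldId: string, dataDir = "./data"): string {
  return `${dataDir}/worlds/${worldId}.sqlite`;
}
```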

Deno runtime

The Worlds Platform is built on Deno, which provides several advantages for a security-sensitive knowledge platform:
  • Secure by default: Deno’s permission model requires explicit grants for network, file system, and environment access — reducing the attack surface of each deployment.
  • Web-standard APIs: The server exports a standard fetch handler, making it natively compatible with Deno Deploy and other edge runtimes.
  • TypeScript-native: No build step or transpiler configuration required. The entire codebase is TypeScript from source to execution.
  • Edge-ready: First-class support for Deno Deploy enables low-latency deployments close to users.
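The web-standard fetch handler mentioned above can be sketched as follows. The `/health` route is illustrative only; the point is that the server is just an object with a `fetch(Request) => Response` method, which Deno Deploy can serve directly.

```typescript
// Minimal sketch of the web-standard fetch handler pattern. Edge runtimes
// invoke handler.fetch once per incoming request; no framework required.
const handler = {
  fetch(req: Request): Response {
    const url = new URL(req.url);
    if (url.pathname === "/health") return new Response("ok");
    return new Response("not found", { status: 404 });
  },
};

// A Deno Deploy entrypoint would export this object:
//   export default handler;
```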

Monorepo topology

The ecosystem uses a Deno workspace. The sdk package serves as the primary bridge for the CLI, AI-SDK, and Console to communicate with the API Server.

Repository layout

The server follows a modular layout organized by service and resource:
  • lib/: Shared logic for RDF/SPARQL handling, database management, and embeddings.
  • middleware/: Authentication guards.
  • routes/: Implementation of the v1 API endpoints.

Request flow

The Worlds Server follows a structured lifecycle for initialization and request handling. For a detailed breakdown, refer to the Request flow reference.

Design principles

Polymorphic resource managers

A key design feature is the use of hot-swappable resource managers. The core logic remains identical, while the implementation swaps based on the environment:
  • Compute: local child processes in development; the Deno Deploy edge runtime in production.
  • Storage: local SQLite files in development; SQLite / Turso in production.
  • Identity: a mock identity file in development; the WorkOS Identity Service in production.
This pattern allows the entire stack to run locally with zero cloud dependencies.
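The hot-swap pattern can be sketched with a shared interface and per-environment implementations. All names here (`StorageManager`, `LocalSqliteStorage`, `TursoStorage`, `createStorage`) are hypothetical; only the pattern itself is described by the source.

```typescript
// Hypothetical sketch of a hot-swappable resource manager: callers depend on
// one interface, and the environment selects the backing implementation.
interface StorageManager {
  read(key: string): string | undefined;
  write(key: string, value: string): void;
}

class LocalSqliteStorage implements StorageManager {
  private data = new Map<string, string>(); // stand-in for a local SQLite file
  read(key: string) { return this.data.get(key); }
  write(key: string, value: string) { this.data.set(key, value); }
}

class TursoStorage implements StorageManager {
  // In production this would call out to Turso; stubbed here.
  read(_key: string): string | undefined { throw new Error("network stub"); }
  write(_key: string, _value: string): void { throw new Error("network stub"); }
}

function createStorage(env: "local" | "production"): StorageManager {
  return env === "local" ? new LocalSqliteStorage() : new TursoStorage();
}
```

Because the rest of the stack only ever sees `StorageManager`, switching from local SQLite files to Turso is a one-line change in the factory, which is what lets the whole platform run offline in development.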