The final step in mastering Worlds is connecting an AI agent, such as Gemini, Claude, or ChatGPT, to your world memory.

Neuro-symbolic architecture

This integration bridges two paradigms:
  • Neural: The Large Language Model (LLM) that processes natural language.
  • Symbolic: The Worlds Graph that stores structured facts and follows logic.
By providing the LLM with tools to read and write to the graph, you enable the agent to maintain a persistent state and perform complex reasoning.

Using the AI SDK

The worlds-ai-sdk provides pre-built tools for frameworks like the Vercel AI SDK, making integration straightforward.
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { WorldsSdk } from "@wazoo/worlds-sdk";
import { createTools } from "@wazoo/worlds-ai-sdk";

// Point the SDK at your Worlds instance and authenticate.
const sdk = new WorldsSdk({
  baseUrl: "http://localhost:8000",
  apiKey: "your-api-key",
});

// createTools exposes graph read/write tools to the model; the agent
// decides when to invoke them while answering the prompt.
const { text } = await generateText({
  model: openai("gpt-4o"),
  tools: createTools({
    sdk,
    sources: [{ id: "my-knowledge-base" }],
  }),
  prompt: "Find all team members and summarize their roles.",
});

System flow

  1. User prompt: You ask the agent a question.
  2. Tool selection: The agent determines it needs information from its “World” and triggers a query tool.
  3. Execution: The worlds-ai-sdk executes a SPARQL query against the Worlds API.
  4. Inference: The graph returns structured data, which the agent uses to generate a factual response.
To learn more about connecting and monitoring these integrations in production, see our guides on Data Integration and Monitoring.

Next step

Proceed to Commencement to claim your certificate and explore the next stage of your journey.