OpenClaw memory plugin backed by Cloudflare Vectorize for storage/search and Workers AI for embeddings.
It provides:

- `kind: "memory"` OpenClaw plugin
- Cloudflare Workers AI embedding provider adapter: `cloudflare-workers-ai`
- Cloudflare-backed memory tools: `cloudflare_memory_search`, `cloudflare_memory_get`, `cloudflare_memory_upsert`, `cloudflare_memory_delete`
- CLI commands under `cf-memory`
- Two storage modes:
  - `vectorize-inline` (default): stores retrievable text directly in Vectorize metadata
  - `companion-store`: stores vectors in Vectorize and full payloads in a local JSON sidecar
- Migration support for legacy markdown-based memory corpora
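The field names below are illustrative only, not the plugin's actual schema; they sketch how the two storage modes differ in where a record's retrievable text lives. In `vectorize-inline` the text rides along in vector metadata, while in `companion-store` Vectorize keeps only the vector and a local JSON sidecar holds the full payload:

```json
{
  "vectorize-inline": {
    "id": "testing-style",
    "values": [0.12, -0.03, 0.88],
    "metadata": { "text": "Use Vitest for plugin tests.", "topic": "testing" }
  },
  "companion-store": {
    "vectorize": { "id": "testing-style", "values": [0.12, -0.03, 0.88] },
    "sidecar": {
      "testing-style": { "text": "Use Vitest for plugin tests.", "topic": "testing" }
    }
  }
}
```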
Requirements:

- Node 22+
- OpenClaw 2026.4.11+
- A Cloudflare API token with the permissions needed for:
  - Workers AI
  - Vectorize

If you want the plugin to create the index for you, the token must include write permissions.
Install with:

```
openclaw plugins install openclaw-cloudflare-vectorize-memory
```

Publish the package with the included `openclaw.plugin.json` manifest and install it through normal ClawHub/OpenClaw plugin flows.
OpenClaw uses the plugin manifest id `memory-cloudflare-vectorize` as the config key, so plugin config stays under `plugins.entries.memory-cloudflare-vectorize`.
This package also ships an optional managed hook named `cloudflare-memory-bootstrap`.
Enable it after installation with:

```
openclaw hooks enable cloudflare-memory-bootstrap
```

When enabled, the hook injects packaged bootstrap guidance so agents know the Cloudflare memory plugin is installed and can point operators at `openclaw cf-memory doctor`.
After authenticating with npm for the target package owner, publish with:

```
npm run publish:npmjs
```

The script runs `check`, `test`, and `build` before calling `npm publish --access public`.
Before publishing, fill in the placeholder values in `.env`, set `OPENCLAW_CF_MEMORY_RUN_LIVE_INTEGRATION=1`, and run:

```
npm run test:integration
```

The live integration suite rebuilds the package first, resolves Cloudflare settings from `.env`, fills any untouched placeholder values from `.env.example`, and exercises `doctor --create-index` plus a real upsert / search / delete round-trip against the configured backend. The default `npm test` command keeps running only the fast local test suite.

The repository includes `.env` and `.env.example` templates for the live integration tests. Replace the placeholder values in `.env` before enabling the suite.
Cloudflare-standard variables:

```
set CLOUDFLARE_ACCOUNT_ID=your-account-id
set CLOUDFLARE_API_TOKEN=your-api-token
```

Plugin-specific variables:

```
set CLOUDFLARE_VECTORIZE_INDEX_NAME=openclaw-memory
set CLOUDFLARE_WORKERS_AI_EMBEDDING_MODEL=@cf/baai/bge-base-en-v1.5
set CLOUDFLARE_VECTORIZE_TOP_K=5
```

Optional:

```
set CLOUDFLARE_VECTORIZE_NAMESPACE=my-shared-namespace
set OPENCLAW_CF_MEMORY_STORAGE_MODE=companion-store
set OPENCLAW_CF_MEMORY_COMPANION_PATH=C:\path\to\companion-store.json
set OPENCLAW_CF_MEMORY_TEST_NAMESPACE_PREFIX=cf-memory-live
```

If `CLOUDFLARE_VECTORIZE_NAMESPACE` is omitted, the plugin derives namespaces from the active OpenClaw agent/session when possible.
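The namespace fallback described above amounts to a small precedence rule. The sketch below is illustrative only; the function name and the `agent:session` joining scheme are assumptions, not the plugin's actual implementation:

```typescript
// Sketch of the namespace precedence: explicit env var first, then
// agent/session derivation, then a shared default. Names are hypothetical.
function resolveNamespace(
  envNamespace: string | undefined,
  agentId: string | undefined,
  sessionId: string | undefined,
): string {
  // An explicit CLOUDFLARE_VECTORIZE_NAMESPACE always wins.
  if (envNamespace && envNamespace.length > 0) return envNamespace;
  // Otherwise derive from the active agent/session when possible.
  if (agentId) return sessionId ? `${agentId}:${sessionId}` : agentId;
  // Last resort: a shared default namespace.
  return "default";
}
```

Pinning `CLOUDFLARE_VECTORIZE_NAMESPACE` is the way to get a stable shared namespace across agents and sessions.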
Example OpenClaw configuration:

```json
{
  "plugins": {
    "entries": {
      "memory-cloudflare-vectorize": {
        "vectorize": {
          "indexName": "openclaw-memory",
          "topK": 8,
          "createIndex": {
            "metric": "cosine"
          },
          "metadataIndexedFields": ["topic", "tenant"]
        },
        "embeddings": {
          "model": "@cf/baai/bge-base-en-v1.5"
        },
        "storage": {
          "mode": "vectorize-inline"
        }
      }
    }
  }
}
```

You can also store `cloudflare.apiToken` as an OpenClaw secret ref instead of plaintext.
Run:
```
openclaw cf-memory init
```

to create or repair the configured Vectorize index so it matches the active embedding model dimensions.
Validate configuration without changing infrastructure:
```
openclaw cf-memory doctor
```

Validate configuration and create the Vectorize index when missing:

```
openclaw cf-memory doctor --create-index
```

Run an end-to-end smoke test that verifies embedding, write, search, and cleanup:

```
openclaw cf-memory test
```

The doctor flow checks:
- Cloudflare credentials
- Vectorize index reachability
- Workers AI embedding dimensions
- embedding/index dimension compatibility
- metadata-index guidance for filter-heavy queries
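The dimension-compatibility check above can be sketched as a pure function. The result shape is an assumption for illustration; the 768 figure is the output dimension of `@cf/baai/bge-base-en-v1.5`:

```typescript
// Hypothetical sketch of the doctor flow's embedding/index dimension check.
function checkDimensionCompatibility(
  embeddingDims: number,
  indexDims: number,
): { ok: boolean; message: string } {
  if (embeddingDims === indexDims) {
    return { ok: true, message: `dimensions match (${indexDims})` };
  }
  return {
    ok: false,
    message:
      `embedding model emits ${embeddingDims}-dim vectors but the index ` +
      `expects ${indexDims}; run \`openclaw cf-memory init\` to repair it`,
  };
}
```

A mismatch here is the most common failure mode after switching embedding models, which is why `init` recreates the index to match the active model.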
Initialize or repair the Vectorize index:
```
openclaw cf-memory init
```

Run a smoke test:

```
openclaw cf-memory test
```

Migrate the default OpenClaw markdown memory corpus from the current workspace:

```
openclaw cf-memory migrate
```

Preview a migration without writing anything:

```
openclaw cf-memory migrate --dry-run
```

Migrate specific markdown directories or glob patterns:

```
openclaw cf-memory migrate memories docs\notes\*.md
```

Import everything into a single namespace override:

```
openclaw cf-memory migrate memories --namespace imported-legacy
```

Derive namespaces from the first relative path segment instead:

```
openclaw cf-memory migrate memories --derive-namespace-from-path
```

Control duplicate handling on reruns:

```
openclaw cf-memory migrate memories --if-exists skip
```

By default, `migrate` overwrites records with the same derived logical id so reruns refresh previously imported content. Supported v1 sources are:
- explicit markdown files, directories, and glob patterns
- the default OpenClaw memory provider's readable markdown corpus when no sources are passed
The migration command stores the original source path in record metadata and reuses the normal Cloudflare upsert pipeline so embeddings, namespace handling, and storage-mode behavior stay consistent.
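The `--derive-namespace-from-path` behavior can be pictured as follows. This is a sketch under stated assumptions, not the command's actual code: the normalization rules, the `default` fallback namespace, and the use of the relative path as the logical id are all illustrative:

```typescript
// Hypothetical sketch of path-derived namespaces during migration.
function deriveNamespaceAndId(relativePath: string): {
  namespace: string;
  id: string;
} {
  // Normalize Windows separators and strip a leading "./".
  const normalized = relativePath.replace(/\\/g, "/").replace(/^\.\//, "");
  const segments = normalized.split("/");
  // First path segment becomes the namespace; root-level files fall back.
  const namespace = segments.length > 1 ? segments[0] : "default";
  // The normalized relative path doubles as a stable logical id, so
  // re-running the migration overwrites the same record.
  return { namespace, id: normalized };
}
```

A stable id derived from the source path is what makes reruns idempotent: the same file always maps to the same record, so a refreshed migration overwrites rather than duplicates.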
Search:
```
openclaw cf-memory search "preferred coding style" --limit 5
```

Upsert:

```
openclaw cf-memory upsert "Use Vitest for plugin tests." --id testing-style --metadata "{\"topic\":\"testing\"}"
```

Delete:

```
openclaw cf-memory delete testing-style
```

Notes:

- `vectorize-inline` is the easiest mode, but it is limited by Vectorize metadata size limits.
- Use `companion-store` when memory payloads are too large to fit comfortably in metadata.
- Metadata filters in Vectorize require metadata indexes on the Cloudflare side. Configure those before relying on filter-heavy recall.
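Choosing between the two modes comes down to payload size versus the metadata budget. The helper below is hypothetical, and the ~10 KiB per-vector metadata budget is an assumption; check Cloudflare's current Vectorize limits before relying on it:

```typescript
// Hypothetical helper for picking a storage mode by payload size.
// The budget constant is an assumption, not a documented Vectorize limit.
const ASSUMED_METADATA_BUDGET_BYTES = 10 * 1024;

function suggestStorageMode(
  payload: string,
): "vectorize-inline" | "companion-store" {
  // Measure encoded byte length, not character count, since metadata
  // limits apply to serialized bytes and text may be multi-byte UTF-8.
  const bytes = new TextEncoder().encode(payload).length;
  return bytes <= ASSUMED_METADATA_BUDGET_BYTES
    ? "vectorize-inline"
    : "companion-store";
}
```

In practice the mode is set once via `storage.mode` or `OPENCLAW_CF_MEMORY_STORAGE_MODE`; a size check like this is only useful when deciding which mode fits your corpus.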