Episode 26: OpenClaw Gets a Brain Transplant, Glasswing, Giant Brains, and Cloned Writers
Published 1 week ago
Description
[00:00] INTRO / HOOK
OpenClaw 2026.4.8 drops a unified inference layer, session checkpointing,
and a restored memory stack. Anthropic's Glasswing coalition, MegaTrain's
single-GPU frontier training, and a study suggesting your writing AI might
just be a Claude knockoff.
[02:00] STORY 1 — OpenClaw 2026.4.8: The Release That Changes How It All Works
Six major subsystems land in one release.
The first is the infer hub CLI — openclaw infer hub — a unified interface
for provider-backed inference across model tasks, media generation, web
search, and embeddings. It routes requests to the right provider, handles
auth, remaps parameters across provider capability differences, and
falls back automatically if a provider is down or rate-limited. If you
have been managing multiple provider configs across different workflows,
the hub becomes the single abstraction layer. Provider switches become
config changes at the hub level; the rest of your workflow is unchanged.
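The episode doesn't show the hub's internals, but the routing-with-fallback pattern it describes can be sketched roughly like this. Every name below (ProviderDown, call_provider, infer) is a hypothetical stand-in, not OpenClaw's actual API:

```python
class ProviderDown(Exception):
    """Raised when a provider is unavailable or rate-limited."""

def call_provider(name: str, task: str, payload: dict) -> dict:
    """Stand-in for a real provider SDK call; always fails in this sketch."""
    raise ProviderDown(name)

def infer(task: str, payload: dict, providers: list[str],
          call=call_provider) -> dict:
    """Try each configured provider in order, falling back on failure."""
    errors = []
    for name in providers:
        try:
            return call(name, task, payload)
        except ProviderDown as exc:
            errors.append(str(exc))  # record the failure, try the next provider
    raise RuntimeError(f"all providers failed: {errors}")
```

The point of the abstraction is visible in the signature: callers name a task and a payload, and the provider order is pure configuration.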
The second is the media generation auto-fallback system, covering image,
music, and video. If your primary provider is unavailable or does not
support the specific capability you requested — aspect ratio, duration,
format — OpenClaw routes to the next configured provider and adjusts
parameters automatically. One failed generation is an inconvenience. A
thousand per day across a production fleet is an operational problem. This
is handled once at the platform level; every agent benefits immediately.
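As a rough illustration of the parameter-adjustment half of that story, here is a toy capability table and remap step. The table contents, provider names, and function shape are invented for illustration, not OpenClaw's real metadata format:

```python
# Hypothetical per-provider capability tables.
CAPS = {
    "imagegen-a": {"aspect_ratios": {"1:1", "16:9"}, "max_duration": None},
    "videogen-b": {"aspect_ratios": {"16:9"}, "max_duration": 30},
}

def adjust_params(provider: str, params: dict) -> dict:
    """Remap requested parameters to what the provider actually supports."""
    caps = CAPS[provider]
    out = dict(params)
    # Fall back to a supported aspect ratio if the request isn't available.
    if out.get("aspect_ratio") not in caps["aspect_ratios"]:
        out["aspect_ratio"] = sorted(caps["aspect_ratios"])[0]
    # Clamp duration to the provider's ceiling, if it has one.
    limit = caps["max_duration"]
    if limit is not None and out.get("duration", 0) > limit:
        out["duration"] = limit
    return out
```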
The third is the sessions UI branch and restore functionality. When
context compaction runs, the system now snapshots session state before
summarising. Operators can use the Sessions UI to inspect checkpoints and
restore to a pre-compaction state, or use any checkpoint as a branch point
to explore a different direction without losing the original thread. This
is version history for session context — the difference between editing
with autosave and editing where every save overwrites the previous file.
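The checkpoint-then-summarise flow can be modelled in a few lines. SessionStore and its methods are illustrative stand-ins, not the actual Sessions UI backend:

```python
import copy
import uuid

class SessionStore:
    """Toy model of checkpoint-before-compaction with branch and restore."""

    def __init__(self, messages):
        self.messages = messages
        self.checkpoints = {}  # checkpoint id -> pre-compaction snapshot

    def compact(self, summariser):
        cid = str(uuid.uuid4())
        self.checkpoints[cid] = copy.deepcopy(self.messages)  # snapshot first
        self.messages = [summariser(self.messages)]           # then summarise
        return cid

    def restore(self, cid):
        """Roll the live session back to a pre-compaction snapshot."""
        self.messages = copy.deepcopy(self.checkpoints[cid])

    def branch(self, cid):
        """Start a new session from a checkpoint, leaving this one intact."""
        return SessionStore(copy.deepcopy(self.checkpoints[cid]))
```

The invariant that matters is the ordering: the snapshot is taken before summarisation runs, so nothing compaction discards is ever unrecoverable.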
The fourth is the full restoration of the memory and wiki stack. This
includes structured claim and evidence fields, compiled digest retrieval,
claim-health linting, contradiction clustering, staleness dashboards, and
freshness-weighted search. Claims can be tagged with supporting evidence,
linted for internal consistency, and grouped where they contradict each
other. Search results are ranked by recency, not just relevance. If you
have been working around missing pieces in prior versions, this is the
native implementation — test your workflow against it.
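Freshness-weighted ranking, in the simplest form consistent with that description, decays a relevance score by document age. The half-life value and function shape here are assumptions, not OpenClaw's actual scoring:

```python
def freshness_weighted_score(relevance: float, age_days: float,
                             half_life_days: float = 30.0) -> float:
    """Decay relevance by age: a claim loses half its weight per half-life."""
    return relevance * 0.5 ** (age_days / half_life_days)

def rank(claims: list[dict]) -> list[dict]:
    """Order search hits by freshness-weighted score, best first."""
    return sorted(
        claims,
        key=lambda c: freshness_weighted_score(c["relevance"], c["age_days"]),
        reverse=True,
    )
```

Under this scheme a slightly less relevant but current claim outranks a highly relevant one that is three half-lives stale.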
The fifth is the webhook ingress plugin. Per-route shared-secret endpoints
let external systems authenticate and trigger bound TaskFlows directly —
CI pipelines, monitoring tools, scheduled jobs, third-party webhooks —
without custom integration code. The plugin handles routing, auth, and
workflow binding.
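Per-route shared-secret authentication typically means an HMAC over the request body, so a generic sketch of that pattern looks like the following. The route table and signature format are assumptions, not the plugin's actual wire format:

```python
import hashlib
import hmac

# Hypothetical per-route shared secrets.
ROUTE_SECRETS = {"/hooks/deploy": b"s3cret"}

def verify(route: str, body: bytes, signature_hex: str) -> bool:
    """Constant-time check of an HMAC-SHA256 signature over the body."""
    secret = ROUTE_SECRETS.get(route)
    if secret is None:
        return False  # unknown route: reject
    expected = hmac.new(secret, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature_hex)
```

The constant-time comparison matters: naive string equality leaks timing information an attacker can use to recover a valid signature byte by byte.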
The sixth is the pluggable compaction provider registry. You can now route
context compaction to a different model or service via
agents.defaults.compaction.provider — a faster, cheaper model optimised
for summarisation rather than the most capable model you have. The registry
falls back to built-in LLM summarisation on failure. At scale, compaction is
happening constantly; routing it appropriately matters for cost and
latency.
Other notable additions: Google Gemma 4 is now natively supported with
thinking semantics preserved and Google fallback resolution fixed. Claude
CLI is restored as the preferred local Anthropic path across onboarding,
doctor flows, and Docker live lanes. Ollama vision models now accept image
attachments natively — vision capability is detected from /api/show, no
workarounds required. The memory and dreaming system ingests redacted
session transcripts into the dreaming corpus with per-day session-corpus
notes and cursor checkpointing. A new Arcee AI provider plugin ships
bundled, with Trinity catalog entries and OpenRouter support. Context engine
changes expose availableTools, citationsMode, and memory artifact seams to
companion plugins.