# Recipe: Anthropic + Generative DOM

Stream Claude's responses into the DOM using the official `@anthropic-ai/sdk` stream helpers.
## Why this recipe
The Anthropic SDK exposes two streaming shapes: a raw event stream and a higher-level messages.stream() helper with typed events (text, contentBlock, message, error). Most apps want the higher-level helper — it handles content block assembly for you. Generative DOM slots in at the text event, where each payload is an incremental string delta. You push the delta, Generative DOM diffs, the DOM updates.
## What you need

- Node 18+ or a server-side runtime (Anthropic's SDK is not designed for direct browser use)
- `@generative-dom/core` plus plugins
- Optional: `@generative-dom/react`
- `@anthropic-ai/sdk` v0.27+ (the typed-events stream helper API)
```sh
pnpm add @anthropic-ai/sdk @generative-dom/core @generative-dom/plugin-markdown-base \
  @generative-dom/plugin-markdown-inline @generative-dom/plugin-markdown-heading \
  @generative-dom/plugin-markdown-code @generative-dom/plugin-markdown-list
```

## Vanilla example
Run this in Node or on your backend. Stream the deltas to the browser over SSE or a ReadableStream and feed them to Generative DOM there — see SSE and fetch-streams for the transport.
```ts
import Anthropic from '@anthropic-ai/sdk';
import { GenerativeDom } from '@generative-dom/core';
import { markdownBase } from '@generative-dom/plugin-markdown-base';
import { markdownInline } from '@generative-dom/plugin-markdown-inline';
import { markdownHeading } from '@generative-dom/plugin-markdown-heading';
import { markdownCode } from '@generative-dom/plugin-markdown-code';
import { markdownList } from '@generative-dom/plugin-markdown-list';

const client = new Anthropic(); // reads ANTHROPIC_API_KEY from env

const container = document.getElementById('chat')!;
const md = new GenerativeDom({
  container,
  plugins: [
    markdownBase(),
    markdownInline(),
    markdownHeading(),
    markdownCode(),
    markdownList(),
  ],
});

export async function streamAnswer(prompt: string): Promise<void> {
  md.reset();
  const stream = client.messages.stream({
    model: 'claude-sonnet-4-5',
    max_tokens: 1024,
    messages: [{ role: 'user', content: prompt }],
    // see official SDK docs for full options
  });
  stream.on('text', (delta) => {
    md.push(delta);
  });
  await stream.finalMessage(); // resolves when the stream completes
  md.flush();
}
```

The `text` event fires once per incremental string — already decoded, already stripped of its JSON envelope. You don't need to parse anything.
If you prefer `for await`:

```ts
const stream = client.messages.stream({
  model: 'claude-sonnet-4-5',
  max_tokens: 1024,
  messages: [{ role: 'user', content: prompt }],
});

for await (const event of stream) {
  if (event.type === 'content_block_delta' && event.delta.type === 'text_delta') {
    md.push(event.delta.text);
  }
}
md.flush();
```

Either style works. The event handler form is less code; the iterator form lets you inspect other event types (tool use, stop reasons) inline.
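That inline inspection can be sketched with a small classifier. This is an illustrative helper, not an SDK export; the event shapes follow Anthropic's Messages streaming API, and the name `classify` is invented here:

```ts
// Illustrative helper (not an SDK export) for branching on raw stream events.
type StreamEvent = {
  type: string;
  delta?: { type?: string; text?: string; stop_reason?: string | null };
  content_block?: { type?: string };
};

export function classify(event: StreamEvent): 'text' | 'tool_use' | 'stop' | 'other' {
  if (event.type === 'content_block_delta' && event.delta?.type === 'text_delta') return 'text';
  if (event.type === 'content_block_start' && event.content_block?.type === 'tool_use') return 'tool_use';
  if (event.type === 'message_delta' && event.delta?.stop_reason != null) return 'stop';
  return 'other';
}

// In the loop, only 'text' events reach Generative DOM:
//   for await (const event of stream) {
//     if (classify(event) === 'text') md.push(event.delta.text);
//   }
```

The same filter is what keeps tool-use blocks out of the rendered output (see the pitfalls below).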
## React example

```tsx
import { useCallback, useMemo } from 'react';
import { useGenerativeDom } from '@generative-dom/react';
import { markdownBase } from '@generative-dom/plugin-markdown-base';
import { markdownInline } from '@generative-dom/plugin-markdown-inline';
import { markdownHeading } from '@generative-dom/plugin-markdown-heading';
import { markdownCode } from '@generative-dom/plugin-markdown-code';
import { markdownList } from '@generative-dom/plugin-markdown-list';

// Talk to Anthropic through your own server endpoint that streams text/event-stream.
async function* fetchClaudeDeltas(prompt: string): AsyncGenerator<string> {
  const res = await fetch('/api/claude', {
    method: 'POST',
    body: JSON.stringify({ prompt }),
    headers: { 'content-type': 'application/json' },
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) return;
    yield decoder.decode(value, { stream: true });
  }
}

export function Answer() {
  const plugins = useMemo(
    () => [
      markdownBase(),
      markdownInline(),
      markdownHeading(),
      markdownCode(),
      markdownList(),
    ],
    [],
  );
  const { ref, push, flush, reset } = useGenerativeDom({ plugins });

  const ask = useCallback(
    async (prompt: string) => {
      reset();
      for await (const delta of fetchClaudeDeltas(prompt)) push(delta);
      flush();
    },
    [push, flush, reset],
  );

  return (
    <>
      <button onClick={() => ask('Summarize the CAP theorem.')}>Ask Claude</button>
      <div ref={ref} className="prose" />
    </>
  );
}
```

The server route is where the `@anthropic-ai/sdk` call actually lives — it streams `event.delta.text` out of `messages.stream()` into its response body. The browser receives a plain text stream and pipes chunks into Generative DOM.
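For completeness, here is a minimal sketch of such a route using Node's stdlib `http` module. The `/api/claude` path matches the fetch above; the SDK call itself is shown as comments because it needs `ANTHROPIC_API_KEY` and a live connection, so treat this as plumbing to adapt, not a drop-in handler:

```ts
import { createServer, type IncomingMessage, type ServerResponse } from 'node:http';

// Minimal sketch of the /api/claude route: read the JSON body, then stream
// plain text chunks back. The Anthropic call is commented out so the shape
// of the plumbing stays visible; restore it on a server with an API key.
export const server = createServer(async (req: IncomingMessage, res: ServerResponse) => {
  const body = await new Promise<string>((resolve) => {
    let data = '';
    req.on('data', (chunk) => (data += chunk));
    req.on('end', () => resolve(data));
  });
  const { prompt } = JSON.parse(body) as { prompt: string };

  res.writeHead(200, { 'content-type': 'text/plain; charset=utf-8' });

  // const client = new Anthropic(); // reads ANTHROPIC_API_KEY from env
  // const stream = client.messages.stream({
  //   model: 'claude-sonnet-4-5',
  //   max_tokens: 1024,
  //   messages: [{ role: 'user', content: prompt }],
  // });
  // stream.on('text', (delta) => res.write(delta)); // one chunk per text delta
  // await stream.finalMessage();

  res.end();
});

// server.listen(3000);
```

With the SDK call restored, each `text` delta becomes one chunk of the response body, which is exactly what `fetchClaudeDeltas` above reads.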
## What this gets you

- One-line integration at the `text` event — no delta assembly code
- Correct handling of chunk boundaries that split markdown tokens (e.g. `**` arriving across two deltas)
- Works with tool use: non-text events are ignored, text events flow straight through
- Same rendering pipeline whether you stream from server-side `messages.stream()` or from a browser-side SSE/fetch proxy
## Common pitfalls

- Running the SDK in the browser — `@anthropic-ai/sdk` assumes a server environment and your API key must stay on the server. Proxy through an endpoint you own.
- Listening to `contentBlock` instead of `text` — `contentBlock` fires once per completed block, not per token. You'll render in large chunks instead of streaming.
- Not awaiting `finalMessage()` — if you `flush()` before the stream finishes, the tail of the response renders on the next tick instead of immediately.
- Tool-use responses — when Claude emits a tool call, only the text blocks should hit Generative DOM. Filter event types in the iterator form.
## Related
- OpenAI recipe — the same pattern with a different SDK
- SSE recipe — the transport layer for browser clients
- Events guide — hook into Generative DOM plugin events (e.g. link clicks) while rendering