Server SDK
Reference for the runtime sandbox/thread API in `@21st-sdk/node`, plus the token exchange helpers in `@21st-sdk/nextjs/server`.
Every runtime resource is addressed by `agent`, `sandboxId`, and `threadId`. Use `@21st-sdk/node` to create and drive those resources from your server code.
How the layers fit together
- The sandbox config in your agent definition sets the deploy-time template build and per-sandbox setup for a deployed agent. See Sandbox.
- `AgentClient` in `@21st-sdk/node` creates and manages runtime sandboxes, threads, files, commands, git clones, and tokens.
- `@21st-sdk/nextjs/server` is only a token exchange helper for Next.js route handlers.
- Remote tool executions receive `client` and `sandboxId` inside `execute(args, context)`. Read env vars from `process.env`. See Sandbox.
- For raw HTTP endpoints and SSE behavior, see the API Reference.
1. Server SDK client
@21st-sdk/node is the package that actually manages sandboxes, threads, and tokens from server code.
```ts
import { AgentClient } from "@21st-sdk/node"

const client = new AgentClient({
  apiKey: process.env.API_KEY_21ST!,
})

// Create a sandbox (with optional overrides)
const sandbox = await client.sandboxes.create({
  agent: "support-agent",
})

// Clone a repo into the sandbox
await client.sandboxes.git.clone({
  sandboxId: sandbox.id,
  url: "https://github.com/org/repo.git",
  depth: 1,
})

// Execute a command
const result = await client.sandboxes.exec({
  sandboxId: sandbox.id,
  command: "ls /home/user/workspace/repo",
})

// Write and read files
await client.sandboxes.files.write({
  sandboxId: sandbox.id,
  files: {
    "/home/user/workspace/note.txt": "Hello!",
  },
})
const file = await client.sandboxes.files.read({
  sandboxId: sandbox.id,
  path: "/home/user/workspace/note.txt",
})

// Threads & tokens
const thread = await client.threads.create({
  sandboxId: sandbox.id,
  name: "Chat 1",
})
const token = await client.tokens.create({ agent: "support-agent" })

// Cleanup
await client.sandboxes.delete(sandbox.id)
```

AgentClient
```ts
const client = new AgentClient({
  apiKey: process.env.API_KEY_21ST!,
})
```

Call `sandboxes.create()` once, then reuse the `sandboxes.exec()`, `sandboxes.files.read()`, `sandboxes.files.write()`, and `sandboxes.git.clone()` methods whenever your own backend needs to revisit that sandbox later.

Sandboxes
| Method | Current behavior |
|---|---|
| `sandboxes.create({ agent, files?, envs?, setup? })` | Creates a runtime sandbox for a deployed agent. Uses the agent's deployed sandbox config, then applies runtime overrides. |
| `sandboxes.get(sandboxId)` | Returns sandbox status, error state, agent info, and thread summaries. |
| `sandboxes.delete(sandboxId)` | Deletes the runtime sandbox and cascades deletion to its threads. |
| `sandboxes.exec({ sandboxId, command, cwd?, envs?, timeoutMs? })` | Runs a shell command inside the sandbox and returns stdout, stderr, and exit code. |
| `sandboxes.files.write({ sandboxId, files })` | Writes one or more files into the sandbox. |
| `sandboxes.files.read({ sandboxId, path })` | Reads one file from the sandbox and returns its content. |
| `sandboxes.git.clone({ sandboxId, url, path?, token?, depth? })` | Clones a git repo into the sandbox. |
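Since `sandboxes.get()` reports sandbox status, a backend that creates a sandbox and immediately runs commands may want to poll until the sandbox is ready. The sketch below is a generic polling helper; the status strings (`"provisioning"`, `"running"`, `"error"`) are assumptions for illustration, not values documented by the SDK, so check the real `sandboxes.get()` response shape before relying on them.

```ts
// Hypothetical status values -- the exact strings are an assumption, not from the SDK.
type SandboxStatus = "provisioning" | "running" | "error"

// Poll a status-returning function until the sandbox is ready or a timeout elapses.
export async function waitForReady(
  getStatus: () => Promise<SandboxStatus>,
  { intervalMs = 500, timeoutMs = 30_000 } = {},
): Promise<void> {
  const deadline = Date.now() + timeoutMs
  while (Date.now() < deadline) {
    const status = await getStatus()
    if (status === "running") return
    if (status === "error") throw new Error("Sandbox entered error state")
    await new Promise((resolve) => setTimeout(resolve, intervalMs))
  }
  throw new Error("Timed out waiting for sandbox")
}
```

You would pass it a closure such as `() => client.sandboxes.get(sandbox.id).then((s) => s.status)`, adapted to whatever field the real response exposes.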
Threads
| Method | Current behavior |
|---|---|
| `threads.list({ sandboxId })` | Lists thread summaries for one runtime sandbox. |
| `threads.create({ sandboxId, name? })` | Creates a new thread inside an existing runtime sandbox. |
| `threads.get({ sandboxId, threadId })` | Returns one thread, including persisted messages when available. |
| `threads.delete({ sandboxId, threadId })` | Deletes a single thread from the sandbox. |
| `threads.run({ agent, messages, sandboxId?, threadId?, name?, options? })` | Convenience method that can auto-create a sandbox and thread, then POST to the chat stream endpoint. |
Run threads from server
If you omit `sandboxId` or `threadId`, the missing values are created automatically.
The response stream uses the Vercel AI SDK UI message stream protocol over SSE. Your server should treat result.response as a stream to proxy or consume, not as a single buffered JSON payload.
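If your route handler simply forwards the relay stream to the browser, a minimal sketch looks like this. The headers are the conventional SSE response headers, not anything mandated by the SDK; the function only wraps the upstream body without buffering it.

```ts
// Wrap an upstream SSE body in a new Response without buffering it.
// Works in any runtime with web-standard Response/ReadableStream (Node 18+, edge).
export function proxySseResponse(body: ReadableStream<Uint8Array> | null): Response {
  return new Response(body, {
    headers: {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache",
      Connection: "keep-alive",
    },
  })
}
```

In a route handler you would return `proxySseResponse(result.response.body)` so the browser receives the events as they arrive.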
Send the full `messages` array every time. The relay uses the last user message as the next chat turn.

```ts
const result = await client.threads.run({
  agent: "support-agent",
  sandboxId: sandbox.id,
  threadId: thread.id,
  messages: [
    {
      role: "user",
      parts: [{ type: "text", text: "Check the refund policy for order #1234" }],
    },
  ],
})

// Vercel AI SDK UI message stream from the relay
const stream = result.response.body

// Reconnect/cancel URL for this active stream
console.log(result.resumeUrl)
```

Per-run runtime options
Pass options when one backend request should temporarily override the deployed agent config. This is useful for one-off reviews, stricter budgets, or task-specific prompting.
Supported options include `systemPrompt`, `maxTurns`, `maxBudgetUsd`, and `disallowedTools`.

```ts
const result = await client.threads.run({
  agent: "support-agent",
  sandboxId: sandbox.id,
  threadId: thread.id,
  messages: [
    {
      role: "user",
      parts: [{ type: "text", text: "Review the checkout flow before release." }],
    },
  ],
  options: {
    systemPrompt: {
      type: "preset",
      preset: "claude_code",
      append: "Focus on regressions, risky edge cases, and missing tests. Do not edit files.",
    },
    maxTurns: 4,
    maxBudgetUsd: 0.2,
    disallowedTools: ["Bash"],
  },
})
```

Tokens
`client.tokens.create({ agent?, userId?, expiresIn? })` returns a short-lived JWT for client-side chat access. The default `expiresIn` is `"1h"`.
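If you cache a token server-side and want to know when to mint a fresh one, you can read the standard `exp` claim from the JWT payload without verifying the signature. This sketch assumes the token is a standard three-part JWT with a numeric `exp` claim (seconds since epoch), which is the usual convention but not something the SDK docs spell out.

```ts
// Decode the payload of a standard JWT (header.payload.signature) without
// verifying it, and report whether its exp claim has passed.
// For local cache checks only -- never use this for authorization decisions.
export function isJwtExpired(token: string, nowMs = Date.now()): boolean {
  const payloadPart = token.split(".")[1]
  if (!payloadPart) return true
  try {
    const payload = JSON.parse(Buffer.from(payloadPart, "base64url").toString("utf8"))
    return typeof payload.exp === "number" && payload.exp * 1000 <= nowMs
  } catch {
    return true
  }
}
```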
2. @21st-sdk/nextjs/server
@21st-sdk/nextjs/server only handles token exchange for browser chat. For now, that is the entire server-side surface of this package.
Create a token route
```ts
import { createTokenHandler } from "@21st-sdk/nextjs/server"

export const POST = createTokenHandler({
  apiKey: process.env.API_KEY_21ST!,
})
```

Low-level token exchange
Use exchangeToken() only if you want to wrap the token exchange yourself.
```ts
import { exchangeToken } from "@21st-sdk/nextjs/server"

export async function POST(req: Request) {
  const { agent, userId } = await req.json() as {
    agent?: string
    userId?: string
  }
  const data = await exchangeToken({
    apiKey: process.env.API_KEY_21ST!,
    agent,
    userId,
    expiresIn: "1h",
  })
  return Response.json(data)
}
```

3. Session patterns
The server SDK gives you sandbox and thread primitives. If you want persistent sessions, you store those IDs yourself and reuse them on later requests.
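The examples in this section use a Prisma-style `db.agentSession` model; the record you actually need to persist is small. As an illustrative stand-in (an in-memory map, with invented names), the shape looks like this:

```ts
// The record to persist per session: the three identifiers plus ownership.
interface AgentSession {
  id: string
  userId: string
  agent: string
  sandboxId: string
  threadId: string
}

// In-memory stand-in for a real database table -- swap for durable storage.
const sessions = new Map<string, AgentSession>()

export function saveSession(session: AgentSession): AgentSession {
  sessions.set(session.id, session)
  return session
}

export function getSession(id: string): AgentSession {
  const session = sessions.get(id)
  if (!session) throw new Error(`Unknown session: ${id}`)
  return session
}
```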
Create a new session
```ts
import { AgentClient } from "@21st-sdk/node"
import { db } from "@/db"

const client = new AgentClient({
  apiKey: process.env.API_KEY_21ST!,
})

export async function createSession(userId: string) {
  const sandbox = await client.sandboxes.create({ agent: "support-agent" })
  const thread = await client.threads.create({
    sandboxId: sandbox.id,
    name: "New chat",
  })
  return db.agentSession.create({
    data: {
      userId,
      agent: "support-agent",
      sandboxId: sandbox.id,
      threadId: thread.id,
    },
  })
}
```

Continue an existing session
Load the stored IDs, call threads.run(), and stream the SDK response back to the client. The request body should include the full messages array your UI is currently rendering.
```ts
import { AgentClient } from "@21st-sdk/node"
import type { UIMessage } from "ai"
import { db } from "@/db"

const client = new AgentClient({
  apiKey: process.env.API_KEY_21ST!,
})

export async function POST(
  req: Request,
  { params }: { params: Promise<{ sessionId: string }> },
) {
  const { sessionId } = await params
  const { messages } = await req.json() as { messages: UIMessage[] }
  const session = await db.agentSession.findUniqueOrThrow({
    where: { id: sessionId },
  })
  const result = await client.threads.run({
    agent: session.agent,
    sandboxId: session.sandboxId,
    threadId: session.threadId,
    messages,
  })
  return new Response(result.response.body, {
    headers: {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache",
      Connection: "keep-alive",
    },
  })
}
```

Create a new thread in the same sandbox
Use this when you want a clean conversation but want to keep the same filesystem, cloned repos, and sandbox state.
```ts
import { AgentClient } from "@21st-sdk/node"
import { db } from "@/db"

const client = new AgentClient({
  apiKey: process.env.API_KEY_21ST!,
})

export async function createFollowUpSession(sessionId: string) {
  const existing = await db.agentSession.findUniqueOrThrow({
    where: { id: sessionId },
  })
  const thread = await client.threads.create({
    sandboxId: existing.sandboxId,
    name: "Follow-up chat",
  })
  return db.agentSession.create({
    data: {
      userId: existing.userId,
      agent: existing.agent,
      sandboxId: existing.sandboxId,
      threadId: thread.id,
    },
  })
}
```

Hydrate the UI and resume active streams
This is one possible restore flow. On the server, call `threads.get()` to load both the persisted `messages` array and the thread status. Pass the messages to the client as `initialMessages` and derive `resumeStreamOnMount` from `thread.status === "streaming"`.
```tsx
import { AgentClient } from "@21st-sdk/node"
import type { UIMessage } from "ai"
import { db } from "@/db"
import { SessionChat } from "@/components/session-chat"

const client = new AgentClient({
  apiKey: process.env.API_KEY_21ST!,
})

async function getSessionState(sessionId: string) {
  const session = await db.agentSession.findUniqueOrThrow({
    where: { id: sessionId },
  })
  const thread = await client.threads.get({
    sandboxId: session.sandboxId,
    threadId: session.threadId,
  })
  return {
    agent: session.agent,
    sandboxId: session.sandboxId,
    threadId: session.threadId,
    initialMessages: (thread.messages as UIMessage[] | undefined) ?? [],
    resumeStreamOnMount: thread.status === "streaming",
  }
}

export default async function SessionPage(
  props: { params: Promise<{ sessionId: string }> },
) {
  const { sessionId } = await props.params
  const session = await getSessionState(sessionId)
  return <SessionChat {...session} />
}
```

```tsx
"use client"

import { useChat } from "@ai-sdk/react"
import { AgentChat, createAgentChat } from "@21st-sdk/react"
import type { UIMessage } from "ai"
import "@21st-sdk/react/styles.css"
import { useEffect, useMemo, useRef } from "react"

export function SessionChat(props: {
  agent: string
  sandboxId: string
  threadId: string
  initialMessages: UIMessage[]
  resumeStreamOnMount: boolean
}) {
  const {
    agent,
    sandboxId,
    threadId,
    initialMessages,
    resumeStreamOnMount,
  } = props

  const chat = useMemo(
    () =>
      createAgentChat({
        agent,
        tokenUrl: "/api/an/token",
        sandboxId,
        threadId,
      }),
    [agent, sandboxId, threadId],
  )

  const { messages, sendMessage, status, stop, error, setMessages } = useChat({
    chat,
    resume: resumeStreamOnMount,
  })

  const hydratedRef = useRef(false)
  useEffect(() => {
    if (hydratedRef.current) return
    hydratedRef.current = true
    setMessages(initialMessages)
  }, [initialMessages, setMessages])

  return (
    <AgentChat
      messages={messages}
      onSend={(message) =>
        sendMessage({
          role: "user",
          parts: [{ type: "text", text: message.content }],
        })
      }
      status={status}
      onStop={stop}
      error={error}
    />
  )
}
```

After hydration or resume, the client should keep sending the full updated messages array back to your server route on each turn.
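Building that next-turn request body is a pure operation, sketched below with a minimal local message type (the real code would use `UIMessage` from the `ai` package):

```ts
// Minimal message shape matching the UI message objects used above.
interface ChatMessage {
  role: "user" | "assistant"
  parts: { type: "text"; text: string }[]
}

// Build the body for the next POST: the full history plus one new user turn.
// The relay reads only the last user message, but sending the full array
// keeps the server and UI state consistent.
export function nextTurnBody(history: ChatMessage[], text: string) {
  const messages: ChatMessage[] = [
    ...history,
    { role: "user", parts: [{ type: "text", text }] },
  ]
  return { messages }
}
```

The client would then `fetch` the session route with `JSON.stringify(nextTurnBody(messages, input))` as the body.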