The Realtime API enables you to transform live video streams with minimal latency using WebRTC. Perfect for building camera effects, video conferencing filters, VR/AR applications, and interactive live streaming.
## Quick Start

```typescript
import { createDecartClient, models } from "@decartai/sdk";

const model = models.realtime("mirage_v2");

// Get user's camera stream
const stream = await navigator.mediaDevices.getUserMedia({
  audio: true,
  video: {
    frameRate: model.fps,
    width: model.width,
    height: model.height,
  },
});

// Create client and connect
const client = createDecartClient({
  apiKey: "your-api-key-here",
});

const realtimeClient = await client.realtime.connect(stream, {
  model,
  onRemoteStream: (transformedStream) => {
    videoElement.srcObject = transformedStream;
  },
  initialState: {
    prompt: {
      text: "Anime",
      enhance: true,
    },
  },
});

// Change style on the fly
realtimeClient.setPrompt("Cyberpunk city");

// Disconnect when done
realtimeClient.disconnect();
```
## Client-Side Authentication

For browser applications, use client tokens instead of your permanent API key. Client tokens are short-lived tokens that are safe to expose in client-side code.

Learn more about client tokens and why they're important for security.

### Step 1: Create a backend endpoint

Your server creates client tokens using the SDK:

**Express.js**
```typescript
import express from "express";
import { createDecartClient } from "@decartai/sdk";

const app = express();
const client = createDecartClient({
  apiKey: process.env.DECART_API_KEY,
});

// Endpoint to generate client tokens for authenticated users
app.post("/api/realtime-token", async (req, res) => {
  try {
    const token = await client.tokens.create();
    res.json(token);
  } catch (error) {
    console.error("Token generation error:", error);
    res.status(500).json({ error: "Failed to generate token" });
  }
});

app.listen(3000);
```
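For Next.js, an equivalent route handler might look like the following sketch (assumes the App Router; the route path and response shape mirror the Express example above):

```typescript
// app/api/realtime-token/route.ts
import { NextResponse } from "next/server";
import { createDecartClient } from "@decartai/sdk";

const client = createDecartClient({
  apiKey: process.env.DECART_API_KEY,
});

export async function POST() {
  try {
    // Create a short-lived client token for the browser
    const token = await client.tokens.create();
    return NextResponse.json(token);
  } catch (error) {
    console.error("Token generation error:", error);
    return NextResponse.json(
      { error: "Failed to generate token" },
      { status: 500 }
    );
  }
}
```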
### Step 2: Use the client token in your frontend

Fetch the client token from your backend and use it to connect:
```typescript
import { createDecartClient, models } from "@decartai/sdk";

async function connectToRealtime() {
  // 1. Get client token from your backend
  const tokenResponse = await fetch("/api/realtime-token", { method: "POST" });
  const { apiKey } = await tokenResponse.json();

  // 2. Get user's camera stream
  const model = models.realtime("mirage_v2");
  const stream = await navigator.mediaDevices.getUserMedia({
    video: {
      frameRate: model.fps,
      width: model.width,
      height: model.height,
    },
  });

  // 3. Connect using the client token
  const client = createDecartClient({
    apiKey, // Use the client token, not your permanent key!
  });

  const realtimeClient = await client.realtime.connect(stream, {
    model,
    onRemoteStream: (transformedStream) => {
      (document.getElementById("output-video") as HTMLVideoElement).srcObject =
        transformedStream;
    },
    initialState: {
      prompt: { text: "Anime style" },
    },
  });

  return realtimeClient;
}
```
An expired client token only prevents new connections; active WebRTC sessions keep working after the token expires.
## Connecting

### Getting Camera Access

Request access to the user's camera using the WebRTC `getUserMedia` API:
```typescript
const stream = await navigator.mediaDevices.getUserMedia({
  audio: true,
  video: {
    frameRate: 25,
    width: 1280,
    height: 704,
  },
});
```

Use the model's `fps`, `width`, and `height` properties to ensure optimal performance.
### Establishing Connection

Connect to the Realtime API with your media stream:
```typescript
const realtimeClient = await client.realtime.connect(stream, {
  model: models.realtime("mirage_v2"),
  onRemoteStream: (transformedStream: MediaStream) => {
    // Display the transformed video
    const videoElement = document.getElementById("output-video") as HTMLVideoElement;
    videoElement.srcObject = transformedStream;
  },
  initialState: {
    prompt: {
      text: "Lego World",
      enhance: true, // Let Decart enhance the prompt (recommended)
    },
  },
});
```
**Parameters:**

- `stream` (required) - MediaStream from `getUserMedia`
- `model` (required) - Realtime model from `models.realtime()`
- `onRemoteStream` (required) - Callback that receives the transformed video stream
- `initialState.prompt` (required) - Initial style prompt
  - `text` - Style description
  - `enhance` - Whether to auto-enhance the prompt (default: `true`)
## Managing Prompts

Change the transformation style dynamically without reconnecting:
```typescript
// Simple prompt with automatic enhancement
realtimeClient.setPrompt("Anime style");

// Custom detailed prompt without enhancement
realtimeClient.setPrompt(
  "A detailed artistic style with vibrant colors and dramatic lighting",
  { enhance: false }
);
```
**Parameters:**

- `prompt` (required) - Text description of the desired style
- `options.enhance` (optional) - Whether to enhance the prompt (default: `true`)
Prompt enhancement uses Decart’s AI to expand simple prompts for better results. Disable it if you want full control over the exact prompt.
## Avatar Live

The Avatar Live model animates portrait images with audio input. Unlike other realtime models that transform camera streams, Avatar Live takes a static image and audio to generate animated video.

### Connecting with Avatar Image
```typescript
import { createDecartClient, models } from "@decartai/sdk";

const model = models.realtime("live_avatar");

// Load avatar image (Blob, File, or URL string)
const avatarImage = await fetch("/portrait.jpg").then((r) => r.blob());

const client = createDecartClient({
  apiKey: "your-api-key-here",
});

// Connect with avatar image (pass null for stream)
const realtimeClient = await client.realtime.connect(null, {
  model,
  avatar: { avatarImage },
  onRemoteStream: (animatedStream) => {
    videoElement.srcObject = animatedStream;
  },
});
```

Avatar Live doesn't require a camera stream. Pass `null` as the first argument to `connect()`.
### Playing Audio

Send audio to animate the avatar:
```typescript
// Play audio file
const audioFile = (document.querySelector("input[type=file]") as HTMLInputElement)
  .files![0];
await realtimeClient.playAudio(audioFile);

// Play from ArrayBuffer
const audioBuffer = await audioFile.arrayBuffer();
await realtimeClient.playAudio(audioBuffer);

// Play recorded audio blob
await realtimeClient.playAudio(recordedBlob);
```
**Supported formats:**

- `Blob` - Audio blob (e.g., from `MediaRecorder`)
- `File` - Audio file from a file input
- `ArrayBuffer` - Raw audio data
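The `recordedBlob` above can come from the standard `MediaRecorder` API. A minimal sketch (assumes `realtimeClient` is a connected Avatar Live client as shown earlier):

```typescript
// Record ~5 seconds from the microphone, then send the clip to the avatar.
const micStream = await navigator.mediaDevices.getUserMedia({ audio: true });
const recorder = new MediaRecorder(micStream);
const chunks: Blob[] = [];

recorder.ondataavailable = (e) => chunks.push(e.data);
recorder.onstop = async () => {
  // Assemble the recorded chunks into a single audio blob
  const recordedBlob = new Blob(chunks, { type: recorder.mimeType });
  await realtimeClient.playAudio(recordedBlob);
  micStream.getTracks().forEach((track) => track.stop());
};

recorder.start();
setTimeout(() => recorder.stop(), 5000);
```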
### Avatar Behavior Prompts

Control how the avatar behaves and expresses itself:
```typescript
// Set avatar behavior
await realtimeClient.setPrompt("Smile warmly and nod occasionally");

// With prompt enhancement
await realtimeClient.setPrompt("Look excited", { enhance: true });
```
### Complete Avatar Live Example
```typescript
import { createDecartClient, models, type DecartSDKError } from "@decartai/sdk";

async function setupAvatarLive() {
  const model = models.realtime("live_avatar");
  const client = createDecartClient({
    apiKey: process.env.DECART_API_KEY,
  });

  // Load avatar image
  const avatarImage = await fetch("/portrait.jpg").then((r) => r.blob());

  const realtimeClient = await client.realtime.connect(null, {
    model,
    avatar: { avatarImage },
    onRemoteStream: (stream) => {
      (document.getElementById("avatar-video") as HTMLVideoElement).srcObject = stream;
    },
  });

  // Handle connection state
  realtimeClient.on("connectionChange", (state) => {
    console.log(`Connection: ${state}`);
  });

  // Handle errors
  realtimeClient.on("error", (error: DecartSDKError) => {
    console.error("Error:", error.message);
  });

  // Audio file input handler
  document.getElementById("audio-input")?.addEventListener("change", async (e) => {
    const file = (e.target as HTMLInputElement).files?.[0];
    if (file) {
      await realtimeClient.playAudio(file);
    }
  });

  return realtimeClient;
}
```
## Connection State

Monitor and react to connection state changes:
```typescript
// Check state synchronously
const isConnected = realtimeClient.isConnected(); // boolean
const state = realtimeClient.getConnectionState(); // "connected" | "connecting" | "disconnected"

// Listen to state changes
realtimeClient.on("connectionChange", (state) => {
  console.log(`Connection state: ${state}`);

  if (state === "disconnected") {
    // Handle disconnection
    showReconnectButton();
  } else if (state === "connected") {
    // Handle successful connection
    hideReconnectButton();
  }
});
```
Use this to update your UI and handle reconnection logic.
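One common reconnection pattern is exponential backoff. A minimal sketch (an assumed pattern, not part of the SDK; the `backoffDelay` helper and the wiring below are illustrative):

```typescript
// Exponential backoff helper: delay in ms for a given retry attempt, capped at 30s.
function backoffDelay(attempt: number, baseMs = 1000, maxMs = 30000): number {
  return Math.min(baseMs * 2 ** attempt, maxMs);
}

// Hypothetical wiring (assumes a `realtimeClient` and a `connectToRealtime()`
// function like the ones shown earlier on this page):
//
//   let attempt = 0;
//   realtimeClient.on("connectionChange", (state) => {
//     if (state === "connected") attempt = 0;
//     else if (state === "disconnected") {
//       setTimeout(() => connectToRealtime().catch(console.error), backoffDelay(attempt++));
//     }
//   });
```

Resetting the attempt counter on a successful connection keeps retry delays short after transient network drops.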
## Error Handling

Handle errors with the `error` event:
```typescript
import type { DecartSDKError } from "@decartai/sdk";

realtimeClient.on("error", (error: DecartSDKError) => {
  console.error("SDK error:", error.code, error.message);

  switch (error.code) {
    case "INVALID_API_KEY":
      showError("Invalid API key. Please check your credentials.");
      break;
    case "WEB_RTC_ERROR":
      showError("Connection error. Please check your network.");
      break;
    case "MODEL_NOT_FOUND":
      showError("Model not found. Please check the model name.");
      break;
    default:
      showError(`Error: ${error.message}`);
  }
});
```
**Error Codes:**

- `INVALID_API_KEY` - API key is invalid or missing
- `WEB_RTC_ERROR` - WebRTC connection failed
- `MODEL_NOT_FOUND` - Specified model doesn't exist
- `INVALID_INPUT` - Invalid input parameters
## Session Management

Access the current session ID:
```typescript
const sessionId = realtimeClient.sessionId;
console.log(`Current session: ${sessionId}`);
```
This can be useful for logging, analytics, or debugging.
## Cleanup

Always disconnect when done to free up resources:
```typescript
// Disconnect from the service
realtimeClient.disconnect();

// Remove event listeners
realtimeClient.off("connectionChange", onConnectionChange);
realtimeClient.off("error", onError);

// Stop the local media stream
stream.getTracks().forEach((track) => track.stop());
```
Failing to disconnect can leave WebRTC connections open and waste resources.
## Complete Example

Here's a full application with all features:
```typescript
import { createDecartClient, models, type DecartSDKError } from "@decartai/sdk";

async function setupRealtimeVideo() {
  try {
    // Get camera stream with optimal settings
    const model = models.realtime("mirage_v2");
    const stream = await navigator.mediaDevices.getUserMedia({
      audio: true,
      video: {
        frameRate: model.fps,
        width: model.width,
        height: model.height,
      },
    });

    // Display input video
    const inputVideo = document.getElementById("input-video") as HTMLVideoElement;
    inputVideo.srcObject = stream;

    // Create client
    const client = createDecartClient({
      apiKey: process.env.DECART_API_KEY,
    });

    // Connect to Realtime API
    const realtimeClient = await client.realtime.connect(stream, {
      model,
      onRemoteStream: (transformedStream) => {
        const outputVideo = document.getElementById("output-video") as HTMLVideoElement;
        outputVideo.srcObject = transformedStream;
      },
      initialState: {
        prompt: {
          text: "Studio Ghibli animation style",
          enhance: true,
        },
      },
    });

    // Handle connection state changes
    realtimeClient.on("connectionChange", (state) => {
      const statusElement = document.getElementById("status")!;
      statusElement.textContent = `Status: ${state}`;
      statusElement.className = `status-${state}`;
    });

    // Handle errors
    realtimeClient.on("error", (error: DecartSDKError) => {
      console.error("Realtime error:", error);
      const errorElement = document.getElementById("error")!;
      errorElement.textContent = error.message;
      errorElement.style.display = "block";
    });

    // Allow user to change styles
    const styleInput = document.getElementById("style-input") as HTMLInputElement;
    styleInput.addEventListener("change", (e) => {
      const prompt = (e.target as HTMLInputElement).value;
      realtimeClient.setPrompt(prompt, { enhance: true });
    });

    // Cleanup on page unload
    window.addEventListener("beforeunload", () => {
      realtimeClient.disconnect();
      stream.getTracks().forEach((track) => track.stop());
    });

    return realtimeClient;
  } catch (error) {
    console.error("Failed to setup realtime video:", error);
    throw error;
  }
}

// Initialize
setupRealtimeVideo();
```
## Best Practices

### Use model properties for video constraints

Always use the model's `fps`, `width`, and `height` properties when calling `getUserMedia` to ensure optimal performance and compatibility.

```typescript
const model = models.realtime("mirage_v2");

const stream = await navigator.mediaDevices.getUserMedia({
  video: {
    frameRate: model.fps,
    width: model.width,
    height: model.height,
  },
});
```
### Enable prompt enhancement

For best results, keep `enhance: true` (the default) to let Decart's AI enhance your prompts. Only disable it if you need exact prompt control.

### Handle connection state changes

Always listen to `connectionChange` events to update your UI and handle reconnection logic gracefully.

### Clean up resources

Always call `disconnect()` and stop media tracks when done to avoid memory leaks and unnecessary resource usage.
## API Reference

### `client.realtime.connect(stream, options)`

Connects to the realtime transformation service.

**Parameters:**

- `stream: MediaStream | null` - MediaStream from `getUserMedia`, or `null` for Avatar Live
- `options.model: ModelDefinition` - Realtime model from `models.realtime()`
- `options.onRemoteStream: (stream: MediaStream) => void` - Callback for the transformed video stream
- `options.initialState.prompt: { text: string; enhance?: boolean }` - Initial transformation prompt (for video models)
- `options.avatar: { avatarImage: Blob | File | string }` - Avatar options (Avatar Live only)

**Returns:** `Promise<RealtimeClient>` - Connected realtime client instance
### `realtimeClient.setPrompt(prompt, options?)`

Changes the transformation style.

**Parameters:**

- `prompt: string` - Text description of the desired style
- `options.enhance?: boolean` - Whether to enhance the prompt (default: `true`)
### `realtimeClient.setImage(image)`

Sets a reference image to guide the style transformation. The model uses this image as a visual reference for the transformation style. Only supported for Mirage models (`mirage` and `mirage_v2`).

**Parameters:**

- `image: Blob | File | string` - Image as a Blob, File object, or URL string

**Returns:** `Promise<void>`
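For example, a minimal sketch (assumes a connected `realtimeClient` using a Mirage model; the image paths are illustrative):

```typescript
// Guide the transformation with a reference image (Mirage models only)
const referenceImage = await fetch("/style-reference.jpg").then((r) => r.blob());
await realtimeClient.setImage(referenceImage);

// A URL string also works
await realtimeClient.setImage("https://example.com/style-reference.jpg");
```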
### `realtimeClient.isConnected()`

Checks whether the client is currently connected.

**Returns:** `boolean`

### `realtimeClient.getConnectionState()`

Gets the current connection state.

**Returns:** `"connected" | "connecting" | "disconnected"`

### `realtimeClient.sessionId`

The ID of the current realtime inference session.

**Type:** `string`

### `realtimeClient.disconnect()`

Closes the connection and cleans up resources.
### Avatar Live Methods

These methods are only available when using the `live_avatar` model.

#### `realtimeClient.playAudio(audio)`

Sends audio to animate the avatar.

**Parameters:**

- `audio: Blob | File | ArrayBuffer` - Audio data to play

**Returns:** `Promise<void>` - Resolves when the audio finishes playing
## Events

### `connectionChange`

Fired when the connection state changes.

**Callback:** `(state: "connected" | "connecting" | "disconnected") => void`

### `error`

Fired when an error occurs.

**Callback:** `(error: DecartSDKError) => void`

## Next Steps