Audio and Video in Decentraland
When to Use Which Media Component
| Need | Component | Key Difference |
|---|---|---|
| Sound effect from a file (click, explosion, footstep) | AudioSource | Local file, spatial, one-shot or looping |
| Background music or radio stream | AudioStream | External URL, non-spatial, continuous |
| Video on a surface (screen, billboard) | VideoPlayer + Material.Texture.Video | Requires a mesh to display on |
Decision flow:
- Is it a local audio file? → AudioSource
- Is it a streaming URL (radio, live audio)? → AudioStream
- Is it video content? → VideoPlayer on a plane/mesh
Audio Source (Sound Effects & Music)
Play audio clips from files:
import { engine, Transform, AudioSource } from '@dcl/sdk/ecs'
import { Vector3 } from '@dcl/sdk/math'
const speaker = engine.addEntity()
Transform.create(speaker, { position: Vector3.create(8, 1, 8) })
AudioSource.create(speaker, {
audioClipUrl: 'assets/scene/Audio/music.mp3',
playing: true,
loop: true,
volume: 0.5, // 0 to 1
pitch: 1.0, // Playback speed (0.5 = half speed, 2.0 = double)
})
Supported Formats
- .mp3 (recommended)
- .ogg
- .wav
File Organization
project/
├── assets/
│ └── scene/
│ └── Audio/
│ ├── click.mp3
│ ├── background-music.mp3
│ └── explosion.ogg
├── src/
│ └── index.ts
└── scene.json
Play/Stop/Toggle
// Play
AudioSource.getMutable(speaker).playing = true
// Stop
AudioSource.getMutable(speaker).playing = false
// Toggle
const audio = AudioSource.getMutable(speaker)
audio.playing = !audio.playing
Play on Click
import { pointerEventsSystem, InputAction } from '@dcl/sdk/ecs'
const button = engine.addEntity()
// ... set up transform and mesh ...
const audioEntity = engine.addEntity()
Transform.create(audioEntity, { position: Vector3.create(8, 1, 8) })
AudioSource.create(audioEntity, {
audioClipUrl: 'assets/scene/Audio/click.mp3',
playing: false,
loop: false,
volume: 0.8,
})
pointerEventsSystem.onPointerDown(
{
entity: button,
opts: { button: InputAction.IA_POINTER, hoverText: 'Play sound' },
},
() => {
// Reset and play
const audio = AudioSource.getMutable(audioEntity)
audio.playing = false
audio.playing = true
}
)
Audio Streaming
Stream audio from a URL (radio, live streams).
Before adding a streaming URL: If the URL wasn't provided by the user, confirm the source before adding it — e.g., "I'd reference the stream at [URL]. Is that the source you want?" See agent-behaviors.md in overview/ for the full confirmation pattern.
import { engine, Transform, AudioStream } from '@dcl/sdk/ecs'
import { Vector3 } from '@dcl/sdk/math'
const radio = engine.addEntity()
Transform.create(radio, { position: Vector3.create(8, 1, 8) })
AudioStream.create(radio, {
url: 'https://example.com/stream.mp3',
playing: true,
volume: 0.3,
})
AudioStream State
Query the current state of an audio stream with AudioStream.getAudioState and the MediaState enum — useful for reacting to buffering, errors, or end-of-stream:
import { AudioStream, MediaState } from '@dcl/sdk/ecs'
const state = AudioStream.getAudioState(radio)
if (state === MediaState.MS_PLAYING) {
console.log('Stream is playing')
} else if (state === MediaState.MS_ERROR) {
console.log('Stream error occurred')
}
// Monitor state changes in a system
let lastState: MediaState | undefined = undefined
engine.addSystem(() => {
const current = AudioStream.getAudioState(radio)
if (lastState !== current) {
console.log('Stream state changed:', current)
lastState = current
}
})
Video Player
Play video on a surface. If the video URL wasn't provided by the user, confirm before referencing it (same pattern as audio streams above).
import {
engine,
Transform,
VideoPlayer,
Material,
MeshRenderer,
} from '@dcl/sdk/ecs'
import { Vector3 } from '@dcl/sdk/math'
// Create a screen
const screen = engine.addEntity()
Transform.create(screen, {
position: Vector3.create(8, 3, 15.9),
scale: Vector3.create(8, 4.5, 1), // 16:9 ratio
})
MeshRenderer.setPlane(screen)
// Add video player
VideoPlayer.create(screen, {
src: 'https://example.com/video.mp4',
playing: true,
loop: true,
volume: 0.5,
playbackRate: 1.0,
position: 0, // Start time in seconds
})
// Create video texture
const videoTexture = Material.Texture.Video({ videoPlayerEntity: screen })
// Basic material (recommended — better performance)
Material.setBasicMaterial(screen, {
texture: videoTexture,
})
Video Controls
// Play
VideoPlayer.getMutable(screen).playing = true
// Pause
VideoPlayer.getMutable(screen).playing = false
// Change volume
VideoPlayer.getMutable(screen).volume = 0.8
// Change source
VideoPlayer.getMutable(screen).src = 'https://example.com/other.mp4'
Enhanced Video Material (PBR)
Ideally, video screens should use basic (unlit) materials so they're always bright and crisp:
const videoTexture = Material.Texture.Video({ videoPlayerEntity: screen })
Material.setBasicMaterial(screen, {
texture: videoTexture,
})
For a brighter, emissive video screen:
import { Color3 } from '@dcl/sdk/math'
const videoTexture = Material.Texture.Video({ videoPlayerEntity: screen })
Material.setPbrMaterial(screen, {
texture: videoTexture,
roughness: 1.0,
specularIntensity: 0,
metallic: 0,
emissiveTexture: videoTexture,
emissiveIntensity: 0.6,
emissiveColor: Color3.White(),
})
Video Events
Monitor video playback state:
import { videoEventsSystem, VideoState } from '@dcl/sdk/ecs'
videoEventsSystem.registerVideoEventsEntity(screen, (videoEvent) => {
switch (videoEvent.state) {
case VideoState.VS_PLAYING:
console.log('Video started playing')
break
case VideoState.VS_PAUSED:
console.log('Video paused')
break
case VideoState.VS_READY:
console.log('Video ready to play')
break
case VideoState.VS_ERROR:
console.log('Video error occurred')
break
}
})
Spatial Audio
Audio from the AudioSource component in Decentraland is spatial by default — it gets louder as the player approaches the audio source entity and quieter as they move away. The position is determined by the entity's Transform. You can change this by setting the global property to true.
AudioSource.create(sourceEntity, {
audioClipUrl: 'assets/scene/Audio/music.mp3',
playing: true,
global: true,
})
Audio from the VideoPlayer component and the AudioStream component is global by default. Make it spatial by setting the spatial property to true. You can also tune these properties:
- spatialMinDistance: The minimum distance at which audio starts to attenuate. If the player is closer, the audio is heard at full volume. 0 by default.
- spatialMaxDistance: The maximum distance at which the audio is heard. If the player is further away, the audio is heard at 0 volume. 60 by default.
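The two distances define a falloff band. As a rough mental model (a hypothetical helper, not the engine's actual rolloff curve — the real attenuation is implemented by the client), linear attenuation between them could be sketched like this:

```typescript
// Hypothetical linear-falloff helper: full volume inside minDistance,
// silent beyond maxDistance, linear in between. Illustrates the meaning
// of spatialMinDistance/spatialMaxDistance only; the engine's curve may differ.
function spatialVolume(
  distance: number,
  baseVolume: number,
  minDistance: number = 0,
  maxDistance: number = 60
): number {
  if (distance <= minDistance) return baseVolume
  if (distance >= maxDistance) return 0
  const t = (distance - minDistance) / (maxDistance - minDistance)
  return baseVolume * (1 - t)
}
```

With minDistance 5 and maxDistance 10, a player at 7.5 m would hear roughly half the base volume under this linear model.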
VideoPlayer.create(videoPlayerEntity, {
src: 'https://player.vimeo.com/progressive_redirect/playback/1145666916/rendition/540p/file.mp4%20%28540p%29.mp4?loc=external&signature=db1cd6946851313cb8f7be60d1f6c30af0902bcc46fdae0ba2a06e5fdf44c329',
playing: true,
spatial: true,
spatialMinDistance: 5,
spatialMaxDistance: 10,
})
AudioStream.create(audioStreamEntity, {
url: 'https://radioislanegra.org/listen/up/stream',
playing: true,
volume: 1.0,
spatial: true,
spatialMinDistance: 5,
spatialMaxDistance: 10,
})
Free Audio Files
Always check the audio catalog before creating placeholder sound file references. It contains 50 free sounds from the Creator Hub asset packs.
Read {baseDir}/references/audio-catalog.md for music tracks (ambient, dance, medieval, sci-fi, etc.), ambient sounds (birds, city, factory, etc.), interaction sounds (buttons, doors, levers, chests), sound effects (explosions, sirens, bells), and game mechanic sounds (win/lose, heal, respawn, damage).
To use a catalog sound:
# Download from catalog
mkdir -p assets/scene/Audio
curl -o assets/scene/Audio/ambient_1.mp3 "https://builder-items.decentraland.org/contents/bafybeic4faewxkdqx67dloyw57ikgaeibc2e2dbx34hwjubl3gfvs2r4su"
// Reference in code — must be a local file path
AudioSource.create(entity, {
audioClipUrl: 'assets/scene/Audio/ambient_1.mp3',
playing: true,
loop: true,
})
How to suggest audio
- Read the audio catalog file
- Search for sounds matching the user's description/theme
- Suggest specific sounds with download commands
- Download selected sounds into the scene's assets/scene/Audio/ directory
- Reference them in code with local paths
Important: AudioSource only works with local files. Never use external URLs for the audioClipUrl field. Always download audio into assets/scene/Audio/ first.
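A small guard can enforce this rule in tooling before an audioClipUrl ever reaches AudioSource. This is a hypothetical helper, not part of the SDK:

```typescript
// Hypothetical guard: audioClipUrl must be a relative project path to a
// supported audio format, never an http(s) URL.
function isLocalAudioPath(path: string): boolean {
  const lower = path.toLowerCase()
  if (lower.startsWith('http://') || lower.startsWith('https://')) return false
  return (
    lower.endsWith('.mp3') || lower.endsWith('.ogg') || lower.endsWith('.wav')
  )
}
```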
Video State Polling
Check video playback state programmatically:
import { videoEventsSystem, VideoState } from '@dcl/sdk/ecs'
engine.addSystem(() => {
const state = videoEventsSystem.getVideoState(videoEntity)
if (state) {
console.log('Video state:', state.state) // VideoState.VS_PLAYING, VS_PAUSED, etc.
console.log('Current time:', state.currentOffset)
}
})
Audio Playback Events
Use the AudioEvent component to detect audio state changes:
import { AudioEvent } from '@dcl/sdk/ecs'
engine.addSystem(() => {
const event = AudioEvent.getOrNull(audioEntity)
if (event) {
console.log('Audio state:', event.state) // playing, paused, finished
}
})
Permission for External Media
External audio/video URLs require the ALLOW_MEDIA_HOSTNAMES permission in scene.json:
{
"requiredPermissions": ["ALLOW_MEDIA_HOSTNAMES"],
"allowedMediaHostnames": ["stream.example.com", "cdn.example.com"]
}
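Conceptually, the client checks the media URL's hostname against the allowedMediaHostnames list. A rough sketch of that check (an assumption for illustration — the real matching logic lives in the client, not in scene code):

```typescript
// Rough sketch of hostname allow-listing for external media URLs.
function isHostAllowed(mediaUrl: string, allowedHostnames: string[]): boolean {
  try {
    const host = new URL(mediaUrl).hostname
    return allowedHostnames.includes(host)
  } catch {
    return false // not a valid absolute URL
  }
}
```

The practical takeaway: every hostname you stream from must appear verbatim in allowedMediaHostnames, or playback is blocked.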
Multiple Video Surfaces
Share one VideoPlayer across multiple screens by referencing the same videoPlayerEntity:
Material.setPbrMaterial(screen1, {
texture: Material.Texture.Video({ videoPlayerEntity: videoEntity }),
})
Material.setPbrMaterial(screen2, {
texture: Material.Texture.Video({ videoPlayerEntity: videoEntity }),
})
Play a video on a glTF model
You may want to play a video on a shape that is not a primitive, such as a curved screen or another exotic mesh. Use GltfNodeModifiers to swap the material of a glTF model.
VideoPlayer.create(myEntity, {
src: 'https://player.vimeo.com/external/552481870.m3u8?s=c312c8533f97e808fccc92b0510b085c8122a875',
playing: true,
})
GltfNodeModifiers.create(myEntity, {
modifiers: [
{
path: '',
material: {
material: {
$case: 'pbr',
pbr: {
texture: Material.Texture.Video({
videoPlayerEntity: myEntity,
}),
},
},
},
},
],
})
Video Limits & Tips
- Simultaneous videos: Avoid playing multiple videos at once; only play more than one simultaneous video if explicitly requested. Maximum 5 simultaneous videos.
- Distance-based control: Pause video when the player is far away to save bandwidth
- Supported formats: .mp4 (H.264), .webm, HLS (.m3u8) for live streaming
- Live streaming: Use HLS (.m3u8) URLs — most reliable across clients
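The distance-based control tip can be reduced to a pure decision helper. Adding hysteresis (two thresholds instead of one) prevents the video from flickering on and off when the player hovers at the boundary. The thresholds are hypothetical; wire this into a system that compares the player's Transform position to the screen's:

```typescript
// Hypothetical hysteresis helper: start playing when the player comes
// within nearDist, pause only once they move beyond farDist (farDist > nearDist).
function shouldPlayVideo(
  distance: number,
  currentlyPlaying: boolean,
  nearDist: number = 20,
  farDist: number = 30
): boolean {
  if (distance <= nearDist) return true
  if (distance >= farDist) return false
  return currentlyPlaying // inside the band: keep the current state
}
```

In a system, feed the result into VideoPlayer.getMutable(screen).playing only when it differs from the current value, to avoid redundant component writes.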
For full component field details, supported formats, and advanced patterns, see {baseDir}/references/media-reference.md.
Important Notes
- Audio files must be in the project's directory (relative paths from project root)
- Video requires HTTPS URLs — HTTP won't work
- Players must interact with the scene (click) before audio can play (browser autoplay policy)
- Keep audio files small — large files increase scene load time
- Use .mp3 for music and .ogg for sound effects (smaller file sizes)
- For live video streaming, use HLS (.m3u8) URLs when possible
- If an audio file needs to be ready to play as the player interacts, use the AssetLoad component to pre-load the asset