# Lens Studio VFX — Reference Guide
Lens Studio's VFX Graph system lets you build GPU-accelerated particle effects visually. Scripts control VFX by writing to the graph's exposed properties, not by calling imperative particle APIs.
## Core Concepts
| Term | Meaning |
|---|---|
| VFX Asset | The .vfx file — the graph definition with exposed properties |
| VFXComponent | The scene component that runs a VFX Asset on a scene object |
| Properties | Named, typed inputs on the asset that scripts can read/write |
```ts
// Get the VFX component on the scene object
const vfx = this.sceneObject.getComponent('Component.VFXComponent')

// Check the asset is assigned before accessing properties
if (!vfx || !vfx.asset) {
  print('[VFX] Asset not assigned!')
  return
}

// Access the properties object
const props = vfx.asset.properties
```
## Writing Properties to a VFX Graph

The VFX Graph exposes named properties declared inside the graph editor. Property names are case-sensitive and must match exactly, and the value type must match the type declared in the graph.
### Supported property types

| Type | Example |
|---|---|
| number | `props['intensity'] = 0.8` |
| boolean | `props['emitting'] = true` |
| vec2 | `props['uvOffset'] = new vec2(0.1, 0.2)` |
| vec3 | `props['spawnPosition'] = new vec3(0, 1, 0)` |
| vec4 | `props['color'] = new vec4(1, 0.5, 0, 1)` |
| quat | `props['orientation'] = someQuat` |
```ts
// Set a float intensity property
vfx.asset.properties['intensity'] = 0.5

// Set an emission position from another scene object
const sourcePos: vec3 = sourceObject.getTransform().getWorldPosition()
vfx.asset.properties['spawnPosition'] = sourcePos

// Set a colour via vec4 (RGBA, [0, 1])
vfx.asset.properties['color'] = new vec4(1.0, 0.3, 0.0, 1.0)

// Toggle emission on/off
vfx.asset.properties['emitting'] = true
```
Type mismatch warning: if the value type doesn't match what the graph expects, the write silently does nothing. Use `typeof` or a debug print to confirm the type before writing.
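Because a bad name or a mismatched type fails silently, it can help to funnel writes through a small guard. This is a hedged sketch, not part of the Lens Studio API: `safeSetProperty` is a hypothetical helper that treats the `properties` bag as a plain key/value object, and the `typeof` check only distinguishes primitives (number, boolean) — vector types all report `object`.

```ts
// Pick up Lens Studio's global print() when present, console.log otherwise,
// so the sketch also runs outside the lens runtime.
const log: (msg: string) => void =
  typeof (globalThis as any).print === 'function' ? (globalThis as any).print : console.log

// Hypothetical helper: write a VFX property only if the name exists and the
// new value's runtime type matches the current value's type. Returns whether
// the write happened, and logs the available names on a miss.
function safeSetProperty(
  props: Record<string, unknown>,
  name: string,
  value: unknown
): boolean {
  if (!(name in props)) {
    log(`[VFX] Unknown property '${name}'. Available: ${Object.keys(props).join(', ')}`)
    return false
  }
  if (typeof props[name] !== typeof value) {
    log(`[VFX] Type mismatch for '${name}': graph has ${typeof props[name]}, got ${typeof value}`)
    return false
  }
  props[name] = value
  return true
}
```

In a lens you would call it as `safeSetProperty(vfx.asset.properties, 'intensity', 0.8)`; the name-listing log is the same `Object.keys` trick recommended in the gotchas below.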
## Sending Transform Data to VFX
### World position

```ts
updateEvent.bind(() => {
  const pos = anchor.getTransform().getWorldPosition()
  vfx.asset.properties['targetPosition'] = pos
})
```
### World rotation (as vec4 to avoid quat-to-prop issues)

```ts
const rot = anchor.getTransform().getWorldRotation()
vfx.asset.properties['rotation'] = new vec4(rot.w, rot.x, rot.y, rot.z)
```
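When a rotation travels through a `vec4` property like this, the only contract is the component order: the snippet above packs w-first, so the graph must unpack w-first too. A round-trip check of that packing, written with a minimal quaternion shape so it runs as plain TypeScript outside Lens Studio:

```ts
// Minimal stand-in for Lens Studio's quat fields.
interface Quat { x: number; y: number; z: number; w: number }

// Pack w-first, matching `new vec4(rot.w, rot.x, rot.y, rot.z)` above.
function packQuat(q: Quat): [number, number, number, number] {
  return [q.w, q.x, q.y, q.z]
}

// The "graph side" unpack: must use the same component order.
function unpackQuat(v: [number, number, number, number]): Quat {
  return { w: v[0], x: v[1], y: v[2], z: v[3] }
}
```

If the graph instead reads the vec4 as x-y-z-w, rotations will come out scrambled with no error — agree on the order once and comment it in both places.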
### Direction vector

```ts
const forward = anchor.getTransform().forward // vec3
vfx.asset.properties['emitDirection'] = forward
```
### AABB mesh dimensions (for bounding-box-shaped emitters)

```ts
const meshVisual = anchor.getComponent('Component.RenderMeshVisual')
const scale = anchor.getTransform().getLocalScale()
const aabbMax = meshVisual.mesh.aabbMax.mult(scale)
const aabbMin = meshVisual.mesh.aabbMin.mult(scale)
vfx.asset.properties['boundsMax'] = aabbMax
vfx.asset.properties['boundsMin'] = aabbMin
```
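The `.mult(scale)` calls above are component-wise multiplies: each AABB corner is stretched by the object's local scale per axis, so a non-uniformly scaled mesh still gets correct bounds. The same operation as a standalone sketch with plain number triples (in-lens, `vec3.mult` does exactly this):

```ts
type Vec3 = [number, number, number]

// Component-wise multiply: scale an AABB corner by the object's local scale.
function scaleBound(corner: Vec3, scale: Vec3): Vec3 {
  return [corner[0] * scale[0], corner[1] * scale[1], corner[2] * scale[2]]
}
```

Note this only handles scale; if the emitter object is also rotated, an axis-aligned box in local space is no longer axis-aligned in world space, and you may want the graph to emit in local space instead.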
## Screen-Space to World-Space Conversion
To spawn particles at a 2D screen position (e.g., where the user tapped):
```ts
// Given: a ScreenTransform and a Camera
function screenToWorld(screenTrans: ScreenTransform, camera: Camera, desiredDepth: number): vec3 {
  const anchors = screenTrans.anchors
  const center = anchors.getCenter()
  const anchorHalfHeight = anchors.getSize().y / 2
  const fov = camera.fov
  const depth = (desiredDepth / anchorHalfHeight) * 0.5 / Math.tan(fov * 0.5)
  const screenPos = screenTrans.localPointToScreenPoint(center)
  return camera.screenSpaceToWorldSpace(screenPos, depth)
}

// Usage: spawn VFX at the tapped screen position
vfx.asset.properties['spawnPosition'] = screenToWorld(screenTrans, mainCamera, 250)
```
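The `depth` line is the interesting part: it converts the desired world-space distance into the depth value `screenSpaceToWorldSpace` expects, scaled by the anchor's half-height and the camera's field of view. The same arithmetic isolated as a pure function, assuming (as the snippet does) that the FOV is in radians:

```ts
// Reproduces the depth expression from screenToWorld above.
// desiredDepth     – intended world-space distance from the camera
// anchorHalfHeight – half the ScreenTransform anchor's height (normalized units)
// fov              – camera vertical field of view, in radians
function depthForAnchor(desiredDepth: number, anchorHalfHeight: number, fov: number): number {
  return (desiredDepth / anchorHalfHeight) * 0.5 / Math.tan(fov * 0.5)
}
```

As a sanity check: with a full-height anchor (half-height 1) and a 90° FOV, `tan(fov/2)` is 1, so the depth is exactly half the desired distance; smaller anchors and narrower FOVs both push the depth further out.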
## Particle System Inspector Settings
These are configured in the Lens Studio VFX Graph editor, not in script — but knowing them helps when authoring the graph.
| Setting | Purpose |
|---|---|
| Emitter Rate | Particles spawned per second (or per burst) |
| Lifetime | How long each particle lives (in seconds) |
| Simulation Space | World: particles stay where emitted; Local: particles move with the emitter |
| Blend Mode | Additive for fire/glow, Alpha for soft particles, Normal for opaque |
| Shape | Sphere, Cone, Box, Mesh surface — defines the spawn volume |
World vs Local simulation: use World space for fire or smoke that should trail behind a moving emitter. Use Local for effects that rigidly follow the emitter (e.g., a sparkle around a held object).
## Attaching VFX to Face / Body Tracking Anchors
```ts
// Parent the VFX object to a face tracking anchor in the scene hierarchy
// (drag it under the `hat` or `mouth` anchor in the inspector).
// Then in script, send expression weights to drive emission rate:
const faceTracking = headObject.getComponent('FaceTracking')

updateEvent.bind(() => {
  const expressions = faceTracking.getFaceExpressionWeights()
  const mouthOpen = expressions[FaceTracking.Expressions.MouthOpen]
  // Drive particle emission rate by mouth opening
  vfx.asset.properties['emitRate'] = mouthOpen * 50 // 0–50 particles/sec
  vfx.asset.properties['emitting'] = mouthOpen > 0.3
})
```
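Gating `emitting` on a single threshold (`> 0.3`) can flicker on and off when the expression weight hovers right around 0.3. A hedged sketch of a hysteresis gate that fixes this — on above a high threshold, off below a low one, holding in between. The 0.25/0.35 thresholds are illustrative, not from the Lens Studio API:

```ts
// Hysteresis gate: returns a stateful function that turns on at `high`,
// off at `low`, and holds its last state between the two. Avoids rapid
// emit on/off flicker when a noisy weight sits near a single threshold.
function makeEmitGate(low: number, high: number): (weight: number) => boolean {
  let on = false
  return (weight: number): boolean => {
    if (weight >= high) on = true
    else if (weight <= low) on = false
    return on
  }
}
```

In the update loop above you would create the gate once (`const gate = makeEmitGate(0.25, 0.35)`) and replace the threshold line with `vfx.asset.properties['emitting'] = gate(mouthOpen)`.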
## SpawnOnTap Pattern
```ts
@component
export class SpawnOnTap extends BaseScriptComponent {
  @input vfxPrefab: ObjectPrefab
  @input camera: Camera

  onAwake(): void {
    const tapEvent = this.createEvent('TapEvent')
    tapEvent.bind((eventData) => {
      // Instantiate a new VFX instance at the tap position
      const instance = this.vfxPrefab.instantiate(null)
      // Project the 2D tap to a fixed depth in front of the camera
      // (use WorldQueryModule instead for a real 3D surface hit point)
      const spawnPos = this.camera.screenSpaceToWorldSpace(eventData.getTapPosition(), 100)
      instance.getTransform().setWorldPosition(spawnPos)
      // Destroy after the effect finishes
      const cleanup = this.createEvent('DelayedCallbackEvent')
      cleanup.bind(() => instance.destroy())
      cleanup.reset(3) // 3 seconds
    })
  }
}
```
## Common Gotchas

- **Property name must match exactly** — property names in the VFX graph are case-sensitive. Log `Object.keys(vfx.asset.properties)` during development to see what's available.
- **Asset null check** — always guard with `if (!vfx || !vfx.asset)`; if the asset isn't assigned in the inspector, the property access will throw.
- **Type mismatch is silent** — passing a `number` to a `vec3` property won't error; it just won't work. Match the type exposed in the VFX graph.
- **Simulation space matters** — a Local-space emitter attached to moving content may produce unexpected particle trails; switch to World if that happens.
- **Writing properties every frame is fine** — VFX property writes happen on the CPU; the GPU reads them at emit time. Don't worry about performance for single-value writes.
- **`VFXComponent` is not `ParticleSystem`** — Lens Studio's VFX Graph is separate from the legacy Particle System component. They have different APIs and can't be mixed.