spectacles-lens-essentials
# Spectacles Lens Essentials — Reference Guide
A compact reference for the most commonly used systems when building Spectacles lenses in Lens Studio.
Official docs: Spectacles Home · Features Overview · Spatial Design
## GestureModule (Spectacles Gesture API)

The GestureModule is the Spectacles-native, ML-based API for reliable gesture detection. Use its raw pinch, targeting, and grab events when you need more control than SIK's higher-level components offer.
```typescript
@component
export class GestureExample extends BaseScriptComponent {
  private gestureModule: GestureModule = require('LensStudio:GestureModule')

  onAwake(): void {
    // --- Pinch ---
    this.gestureModule
      .getPinchDownEvent(GestureModule.HandType.Right)
      .add((args: PinchDownArgs) => {
        // args.confidence: 0–1, how confident the model is
        // args.palmOrientation: vec3, palm facing direction
        print('Right pinch down, confidence: ' + args.confidence)
      })

    this.gestureModule
      .getPinchStrengthEvent(GestureModule.HandType.Right)
      .add((args: PinchStrengthArgs) => {
        // args.strength: 0 = no pinch, 1 = full pinch
        print('Pinch strength: ' + args.strength)
      })

    this.gestureModule
      .getPinchUpEvent(GestureModule.HandType.Right)
      .add((args: PinchUpArgs) => {
        // args.palmOrientation: vec3
        print('Right pinch up')
      })

    // Use GestureModule.HandType.Left, .Right, or .Both
  }
}
```
### Targeting Gesture (index finger pointing)

```typescript
this.gestureModule
  .getTargetingStartEvent(GestureModule.HandType.Right)
  .add(() => print('Started pointing'))

this.gestureModule
  .getTargetingEndEvent(GestureModule.HandType.Right)
  .add(() => print('Stopped pointing'))
```
### Grab Gesture (fist)

```typescript
this.gestureModule
  .getGrabStartEvent(GestureModule.HandType.Both)
  .add(() => print('Grab started (either hand)'))

this.gestureModule
  .getGrabEndEvent(GestureModule.HandType.Both)
  .add(() => print('Grab released'))
```
### Phone-in-Hand Detection

```typescript
this.gestureModule
  .getPhoneInHandEvent(GestureModule.HandType.Right)
  .add(() => print('User is holding a phone in their right hand'))
```
**GestureModule vs SIK:** Use `GestureModule` when you need raw events and confidence values. Use SIK's `PinchButton`, `DragInteractable`, etc. when you want high-level UI components with built-in visual feedback.
## Spectacles Interaction Kit (SIK)

SIK is Snap's prebuilt AR interaction library. Add it to a project via the Asset Library: search "Spectacles Interaction Kit". All SIK imports use the `SpectaclesInteractionKit.lspkg` package path.
### Key SIK Components

| Component | Purpose |
|---|---|
| `HandInputData` | Access hand pose, finger positions, pinch state per frame |
| `PinchButton` | Trigger an action on pinch; works with either hand |
| `DragInteractable` | Make any scene object draggable by hand |
| `GrabInteractable` | Grab and move objects with a fist gesture |
| `ScrollView` | Scrollable UI list driven by hand swipe |
| `ToggleButton` | On/off button, syncs visual state |
### ToggleButton

```typescript
import { ToggleButton } from 'SpectaclesInteractionKit.lspkg/Components/UI/ToggleButton/ToggleButton'

const toggleButton = this.sceneObject.getComponent(ToggleButton.getTypeName()) as ToggleButton

// React to toggle events
toggleButton.onStateChanged.add((isOn: boolean) => {
  print('Toggle is now: ' + (isOn ? 'ON' : 'OFF'))
  lampObject.enabled = isOn
})

// Read current state
if (toggleButton.isToggledOn) {
  print('Button is currently ON')
}

// Force a state change programmatically
toggleButton.toggle()
```
### ScrollView

```typescript
import { ScrollView } from 'SpectaclesInteractionKit.lspkg/Components/UI/ScrollView/ScrollView'

const scrollView = this.sceneObject.getComponent(ScrollView.getTypeName()) as ScrollView

// Listen for scroll position changes
scrollView.onScrollPositionChanged.add((normalizedPos: number) => {
  // normalizedPos: 0 = top, 1 = bottom
  print('Scroll position: ' + normalizedPos)
  updateVisibleItems(normalizedPos)
})
```
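`updateVisibleItems` above is a placeholder. One way to implement it for a virtualized list (a sketch; names and the windowing scheme are illustrative, not part of SIK) is to map the normalized scroll position onto a window of item indices:

```typescript
// Map a 0–1 scroll position to the window of item indices that should
// currently be visible. Pure logic, independent of any Lens Studio API.
function visibleRange(
  normalizedPos: number,
  itemCount: number,
  visibleCount: number
): { first: number; last: number } {
  const maxFirst = Math.max(0, itemCount - visibleCount)
  const clamped = Math.min(Math.max(normalizedPos, 0), 1)
  const first = Math.round(clamped * maxFirst)
  return { first, last: Math.min(first + visibleCount, itemCount) - 1 }
}
```

The callback would then enable only the scene objects whose indices fall inside the returned range.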
### Reading hand position in script

```typescript
import { HandInputData } from 'SpectaclesInteractionKit.lspkg/Providers/HandInputData/HandInputData'

const handData = HandInputData.getInstance()

const updateEvent = this.createEvent('UpdateEvent')
updateEvent.bind(() => {
  const rightHand = handData.getDominantHand()
  if (rightHand.isPinching()) {
    const pinchPos = rightHand.getPinchPosition()
    print('Pinch at: ' + JSON.stringify(pinchPos))
  }
})
```
## Physics

Lens Studio uses a Bullet-based physics engine. The main components are Body, Collider, and Constraint.
### Setting up a physics object
- Add a Physics Body component (static, kinematic, or dynamic).
- Add a Collider (Box, Sphere, Capsule, or Mesh).
- Dynamic objects respond to gravity and forces automatically.
### Applying forces in script

```typescript
const body = this.sceneObject.getComponent('Physics.BodyComponent')

// Apply an impulse at the object's center
body.applyImpulse(new vec3(0, 500, -200))

// Apply torque
body.applyTorqueImpulse(new vec3(0, 10, 0))

// Set velocity directly (useful for throwing);
// velocity / angularVel are computed elsewhere, e.g. from hand motion
body.velocity = velocity
body.angularVelocity = angularVel
```
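For sizing an impulse, classical mechanics is often enough: to make a body reach height h, the launch speed is v = √(2gh), so the impulse magnitude is J = m·v. A sketch (the 980 default assumes Lens Studio's centimeter world units; tune against the engine's actual gravity setting):

```typescript
// Upward impulse magnitude needed for a body of the given mass to reach
// the given height, from J = m * sqrt(2 * g * h). Units must be
// consistent; Lens Studio world units are centimeters, so g ~ 980 cm/s^2.
function jumpImpulse(mass: number, heightCm: number, gravity: number = 980): number {
  return mass * Math.sqrt(2 * gravity * heightCm)
}
```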
### Throw mechanics (from Throw Lab)

```typescript
// Sample hand position over N frames, compute delta / dt
const velocity = (currentPos.sub(prevPos)).uniformScale(1 / getDeltaTime())
body.velocity = velocity.uniformScale(throwStrength)
```
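A single-frame delta is noisy. A common refinement (a sketch of the "sample over N frames" idea, not the Throw Lab source; a plain tuple stands in for `vec3` so the helper is self-contained) is to average velocity over a short window of samples:

```typescript
type V3 = [number, number, number]

// Keep the last few hand positions and compute velocity between the
// oldest and newest sample, smoothing out single-frame jitter.
class VelocitySampler {
  private samples: { pos: V3; t: number }[] = []
  constructor(private maxSamples: number = 5) {}

  add(pos: V3, time: number): void {
    this.samples.push({ pos, t: time })
    if (this.samples.length > this.maxSamples) this.samples.shift()
  }

  velocity(): V3 {
    if (this.samples.length < 2) return [0, 0, 0]
    const a = this.samples[0]
    const b = this.samples[this.samples.length - 1]
    const dt = b.t - a.t
    if (dt <= 0) return [0, 0, 0]
    return [
      (b.pos[0] - a.pos[0]) / dt,
      (b.pos[1] - a.pos[1]) / dt,
      (b.pos[2] - a.pos[2]) / dt,
    ]
  }
}
```

In a lens you would call `add(...)` from an `UpdateEvent` with the hand position and `getTime()`, then read `velocity()` on release.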
### Physics callbacks

```typescript
body.onCollisionEnter.add((collision) => {
  const other = collision.otherObject
  print('Hit: ' + other.name)

  if (collision.contacts.length > 0) {
    const point = collision.contacts[0].position
    spawnParticles(point)
  }
})
```
## Audio

### Play audio

```typescript
const audioComponent = this.sceneObject.getComponent('Component.AudioComponent')
audioComponent.audioTrack = myAudioTrack // assign in Inspector or via script
audioComponent.play(1) // play once; pass -1 to loop until stop()
audioComponent.stop()
```
### Record and play back voice (from Voice Playback sample)

```typescript
const voiceML = require('LensStudio:VoiceML')

let recordedBuffer: AudioBuffer | null = null

voiceML.startRecording((buffer: AudioBuffer) => {
  recordedBuffer = buffer
})

voiceML.stopRecording()

if (recordedBuffer) {
  audioComponent.playAudioBuffer(recordedBuffer)
}
```
### Audio mixer channels

```typescript
audioComponent.mixerChannel = 'Music' // or 'SFX', 'Voice'
```
### Audio-reactive visuals with AudioSpectrum
AudioSpectrum gives you per-frame frequency band data from any AudioComponent — useful for visualisers, beat-reactive effects, or driving shader parameters.
```typescript
const audioSpectrum = this.sceneObject.getComponent('Component.AudioSpectrumComponent')

const updateEvent = this.createEvent('UpdateEvent')
updateEvent.bind(() => {
  // bands: Float32Array of frequency magnitudes (length depends on band count setting)
  const bands = audioSpectrum.getBands()
  const bass = bands[0] // low frequency (kick drum, bass)
  const mid = bands[Math.floor(bands.length / 2)] // midrange
  const high = bands[bands.length - 1] // high frequency (hi-hat, sibilance)

  // Drive a VFX property or shader uniform:
  vfxComponent.asset.properties['intensity'] = bass
  mat.mainPass.baseColor = new vec4(mid, 0.2, high, 1.0)

  // Drive an object's scale
  const s = 1.0 + bass * 2.0
  this.sceneObject.getTransform().setLocalScale(new vec3(s, s, s))
})
```
Set up `AudioSpectrumComponent` in the Inspector: assign the `AudioComponent` source, set the band count (32 or 64 are common), and choose a linear or logarithmic scale.
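Raw band values jump every frame, which makes visuals flicker. Smoothing them with an exponential moving average before driving scale or color usually looks better. A minimal sketch (plain arrays stand in for the component's Float32Array):

```typescript
// Exponential moving average over spectrum bands. alpha near 0 = heavy
// smoothing (slow to react), alpha near 1 = barely smoothed. When there
// is no previous value for a band, the current value is used as-is.
function smoothBands(prev: number[], current: number[], alpha: number): number[] {
  return current.map((v, i) => {
    const p = prev[i] !== undefined ? prev[i] : v
    return p + alpha * (v - p)
  })
}
```

In the update loop, keep the returned array and feed it back in as `prev` on the next frame.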
## Animation with LSTween
LSTween (bundled in SIK) is a Lens Studio tween library for smooth property animation.
```typescript
import { LSTween } from 'SpectaclesInteractionKit.lspkg/Utils/LSTween/LSTween'

// Move an object to a target position over 0.5 seconds
LSTween.moveToWorld(sceneObject, targetPosition, 0.5)
  .easing(TWEEN.Easing.Quadratic.Out)
  .start()

// Scale up
LSTween.scaleTo(sceneObject, new vec3(1, 1, 1), 0.3).start()

// Fade a screen image
LSTween.colorTo(screenImage, new vec4(1, 1, 1, 0), 0.4).start() // fade out
```
Chain tweens with `.onComplete`:

```typescript
LSTween.moveTo(obj, posA, 0.5)
  .onComplete(() => LSTween.moveTo(obj, posB, 0.5).start())
  .start()
```
## Materials & Shaders

### Modifying material properties at runtime

```typescript
const meshVisual = this.sceneObject.getComponent('Component.RenderMeshVisual')
const mat = meshVisual.mainMaterial.clone() // clone so you don't affect other objects using the same material
meshVisual.mainMaterial = mat

mat.mainPass.baseColor = new vec4(1, 0, 0, 1) // red
mat.mainPass.opacity = 0.5
```
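To animate a color change over several frames (a fade or a damage flash), interpolate between two colors each update and assign the result to `mainPass.baseColor`. A self-contained sketch of the interpolation (plain 4-tuples stand in for `vec4`):

```typescript
type RGBA = [number, number, number, number]

// Linear interpolation between two RGBA colors. t = 0 returns a,
// t = 1 returns b; t is clamped to [0, 1].
function lerpColor(a: RGBA, b: RGBA, t: number): RGBA {
  const k = Math.min(Math.max(t, 0), 1)
  return [
    a[0] + (b[0] - a[0]) * k,
    a[1] + (b[1] - a[1]) * k,
    a[2] + (b[2] - a[2]) * k,
    a[3] + (b[3] - a[3]) * k,
  ]
}
```

In a lens, `t` would typically be elapsed time divided by the fade duration, recomputed in an `UpdateEvent`.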
## Spatial Images (2D → 3D)

```typescript
const spatialImageModule = require('LensStudio:SpatialImageModule')

spatialImageModule.createSpatialImageFromTexture(myTexture, (spatialImage) => {
  spatialImage.setParent(scene.getRootObject(0))
  spatialImage.getTransform().setWorldPosition(targetPosition)
})
```
## Spatial Anchors & Persistent Storage

```typescript
const spatialAnchorModule = require('LensStudio:SpatialAnchorModule')

// Create an anchor at a world position.
// saveToStorage / loadFromStorage are placeholder helpers, e.g. backed
// by persistentStorageSystem (see below).
spatialAnchorModule.createAnchor(worldPosition, (anchor) => {
  saveToStorage('my_anchor', anchor.id)
})

// Later: restore the anchor
const anchorId = loadFromStorage('my_anchor')
spatialAnchorModule.getAnchor(anchorId, (anchor) => {
  sceneObject.getTransform().setWorldPosition(anchor.worldPosition)
})
```
### Persistent Storage (on-device)

```typescript
const storage = global.persistentStorageSystem

storage.store.putString('username', 'Roland')
storage.store.putFloat('highScore', 42.5)

const name = storage.store.getString('username')
const score = storage.store.getFloat('highScore')
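The store's put/get methods handle primitives and strings; for structured data (score lists, settings objects) one pattern is to JSON-encode it into a string slot. A sketch (the `SaveData` shape is illustrative, not a Lens Studio type):

```typescript
interface SaveData {
  highScores: number[]
  lastLevel: string
}

// Serialize structured save data into a single string slot.
function encodeSave(data: SaveData): string {
  return JSON.stringify(data)
}

// Deserialize, falling back to a default when the slot is empty or
// holds corrupted data.
function decodeSave(raw: string, fallback: SaveData): SaveData {
  try {
    return JSON.parse(raw) as SaveData
  } catch (e) {
    return fallback
  }
}
```

Usage would look like `storage.store.putString('save', encodeSave(data))` and `decodeSave(storage.store.getString('save'), defaults)`.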
## Display & sizing (Spectacles)

From the Spatial Design docs: the displays achieve full overlap at 1.1 m from the user; at that distance the visible content area is ~1000×1397 px (~53×77 cm). The focus plane is at 1 m, so place highly detailed content near that distance. The display is portrait, roughly 3:4. Design for hands-free, natural interactions; the OS reserves space on the hand for a system button, and the rest is available for your lens.
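A small trigonometry helper makes these numbers easy to sanity-check when laying out content: for example, the ~53 cm visible width at 1.1 m subtends roughly 27 degrees. This is a sketch, pure geometry rather than anything from the Spectacles API:

```typescript
// Angular size in degrees of content of a given physical size viewed at
// a given distance. Units cancel, so cm/cm or m/m both work.
function angularSizeDeg(size: number, distance: number): number {
  return (2 * Math.atan(size / 2 / distance) * 180) / Math.PI
}
```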
## Common Gotchas

- GestureModule requires Spectacles — it is not available for phone lenses or in the desktop simulator.
- SIK components expect a specific scene hierarchy — read the SIK setup guide in its README before restructuring the scene.
- Physics and the World Mesh: enable the World Mesh Collider in Project Settings → World Understanding so physics objects land on real surfaces.
- Cloning materials: always call `material.clone()` before modifying properties at runtime, otherwise all objects sharing that material change together.
- `getDeltaTime()` is your friend for frame-rate-independent motion.
- Spatial anchors require the user to rescan the area if they move far away; give the user a visual "anchor not found" state.
- Audio latency: use pre-loaded `AudioTrack` assets rather than loading from URL for low-latency sound effects.
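The `getDeltaTime()` point above can be made concrete. A fixed per-frame lerp factor (`x += (target - x) * 0.1`) converges faster at higher frame rates; using an exponential form with the frame's delta time gives identical motion at 30 or 60 fps. A sketch:

```typescript
// Frame-rate-independent exponential approach toward a target value.
// Because 1 - exp(-rate * dt) composes exactly across sub-steps, two
// half-length frames produce the same result as one full frame.
function approach(current: number, target: number, rate: number, dt: number): number {
  return current + (target - current) * (1 - Math.exp(-rate * dt))
}
```

In a lens, `dt` would be `getDeltaTime()` inside an `UpdateEvent`.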
## Reference Examples

- HandAttacher.ts - Complete object joint binding script with SIK `HandInputData` and interpolation.
- Example_ChainTween.ts - Demonstrates advanced `LSTween` chaining and API constraints.