lens-studio-camera-texture
Camera Textures
Lens Studio lets developers intercept the background camera feed for custom rendering loops or image processing. In mobile phone lenses this is straightforward, but on Spectacles you must explicitly request the correct camera ID (e.g. Right_Color), since the device uses multiple camera inputs for rendering and tracking.
Official docs: Spectacles Home
Requesting the Background Feed
The CameraModule handles raw camera requests.
[!IMPORTANT] Because the Lens Studio Editor cannot simulate dual-camera passthrough, use Default_Color when running in the simulation and Right_Color when building for an actual Spectacles device.
@component
export class CameraFeedIntercept extends BaseScriptComponent {
  @input
  camModule: CameraModule;

  onAwake() {
    // Request the camera once the lens has started, not during awake
    this.createEvent("OnStartEvent").bind(() => {
      const isEditor = global.deviceInfoSystem.isEditor();
      const camID = isEditor
        ? CameraModule.CameraId.Default_Color
        : CameraModule.CameraId.Right_Color;

      const camRequest = CameraModule.createCameraRequest();
      camRequest.cameraId = camID;

      // Optionally downsample the resolution to improve performance
      // camRequest.imageSmallerDimension = isEditor ? 352 : 756;

      // This Texture can now be applied to any material or UI element
      const rawCameraTexture = this.camModule.requestCamera(camRequest);
    });
  }
}
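The Editor-vs-device switch can also be factored into a plain helper that is unit-testable outside Lens Studio. This is a hypothetical sketch — cameraSettingsFor and CamRequestSettings are not part of the Lens Studio API; the values mirror the example above:

```typescript
// Hypothetical helper (not a Lens Studio API): compute camera request
// settings per environment, mirroring the example values above.
interface CamRequestSettings {
  cameraId: "Default_Color" | "Right_Color";
  imageSmallerDimension: number;
}

function cameraSettingsFor(isEditor: boolean): CamRequestSettings {
  return isEditor
    ? { cameraId: "Default_Color", imageSmallerDimension: 352 }
    : { cameraId: "Right_Color", imageSmallerDimension: 756 };
}
```

Inside the component you would map the cameraId string back to the CameraModule.CameraId enum before assigning it to the request.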
Applying to a Material
Once you have retrieved the camera Texture from the CameraModule, you can assign it directly to a standard Lens Studio Material: set the mainPass property that expects a texture (typically baseTex; baseColor is the color multiplier, not the texture slot).
@input
targetMaterial: Material;

//...
// baseTex is the texture slot on standard materials
this.targetMaterial.mainPass.baseTex = rawCameraTexture;
Cropping and Manipulating
If you are passing camera visual data into a SnapML model or a UI sub-view, you frequently need to crop it. You can supply the camera feed to a CropTextureProvider.
@input
screenCropTexture: Texture;

//...
const cropTexControl = this.screenCropTexture.control as CropTextureProvider;

// Inject the live camera feed into the crop provider
cropTexControl.inputTexture = rawCameraTexture;

// screenCropTexture now renders a cropped subsection of the camera feed
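SnapML models often expect square input, so a common need is computing a centered square crop region. The helper below is a hypothetical, framework-free sketch (centeredSquareCrop and NormalizedRect are not Lens Studio APIs); it returns a region in normalized [0..1] coordinates:

```typescript
// Hypothetical helper: centered square crop in normalized [0..1] coordinates,
// e.g. for feeding a square-input SnapML model.
interface NormalizedRect {
  x: number;      // left edge, normalized
  y: number;      // top edge, normalized
  width: number;  // normalized width
  height: number; // normalized height
}

function centeredSquareCrop(texWidth: number, texHeight: number): NormalizedRect {
  const side = Math.min(texWidth, texHeight);
  return {
    x: (texWidth - side) / 2 / texWidth,
    y: (texHeight - side) / 2 / texHeight,
    width: side / texWidth,
    height: side / texHeight,
  };
}
```

How you apply such a region depends on the provider in use; check the CropTextureProvider documentation for the exact property names.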
Reference Example
The official CompositeCameraTexture package demonstrates a robust approach for toggling camera request properties based on the environment (Editor vs Device) during development. See the reference guide for details.