Spectacles Networking — Reference Guide
Two main mechanisms for network access in Spectacles lenses:

- InternetModule — HTTP/HTTPS via the Fetch API (recommended) or performHttpRequest. From Lens Studio 5.9, these APIs live in InternetModule; prior to 5.9 they were in RemoteServiceModule.
- Web View — embed a web page in the AR scene, with bidirectional JS messaging.

The Remote Service Gateway (RSG) is a third option, but it is specific to AI/cloud-model calls — see the spectacles-ai skill for that.
Official docs: Internet Access (Spectacles developer documentation)
Setup
- Enable Internet Access: Project Settings → Capabilities → ✅ Internet Access. Without this, all requests fail on-device.
- Fetch API: Spectacles OS v5.58.6621+ and Lens Studio v5.3+. Use InternetModule (Lens Studio 5.9+): add the module to your project and call internetModule.fetch(request).
- HTTPS is required for publishable lenses; HTTP requires Experimental APIs and cannot be published. Never ship a lens using HTTP, even in preview builds shared with others — switch to HTTPS before distributing.
- Fetch only works in Preview when Device Type Override is set to Spectacles.
Fetch API (InternetModule)
Use InternetModule and construct a Request, then call internetModule.fetch(request). Response bodies use response.text(), response.json(), or response.bytes() (not body, blob, or arrayBuffer).
Basic GET
```typescript
const internetModule = require('LensStudio:InternetModule')

// Run the request once the lens starts. Inside a component, this code
// belongs in onAwake(); alternatively, add InternetModule as an @input.
this.createEvent('OnStartEvent').bind(async () => {
  const request = new Request('https://api.example.com/data', { method: 'GET' })
  const response = await internetModule.fetch(request)
  if (response.status === 200) {
    const data = await response.json()
    print(JSON.stringify(data))
  }
})
```
POST with JSON body
```typescript
const request = new Request('https://api.example.com/submit', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ key: 'value' })
})
const response = await this.internetModule.fetch(request)
if (!response.ok) { print('HTTP error: ' + response.status); return }
const data = await response.json()
```
With authentication headers
```typescript
const request = new Request('https://api.example.com/secure', {
  headers: {
    'Authorization': 'Bearer ' + myToken,
    'Content-Type': 'application/json'
  }
})
const response = await this.internetModule.fetch(request)
```
Async / Await Pattern
```typescript
@component
export class FetchExample extends BaseScriptComponent {
  private internetModule = require('LensStudio:InternetModule')

  async onAwake() {
    try {
      const request = new Request('https://api.example.com/items')
      const response = await this.internetModule.fetch(request)
      if (!response.ok) {
        print('HTTP error: ' + response.status)
        return
      }
      const items: any[] = await response.json()
      this.displayItems(items)
    } catch (e) {
      print('Network error: ' + e)
    }
  }

  displayItems(items: any[]): void {
    // update scene objects
  }
}
```
Handling Responses
On Spectacles, use bytes(), text(), and json() for response bodies (not body, blob, or arrayBuffer).
| Method | Returns | Use for |
|---|---|---|
| `response.json()` | `Promise<any>` | JSON APIs |
| `response.text()` | `Promise<string>` | Plain text or HTML |
| `response.bytes()` | `Promise<Uint8Array>` | Binary data (images, audio) |
| `response.status` | `number` | HTTP status code |
| `response.ok` | `boolean` | `true` if status 200–299 |
| `response.headers.get('Content-Type')` | `string \| null` | Read a response header |
Always check response.ok before parsing:
```typescript
if (!response.ok) {
  print('HTTP error: ' + response.status)
  return
}
```
performHttpRequest and Remote Media
For a simpler callback-based API, use internetModule.performHttpRequest(request, callback) with a request built via RemoteServiceHttpRequest.create(). To download images, video, glTF, or audio from a URL, call internetModule.makeResourceFromUrl(url), then pass the resource to RemoteMediaModule.loadResourceAsImageTexture() (or loadResourceAsVideoTexture, loadResourceAsGltfAsset, loadResourceAsAudioTrackAsset). See the Internet Access docs for full examples. Use global.deviceInfoSystem.isInternetAvailable() and onInternetStatusChanged to react to connectivity changes.
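If you prefer promises over the callback style, performHttpRequest can be wrapped. The sketch below is an assumption-heavy outline: the module interface and the `statusCode`/`body` field names mimic the callback API but should be verified against the Internet Access docs before use.

```typescript
// Assumed minimal shape of the callback API — the real request object is
// built with RemoteServiceHttpRequest.create(); field names are assumptions.
interface HttpResponseLike { statusCode: number; body: string }
interface HttpModuleLike {
  performHttpRequest(request: { url: string }, callback: (res: HttpResponseLike) => void): void
}

// Wrap the callback API in a Promise so it composes with async/await.
function httpRequestAsync(module: HttpModuleLike, url: string): Promise<HttpResponseLike> {
  return new Promise(resolve => {
    module.performHttpRequest({ url }, response => resolve(response))
  })
}
```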
Multipart Form Data (File Upload)
```typescript
async function uploadFile(url: string, filename: string, blob: Blob): Promise<void> {
  const formData = new FormData()
  formData.append('file', blob, filename)
  const response = await fetch(url, {
    method: 'POST',
    headers: { 'Authorization': 'Bearer ' + myToken },
    body: formData
    // Do NOT set Content-Type manually — fetch sets it with the correct boundary
  })
  if (!response.ok) print('Upload failed: ' + response.status)
}
```
Web View
Web View renders a full web page into a texture that can be applied to any mesh in the scene.
Setup in Lens Studio
- Add a Web View component to a Scene Object.
- Set the URL in the component inspector, or set it at runtime via script.
- Apply the Web View texture to a Screen Image or a Mesh Visual material.
Scripting the Web View
```typescript
// Set URL at runtime
webViewComponent.setUrl('https://myapp.example.com')

// Inject JavaScript into the page
// ⚠️ Only call evaluateJavaScript on pages you control.
// Never inject JS if the URL was provided by a user or an external API.
webViewComponent.evaluateJavaScript("document.getElementById('status').textContent = 'AR connected'")

// Receive messages from the page (the page calls window.lensStudioMessage(data))
webViewComponent.onMessageReceived.add((message: string) => {
  const data = JSON.parse(message)
  // react to data from the web page
})
```
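When the injected value is dynamic, one way to honor the warning above is to serialize it with JSON.stringify, so quotes and newlines cannot break out of the script string. `buildStatusScript` is a hypothetical helper; the 'status' element id matches the example above.

```typescript
// Build an evaluateJavaScript payload with the dynamic value safely
// escaped: JSON.stringify produces a valid JS string literal, so quotes,
// backslashes, and newlines in `status` cannot terminate the script early.
function buildStatusScript(status: string): string {
  return "document.getElementById('status').textContent = " + JSON.stringify(status)
}
```

Then `webViewComponent.evaluateJavaScript(buildStatusScript(someDynamicText))` stays safe even if the text contains quotes.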
Sending messages from web page to lens
In the web page's JavaScript:
```javascript
// This calls back into the lens
window.lensStudioMessage(JSON.stringify({ action: 'buttonClicked', id: 42 }))
```
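On the lens side, the raw string from onMessageReceived can be parsed and dispatched defensively, since the page may send malformed JSON. A minimal sketch; the action names and payload shape are illustrative (matching the example above), not a platform contract, and returning a string is only for observability — a real lens would update scene objects instead.

```typescript
// Illustrative message shape agreed between the page and the lens.
interface WebViewMessage { action: string; id?: number }

// Parse defensively and dispatch on the action field.
function handleWebViewMessage(raw: string): string {
  let data: WebViewMessage
  try {
    data = JSON.parse(raw)
  } catch {
    return 'ignored: malformed JSON'
  }
  switch (data.action) {
    case 'buttonClicked':
      return 'button ' + data.id + ' clicked'
    default:
      return 'unknown action: ' + data.action
  }
}
```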
Hardened Fetch Helper
Combines timeout, status check, and safe JSON parsing in one reusable function:
```typescript
async function fetchJSON<T>(
  internetModule: any,
  url: string,
  options: RequestInit = {},
  timeoutMs = 10000
): Promise<T | null> {
  const timeoutPromise = new Promise<never>((_, reject) =>
    setTimeout(() => reject(new Error('Request timed out')), timeoutMs)
  )
  try {
    const response = await Promise.race([
      internetModule.fetch(new Request(url, options)),
      timeoutPromise
    ]) as Response
    if (!response.ok) {
      print(`[Fetch] HTTP ${response.status} for ${url}`)
      return null
    }
    return await response.json() as T
  } catch (e) {
    print('[Fetch] Error: ' + e)
    return null
  }
}

// Usage:
const data = await fetchJSON<{ items: any[] }>(this.internetModule, 'https://api.example.com/items')
if (data) displayItems(data.items)
```
Common Patterns
Poll for updates on a timer
```typescript
let elapsed: number = 0
const POLL_INTERVAL = 5 // seconds

const updateEvent = this.createEvent('UpdateEvent')
updateEvent.bind(() => {
  elapsed += getDeltaTime()
  if (elapsed >= POLL_INTERVAL) {
    elapsed = 0
    this.pollRemoteData()
  }
})
```
Cache responses to avoid thrashing the network
```typescript
let cachedData: any = null
let cacheTime: number = 0
const CACHE_TTL = 30 // seconds

async function getWithCache(url: string): Promise<any> {
  const now = getTime()
  if (cachedData && (now - cacheTime) < CACHE_TTL) return cachedData
  const r = await fetch(url) // or internetModule.fetch(new Request(url))
  if (!r.ok) throw new Error('HTTP ' + r.status) // validate before parsing
  cachedData = await r.json()
  cacheTime = now
  return cachedData
}
```
Request timeout with Promise.race
```typescript
function fetchWithTimeout(url: string, timeoutMs: number): Promise<Response> {
  const timeoutPromise = new Promise<never>((_, reject) =>
    setTimeout(() => reject(new Error('Request timed out')), timeoutMs)
  )
  return Promise.race([fetch(url), timeoutPromise])
}
```
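Timeouts pair naturally with retries for transient failures. The sketch below retries any promise-returning request with exponential backoff; `fetchWithRetry` is a hypothetical helper, not a platform API, and the attempt count and delays are illustrative defaults.

```typescript
// Retry a promise-returning request up to `attempts` times, doubling the
// delay between tries (e.g. 500 ms, 1000 ms, ... with the defaults shown).
async function fetchWithRetry<T>(
  doRequest: () => Promise<T>,
  attempts: number = 3,
  baseDelayMs: number = 500
): Promise<T> {
  let lastError: unknown
  for (let i = 0; i < attempts; i++) {
    try {
      return await doRequest()
    } catch (e) {
      lastError = e
      if (i < attempts - 1) {
        // Exponential backoff before the next attempt
        await new Promise(resolve => setTimeout(resolve, baseDelayMs * Math.pow(2, i)))
      }
    }
  }
  throw lastError
}
```

This composes with the timeout helper above, e.g. `await fetchWithRetry(() => fetchWithTimeout(url, 10000), 3)`.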
Permissions & Privacy
Combining internet connectivity with camera, microphone, or location in the same lens triggers Snap's Transparent Permission system: the OS shows the user a consent dialog on launch, and the device LED blinks while sensitive data is captured. Exception: calls via the Remote Service Gateway (RSG) do not count as external connectivity — you can use RSG with camera/microphone without triggering the Transparent Permission prompt.
Common Gotchas
- InternetModule (Lens Studio 5.9+): use `require('LensStudio:InternetModule')` and `internetModule.fetch(request)`; global `fetch` may not be available. Enable Internet Access in Project Settings → Capabilities or network calls fail.
- Request constructor: does not support taking another `Request` as input; use a URL string.
- Response bodies: use `response.bytes()`, `response.text()`, `response.json()` — `body`, `blob`, and `arrayBuffer` are not supported.
- Always check `response.ok` before calling `.json()` or `.text()` — a non-200 response body may not be valid JSON and will throw.
- CORS: web APIs must allow Spectacles' origin. HTTPS is required for publishable lenses; HTTP requires Experimental APIs and cannot be published.
- Fetch does not time out automatically — wrap with `Promise.race` if your backend might be slow.
- Web View performance: large or animation-heavy web pages can hurt framerate. Prefer static or lightly interactive pages.
- Multipart uploads: do not set `Content-Type` manually when using `FormData` — fetch sets it with the correct `boundary`.