# Mocopi Integration
Sony mocopi is a mobile motion capture system that can stream skeletal data over a local network. Using the MocopiReceiver package, you can drive AR avatars and interactive characters in Spectacles lenses using real-time full-body motion capture.
Official site: Sony mocopi
## Connection Workflow
- **Host Server**: Run the mocopi desktop/mobile app and enable "Send" mode (UDP/WebSocket).
- **WebSocket Client**: Use `MocopiWebSocketClient` in Lens Studio to connect to the host's IP/port.
- **Skeleton Mapping**: Listen for the `SkeletonDefinition`, which contains the hierarchical bone structure.
- **Frame Processing**: Update your 3D avatar bones every time a `FrameData` packet arrives.
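The receive side of this workflow can be sketched as a small message dispatcher. This is a minimal, hypothetical sketch in plain TypeScript: the JSON wire format with a `type` field and the `SkeletonDefinition`/`FrameData` shapes shown here are illustrative assumptions, not the actual framing used by the MocopiReceiver package.

```typescript
// Hypothetical wire format: { type: "skeleton" | "frame", payload: ... }.
// The real package defines its own message framing; adapt accordingly.
type SkeletonDefinition = { boneNames: string[]; parents: number[] };
type FrameData = { rotations: number[][] }; // one quaternion [x, y, z, w] per bone

type Handlers = {
  onSkeleton: (s: SkeletonDefinition) => void;
  onFrame: (f: FrameData) => void;
};

function dispatchMessage(raw: string, handlers: Handlers): void {
  const msg = JSON.parse(raw);
  if (msg.type === "skeleton") {
    // Sent once at the start of a session: defines the bone hierarchy.
    handlers.onSkeleton(msg.payload as SkeletonDefinition);
  } else if (msg.type === "frame") {
    // Sent continuously: per-bone pose data for the current frame.
    handlers.onFrame(msg.payload as FrameData);
  }
  // Unknown message types are ignored so protocol additions don't break the lens.
}
```

Separating the one-shot skeleton message from the per-frame stream mirrors the two callbacks (`onSkeletonReceived` / `onFrameReceived`) used in the implementation below.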
## Basic Implementation
The `MocopiMainController` manages the communication between the network client and the avatar.
```typescript
import { MocopiWebSocketClient } from "./MocopiWebSocketClient";
import { MocopiAvatarController } from "./MocopiAvatarController";

@component
export class MyMocopiBridge extends BaseScriptComponent {
  @input webSocketClient: MocopiWebSocketClient;
  @input avatarController: MocopiAvatarController;

  onAwake() {
    this.webSocketClient.onSkeletonReceived = (skeleton) => {
      // Initialize the avatar mesh with the correct bone count and hierarchy
      this.avatarController.initializeAvatar(skeleton);
    };
    this.webSocketClient.onFrameReceived = (frame) => {
      // Apply real-time rotations/positions to the avatar bones
      this.avatarController.updateFrame(frame);
    };
  }

  public connect() {
    this.webSocketClient.connect();
  }
}
```
## Avatar Configuration
Your 3D model must be rigged with a hierarchy that matches the mocopi bone structure (hips, spine, neck, head, and limbs).
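In practice, rig joint names rarely match the sender's bone names exactly, so a small lookup table is a common way to bridge the two. The sketch below is illustrative only: both the mocopi-style names and the avatar joint names are assumptions, so check them against the `SkeletonDefinition` you actually receive and your own rig.

```typescript
// Hypothetical name mapping from mocopi-style bone names (keys) to the
// joint names of an example avatar rig (values). Neither side is taken
// from the official package; substitute your real names.
const BONE_MAP: Record<string, string> = {
  root: "Hips",
  torso_1: "Spine",
  neck_1: "Neck",
  head: "Head",
};

// Returns the avatar joint to drive, or null if this rig doesn't have one.
function resolveAvatarBone(mocopiName: string): string | null {
  return BONE_MAP[mocopiName] ?? null;
}
```

Returning `null` for unmapped bones lets a simpler rig (e.g. no finger joints) consume the full stream without errors: unmapped bones are simply skipped during the frame update.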
> [!TIP]
> Use the `ResetButton` pattern in your lens to allow users to recalibrate the avatar's world position if it drifts from the tracking center.
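The recalibration in that tip boils down to capturing an offset on reset and subtracting it from every subsequent frame. Here is a minimal sketch of that idea in plain TypeScript; `Vec3` is a stand-in for Lens Studio's `vec3`, and the class name is illustrative, not part of the MocopiReceiver package.

```typescript
type Vec3 = { x: number; y: number; z: number };

class Recenter {
  private offset: Vec3 = { x: 0, y: 0, z: 0 };

  // Call from the reset button: capture the current tracked hip position
  // so the avatar snaps back to the lens origin.
  reset(currentHip: Vec3): void {
    this.offset = { ...currentHip };
  }

  // Apply to every incoming root position before writing it to the avatar.
  apply(p: Vec3): Vec3 {
    return {
      x: p.x - this.offset.x,
      y: p.y - this.offset.y,
      z: p.z - this.offset.z,
    };
  }
}
```

Applying the offset only to the root (hips) position is usually enough, since child bones are driven by rotations relative to their parent.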
## Reference Example
The logic above is extracted from the official `MocopiReceiver.lspkg` package, demonstrating the main orchestration logic for real-time motion capture.
See the reference guide for details.