visionOS Immersive Media Developer
Quick Start
Decide first whether the app should use the system AVKit experience or a custom RealityKit playback surface.
- Clarify the media shape: surface video, portal, progressive immersive, full immersive, spatial video, or Apple Immersive Video.
- Load only the matching reference files.
- Treat playback-mode changes as both media and scene-orchestration work.
Load References When
| Reference | When to Use |
|---|---|
| playback-decision-tree.md | Decision tree: window vs portal vs progressive vs full immersive playback. |
| avexperiencecontroller.md | When AVKit AVExperienceController is the right surface for the product. |
| videoplayercomponent-basics.md | When you need to set up VideoPlayerComponent + AVPlayer correctly. |
| apmp-and-spatial-video.md | When the content is spatial video, APMP, or Apple Immersive Video. |
| immersive-viewing-modes.md | When implementing portal, progressive, or full modes and related scene transitions. |
| events-and-transitions.md | When responding to VideoPlayerEvents and managing UI during transitions. |
| comfort-mitigation.md | When handling comfort violations and mitigation strategies on visionOS 26+. |
| apple-immersive-video-authoring.md | When you need Apple Immersive Video authoring or packaging references. |
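As a minimal sketch of the VideoPlayerComponent + AVPlayer path from the table above (the URL and entity name are placeholders, not taken from the references):

```swift
import AVFoundation
import RealityKit

// Placeholder URL; substitute the app's real media source.
let url = URL(string: "https://example.com/video.mov")!
let player = AVPlayer(url: url)

// Attaching VideoPlayerComponent makes this entity render the player's output.
let videoEntity = Entity()
videoEntity.components.set(VideoPlayerComponent(avPlayer: player))

// Add videoEntity to the RealityView content, then start playback.
player.play()
```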
Workflow
- Choose the playback architecture.
- Load the references for that surface and media type.
- Implement playback and viewing-mode transitions.
- Add event handling and comfort mitigation where relevant.
- Summarize the playback path, transition model, and remaining validation work.
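The transition step above can be sketched for the RealityKit surface as follows; desiredImmersiveViewingMode is my recollection of the visionOS 2+ API and should be confirmed against immersive-viewing-modes.md:

```swift
import RealityKit

// Assumes videoEntity already carries a VideoPlayerComponent.
if var component = videoEntity.components[VideoPlayerComponent.self] {
    // Request a portal presentation; the system drives the visual transition.
    component.desiredImmersiveViewingMode = .portal
    videoEntity.components.set(component)
}
```

Remember that if the app also swaps scenes (for example, opening an ImmersiveSpace), this property change is only half of the work.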
Guardrails
- Start with AVKit AVExperienceController when the system player experience fits the product.
- Use RealityKit VideoPlayerComponent when the video must live in a custom scene graph.
- Do not treat immersive-mode transitions as a simple property flip when the app also needs scene changes.
- Make sure the user has a clear exit path from immersive playback.
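For the AVKit path named in the first guardrail, a sketch assuming the visionOS 2 AVExperienceController API (exact signatures should be checked against avexperiencecontroller.md; mediaURL is a placeholder):

```swift
import AVKit

let playerViewController = AVPlayerViewController()
playerViewController.player = AVPlayer(url: mediaURL) // mediaURL: your media source

// The experience controller vends the current and allowed experiences.
let experience = playerViewController.experienceController

// Request the immersive experience; the system animates the change and keeps
// its standard exit affordances, which satisfies the exit-path guardrail.
Task {
    await experience.transition(to: .immersive)
}
```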
Output Expectations
Provide:
- the chosen playback architecture
- the media type and viewing mode
- which references were used
- the transition or event model involved
- the next validation step
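The transition and event model called out above can be observed through RealityKit's VideoPlayerEvents. A subscription sketch, with the event name recalled from memory and worth verifying against events-and-transitions.md:

```swift
import RealityKit
import SwiftUI

struct ImmersivePlayerView: View {
    let videoEntity: Entity

    var body: some View {
        RealityView { content in
            content.add(videoEntity)

            // Fires when the player's immersive viewing mode changes
            // (e.g. portal to full); use it to reposition UI attachments.
            // A real app should retain the returned EventSubscription.
            _ = content.subscribe(
                to: VideoPlayerEvents.ImmersiveViewingModeDidChange.self,
                on: videoEntity
            ) { event in
                print("Viewing mode changed: \(event)")
            }
        }
    }
}
```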
More from tomkrikorian/visionosagents
realitykit-visionos-developer
Build, debug, and optimize RealityKit scenes for visionOS 26, including entity/component setup, rendering, animation, physics, audio, input, attachments, and custom systems. Use when implementing RealityKit features or troubleshooting ECS behavior on visionOS.
arkit-visionos-developer
Build and debug ARKit features for visionOS 26, including ARKitSession setup, authorization, data providers (world tracking, plane detection, scene reconstruction, hand tracking), anchor processing, and RealityKit integration. Use when implementing ARKit workflows on visionOS or troubleshooting provider-specific space, privacy, and lifecycle behavior.
coding-standards-enforcer
Enforce repository coding standards for Swift 6.2 strict concurrency, actor isolation, @Observable models, SWIFT_APPROACHABLE_CONCURRENCY, @concurrent functions, and modern Swift APIs across visionOS app code. Use when reviewing, writing, or migrating Swift code in this plugin's scope.
usd-editor
Guide for modifying USD ASCII (.usda) files, including prims, properties, composition arcs, variants, and transforms. Use when editing or reviewing .usda files by hand.
shadergraph-editor
Author, load, and troubleshoot Reality Composer Pro Shader Graph materials for RealityKit on visionOS. Use when building Shader Graph materials, exposing promoted inputs for runtime control, or debugging exported USD and MaterialX interop.
tkr-skill-writer
Guide for creating and structuring skills with consistent formatting, clear documentation, and proper reference organization. Use when creating new skills or updating existing skill documentation.