# Remotion Render

Render videos from React/Remotion component code via the inference.sh CLI.

## Quick Start

Requires the inference.sh CLI (`infsh`); see the install instructions, then authenticate and render:

```bash
infsh login

# Render a simple animation
infsh app run infsh/remotion-render --input '{
  "code": "import { useCurrentFrame, AbsoluteFill } from \"remotion\"; export default function Main() { const frame = useCurrentFrame(); return <AbsoluteFill style={{backgroundColor: \"#000\", display: \"flex\", justifyContent: \"center\", alignItems: \"center\"}}><h1 style={{color: \"white\", fontSize: 100, opacity: frame / 30}}>Hello World</h1></AbsoluteFill>; }",
  "duration_seconds": 3,
  "fps": 30,
  "width": 1920,
  "height": 1080
}'
```
## Input Schema

| Parameter | Type | Required | Description |
|---|---|---|---|
| `code` | string | Yes | React component TSX code. Must export default a component. |
| `composition_id` | string | No | Composition ID to render |
| `props` | object | No | Props passed to the component |
| `width` | number | No | Video width in pixels |
| `height` | number | No | Video height in pixels |
| `fps` | number | No | Frames per second |
| `duration_seconds` | number | No | Video duration in seconds |
| `codec` | string | No | Output codec |
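Since the payload passed to `--input` is plain JSON, it can be built programmatically before shelling out to the CLI. A minimal Python sketch (the field values below are just the examples used throughout this document, not documented defaults):

```python
import json

# Build the --input payload for `infsh app run infsh/remotion-render`.
# Only `code` is required; the other fields mirror the optional schema above.
payload = {
    "code": 'import { AbsoluteFill } from "remotion"; '
            "export default function Main() { return <AbsoluteFill />; }",
    "duration_seconds": 3,
    "fps": 30,
    "width": 1920,
    "height": 1080,
}

input_json = json.dumps(payload)
print(input_json)
```

The resulting string is what the shell examples in this document pass inline after `--input`.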
## Available Imports

Your TSX code can import from `remotion` and `react`:

```tsx
// Remotion APIs
import {
  useCurrentFrame,
  useVideoConfig,
  spring,
  interpolate,
  AbsoluteFill,
  Sequence,
  Audio,
  Video,
  Img
} from "remotion";

// React
import React, { useState, useEffect } from "react";
```
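The workhorse of frame-based animation here is `interpolate`, which maps a frame range to a value range. As an illustrative sketch of that mapping (Remotion's real `interpolate` also supports easing and configurable extrapolation, which this simplified version replaces with clamping):

```python
def linear_interpolate(frame, in_range, out_range):
    """Map `frame` linearly from in_range to out_range,
    clamping outside the input range. A simplified model of
    Remotion's `interpolate`, not its actual implementation."""
    (x0, x1), (y0, y1) = in_range, out_range
    if frame <= x0:
        return y0
    if frame >= x1:
        return y1
    t = (frame - x0) / (x1 - x0)
    return y0 + t * (y1 - y0)

# Opacity fade-in over the first 30 frames, as in the fade-in example:
print(linear_interpolate(15, (0, 30), (0, 1)))  # 0.5
print(linear_interpolate(45, (0, 30), (0, 1)))  # 1 (clamped)
```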
## Examples

### Fade-In Text

```bash
infsh app run infsh/remotion-render --input '{
  "code": "import { useCurrentFrame, AbsoluteFill, interpolate } from \"remotion\"; export default function Main() { const frame = useCurrentFrame(); const opacity = interpolate(frame, [0, 30], [0, 1]); return <AbsoluteFill style={{backgroundColor: \"#1a1a2e\", display: \"flex\", justifyContent: \"center\", alignItems: \"center\"}}><h1 style={{color: \"#eee\", fontSize: 80, opacity}}>Welcome</h1></AbsoluteFill>; }",
  "duration_seconds": 2,
  "fps": 30,
  "width": 1920,
  "height": 1080
}'
```
### Animated Counter

```bash
infsh app run infsh/remotion-render --input '{
  "code": "import { useCurrentFrame, useVideoConfig, AbsoluteFill } from \"remotion\"; export default function Main() { const frame = useCurrentFrame(); const { fps, durationInFrames } = useVideoConfig(); const progress = Math.floor((frame / durationInFrames) * 100); return <AbsoluteFill style={{backgroundColor: \"#000\", display: \"flex\", justifyContent: \"center\", alignItems: \"center\", flexDirection: \"column\"}}><h1 style={{color: \"#fff\", fontSize: 200}}>{progress}%</h1><p style={{color: \"#666\", fontSize: 30}}>Loading...</p></AbsoluteFill>; }",
  "duration_seconds": 5,
  "fps": 60,
  "width": 1080,
  "height": 1080
}'
```
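The counter works because `useVideoConfig()` exposes `durationInFrames`, which for this input is `duration_seconds × fps` = 5 × 60 = 300 frames. A small Python sketch of the same arithmetic (assuming that frame-count relationship, which matches how the inputs above are specified):

```python
def progress_percent(frame, duration_seconds, fps):
    """Counter value at a given frame, mirroring the example's
    Math.floor((frame / durationInFrames) * 100)."""
    duration_in_frames = duration_seconds * fps
    return (frame * 100) // duration_in_frames

print(progress_percent(0, 5, 60))    # 0
print(progress_percent(150, 5, 60))  # 50
print(progress_percent(299, 5, 60))  # 99
```

Frames run from 0 to `durationInFrames - 1`, so the counter tops out at 99% on the last rendered frame.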
### Spring Animation

```bash
infsh app run infsh/remotion-render --input '{
  "code": "import { useCurrentFrame, useVideoConfig, spring, AbsoluteFill } from \"remotion\"; export default function Main() { const frame = useCurrentFrame(); const { fps } = useVideoConfig(); const scale = spring({ frame, fps, config: { damping: 10, stiffness: 100 } }); return <AbsoluteFill style={{backgroundColor: \"#6366f1\", display: \"flex\", justifyContent: \"center\", alignItems: \"center\"}}><div style={{width: 200, height: 200, backgroundColor: \"white\", borderRadius: 20, transform: `scale(${scale})`}} /></AbsoluteFill>; }",
  "duration_seconds": 2,
  "fps": 60,
  "width": 1080,
  "height": 1080
}'
```
### With Props

```bash
infsh app run infsh/remotion-render --input '{
  "code": "import { AbsoluteFill } from \"remotion\"; export default function Main({ title, subtitle }) { return <AbsoluteFill style={{backgroundColor: \"#000\", display: \"flex\", justifyContent: \"center\", alignItems: \"center\", flexDirection: \"column\"}}><h1 style={{color: \"#fff\", fontSize: 80}}>{title}</h1><p style={{color: \"#888\", fontSize: 40}}>{subtitle}</p></AbsoluteFill>; }",
  "props": {"title": "My Video", "subtitle": "Created with Remotion"},
  "duration_seconds": 3,
  "fps": 30,
  "width": 1920,
  "height": 1080
}'
```
### Sequence Animation

```bash
infsh app run infsh/remotion-render --input '{
  "code": "import { useCurrentFrame, AbsoluteFill, Sequence, interpolate } from \"remotion\"; function FadeIn({ children }) { const frame = useCurrentFrame(); const opacity = interpolate(frame, [0, 20], [0, 1]); return <div style={{ opacity }}>{children}</div>; } export default function Main() { return <AbsoluteFill style={{backgroundColor: \"#000\", display: \"flex\", justifyContent: \"center\", alignItems: \"center\", flexDirection: \"column\", gap: 20}}><Sequence from={0}><FadeIn><h1 style={{color: \"#fff\", fontSize: 60}}>First</h1></FadeIn></Sequence><Sequence from={30}><FadeIn><h1 style={{color: \"#fff\", fontSize: 60}}>Second</h1></FadeIn></Sequence><Sequence from={60}><FadeIn><h1 style={{color: \"#fff\", fontSize: 60}}>Third</h1></FadeIn></Sequence></AbsoluteFill>; }",
  "duration_seconds": 4,
  "fps": 30,
  "width": 1920,
  "height": 1080
}'
```
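The staggering works because `<Sequence from={N}>` shifts its children's clock: inside the sequence, `useCurrentFrame()` returns the global frame minus `N`, and the children are not mounted before frame `N`. A simplified Python model of that offset (real Sequences also support `durationInFrames`, omitted here):

```python
def sequence_local_frame(global_frame, start):
    """Local frame seen by children of <Sequence from={start}>,
    or None before the sequence starts. A simplified model of
    Remotion's behavior, not its implementation."""
    if global_frame < start:
        return None
    return global_frame - start

# At global frame 45, the three sequences from the example above:
print(sequence_local_frame(45, 0))   # 45
print(sequence_local_frame(45, 30))  # 15
print(sequence_local_frame(45, 60))  # None (not started)
```

This is why each `FadeIn` restarts its 20-frame opacity ramp when its sequence begins.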
## Python SDK

```python
from inferencesh import inference

client = inference()

result = client.run({
    "app": "infsh/remotion-render",
    "input": {
        "code": """
import { useCurrentFrame, AbsoluteFill, interpolate } from "remotion";

export default function Main() {
  const frame = useCurrentFrame();
  const opacity = interpolate(frame, [0, 30], [0, 1]);
  return (
    <AbsoluteFill style={{
      backgroundColor: "#1a1a2e",
      display: "flex",
      justifyContent: "center",
      alignItems: "center"
    }}>
      <h1 style={{ color: "#eee", fontSize: 80, opacity }}>
        Hello from Python
      </h1>
    </AbsoluteFill>
  );
}
""",
        "duration_seconds": 3,
        "fps": 30,
        "width": 1920,
        "height": 1080
    }
})

print(result["output"]["video"])
```
### Streaming Progress

```python
for update in client.run({
    "app": "infsh/remotion-render",
    "input": {
        "code": "...",
        "duration_seconds": 10
    }
}, stream=True):
    if update.get("progress"):
        print(f"Rendering: {update['progress']}%")
    if update.get("output"):
        print(f"Video: {update['output']['video']}")
```
## Related Skills

```bash
# Remotion best practices (component patterns)
npx skills add remotion-dev/skills@remotion-best-practices

# AI video generation (for AI-generated clips)
npx skills add inference-sh/skills@ai-video-generation

# Image generation (for video assets)
npx skills add inference-sh/skills@ai-image-generation

# Python SDK reference
npx skills add inference-sh/skills@python-sdk

# Full platform skill
npx skills add inference-sh/skills@infsh-cli
```
## Documentation

- Remotion Documentation - Official Remotion docs
- Running Apps - How to run apps via the CLI
- Streaming Results - Real-time progress updates