# tfjs-skill: TensorFlow.js Best Practices
Comprehensive best practices guide for TensorFlow.js applications, designed for AI agents and LLMs. Contains 30+ rules across 7 categories, prioritized by impact to guide code generation, review, and refactoring.
## When to Apply
Reference these guidelines when:
- Writing new TensorFlow.js code (browser, Node.js, or React Native)
- Creating or loading ML models with TF.js APIs
- Implementing data preprocessing pipelines with tf.data or tf.tensor
- Optimizing inference or training performance
- Running .tflite models in the browser with LiteRT.js
- Debugging memory leaks, slow inference, or numerical issues
- Choosing between WebGL, WASM, WebGPU, or Node.js backends
## Rule Categories by Priority
| Priority | Category | Impact | Prefix |
|---|---|---|---|
| 1 | Memory Management | CRITICAL | memory- |
| 2 | Data Loading & Preprocessing | CRITICAL | data- |
| 3 | Model Creation & Architecture | HIGH | model- |
| 4 | Training Optimization | HIGH | training- |
| 5 | Inference Performance | HIGH | inference- |
| 6 | TFLite Inference | HIGH | tflite- |
| 7 | Backend Selection & Configuration | MEDIUM | backend- |
## Quick Reference
### 1. Memory Management (CRITICAL)

- `memory-tidy-intermediate` - Wrap tensor operations in tf.tidy() to auto-dispose intermediates
- `memory-no-async-tidy` - Never pass async functions to tf.tidy()
- `memory-dispose-outputs` - Always dispose model outputs after extracting data
- `memory-monitor-leaks` - Use tf.memory() as a leak sentinel in production
- `memory-variables-manual` - Variables require explicit disposal; tf.tidy() won't clean them
- `memory-keep-owned-tensors` - Use tf.keep() for tensors that must outlive the tidy scope
- `memory-dispose-variablegrads` - Dispose tf.variableGrads() results after applyGradients
- `memory-dispose-models` - Dispose models when replacing or unmounting them (SPA/page unload)
### 2. Data Loading & Preprocessing (CRITICAL)

- `data-async-readback` - Use tensor.data() instead of dataSync() in UI contexts
- `data-normalize-inputs` - Normalize input data consistently before model operations
- `data-streaming-large` - Use the tf.data API for streaming large datasets
- `data-image-preprocessing` - Standard image preprocessing pipeline with tf.browser.fromPixels
- `data-shuffle-seed` - Use fixed seeds for reproducible data pipelines
### 3. Model Creation & Architecture (HIGH)

- `model-api-selection` - Choose between the Sequential, Functional, and Core APIs
- `model-graph-vs-layers` - Use GraphModel for inference-only, LayersModel for training
- `model-warmup` - Always warm up models before measuring or serving inference
- `model-save-storage` - Choose the right storage scheme for model persistence
- `model-loading-local-first-remote-fallback` - Try the local model first, then fall back to the hosted one
### 4. Training Optimization (HIGH)

- `training-callbacks-ui` - Use async callbacks to keep the UI responsive during training
- `training-fit-dataset` - Use model.fitDataset() for large datasets
- `training-visualization` - Use tfjs-vis for monitoring training metrics
- `training-fitdataset-infinite-stream` - Use batchesPerEpoch with infinite datasets
- `training-webworker-offload` - Move long browser training jobs to Web Workers
### 5. Inference Performance (HIGH)

- `inference-prod-mode` - Enable prod mode for production inference
- `inference-graph-async` - Use executeAsync() for models with control flow ops
- `inference-profiling` - Use tf.time() and tf.profile() for performance analysis
- `inference-realtime-debounce` - Debounce/cancel stale inference on user input events
### 6. TFLite Inference (HIGH)

- `tflite-native-api` - Use the LiteRT native API for standalone .tflite inference
- `tflite-tfjs-interop` - Share the WebGPU device between TF.js and LiteRT
- `tflite-memory-delete` - Always call .delete() on LiteRT tensors
- `tflite-gpu-fallback` - Detect WebGPU support and fall back to WASM
- `tflite-tiling-large` - Use a tiling strategy for large image inputs
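The fallback rule can be sketched with the standard `navigator.gpu` feature test; how the chosen accelerator is passed to LiteRT.js (the commented `loadAndCompile` call and its `accelerator` option) is an assumption about its loading API, not a confirmed signature:

```javascript
// Sketch of tflite-gpu-fallback: choose an accelerator based on WebGPU support.
// The `nav` parameter exists only to make the check testable outside a browser.
function pickAccelerator(nav) {
  const n = nav ?? (typeof navigator !== 'undefined' ? navigator : {});
  return 'gpu' in n ? 'webgpu' : 'wasm';
}

// Hypothetical usage with LiteRT.js (API names are assumptions):
// const model = await loadAndCompile('model.tflite', { accelerator: pickAccelerator() });
// ...and per tflite-memory-delete, call .delete() on every LiteRT tensor you create.
```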
### 7. Backend Selection & Configuration (MEDIUM)

- `backend-explicit-init` - Explicitly set the backend with a fallback chain
- `backend-wasm-threads` - Configure COOP/COEP headers for WASM multi-threading
- `backend-mobile-precision` - Check float32 capability on mobile WebGL
## How to Use
Read the individual rule files for detailed explanations and code examples, e.g.:

- `rules/memory-tidy-intermediate.md`
- `rules/data-async-readback.md`
Each rule file contains:
- Brief explanation of why it matters
- Incorrect code example with explanation
- Correct code example with explanation
- Additional context and references
## Full Compiled Document
For the complete guide with all rules expanded, see AGENTS.md.