Local-first audio intelligence

Describe the sound.
Find the sound.

Substrate Audio builds tools that understand your sample library the way you do — through language, texture, and feel. No cloud. No subscription. Your sounds, your machine.

First product
Sift BETA
substrate sift — ~/samples (147,382 indexed)
01 — Search
Semantic audio search
Type what you hear in your head. CLAP embeddings map your words into the same space as your audio — matching by meaning, not metadata.
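The matching step can be sketched in a few lines. This is a toy illustration, not Sift's implementation: the file names and 4-dim vectors below are hypothetical stand-ins for real 512-dim CLAP embeddings, and the ranking is plain cosine similarity between the text query's embedding and each file's embedding.

```python
import math

# Hypothetical, pre-computed embeddings. In a CLAP-style system these are
# 512-dim vectors; tiny 4-dim toy vectors are used here for readability.
library = {
    "kick_808.wav":   [0.9, 0.1, 0.0, 0.1],
    "rain_field.wav": [0.1, 0.8, 0.5, 0.0],
    "vinyl_hiss.wav": [0.0, 0.6, 0.7, 0.2],
}
# Hypothetical embedding of the text "soft rain on a window".
query = [0.05, 0.75, 0.55, 0.05]

def cosine(a, b):
    """Cosine similarity: dot product over the product of magnitudes."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Rank every file against the text query, best match first.
ranked = sorted(library, key=lambda name: cosine(query, library[name]),
                reverse=True)
print(ranked[0])
```

Because text and audio live in one shared space, the same similarity function serves both "describe the sound" and "find the sound".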
02 — Discover
Find more like this
Drop a sample in, get acoustically similar results back. Explore your library through sonic relationships you never knew existed.
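Query-by-example is the same primitive pointed at an audio file instead of a phrase: take the seed sample's embedding and return its nearest neighbours. A minimal sketch, again with hypothetical toy embeddings in place of real CLAP vectors:

```python
import heapq
import math

# Hypothetical embeddings standing in for entries in a local vector store.
library = {
    "snare_tight.wav": [0.8, 0.2, 0.1],
    "snare_room.wav":  [0.7, 0.3, 0.2],
    "pad_warm.wav":    [0.1, 0.9, 0.4],
    "kick_sub.wav":    [0.2, 0.1, 0.9],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(x * x for x in b)))

def more_like(seed_name, k=2):
    """Return the k files whose embeddings sit closest to the seed's."""
    seed = library[seed_name]
    scored = ((cosine(seed, vec), name)
              for name, vec in library.items() if name != seed_name)
    return [name for _, name in heapq.nlargest(k, scored)]

print(more_like("snare_tight.wav"))
```

At library scale a brute-force scan gives way to an index, but the ranking criterion stays the same.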
03 — Analyse
Auto-tag, BPM, key
Every file gets analysed on import. Tempo detection, key estimation, instrument classification — all running locally on your hardware.
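One small piece of that pipeline, tempo estimation, can be sketched as follows. Real detectors work on the audio signal itself; this toy version assumes onset times (in seconds) have already been extracted, and derives BPM from the spacing between them. The onset list is invented for illustration.

```python
import statistics

# Hypothetical onset times for a clip at roughly 120 BPM.
onsets = [0.00, 0.50, 1.00, 1.52, 2.00, 2.49, 3.00]

# Inter-onset intervals; the median resists the occasional stray onset.
intervals = [b - a for a, b in zip(onsets, onsets[1:])]
beat_period = statistics.median(intervals)

bpm = round(60.0 / beat_period)
print(bpm)  # → 120
```

Key estimation and instrument classification follow the same pattern: extract features once on import, store the results, and never touch the network.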
04 — Own
Fully local and offline
No cloud. No account. No data leaves your machine. Your sample library stays yours. One-time purchase, runs forever.
05 — See
Embedding space map
Visualise your entire library as a scatter plot in embedding space. See clusters form. Spot gaps. Understand the shape of your collection.
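Flattening 512 dimensions to a 2D scatter plot can be done several ways; production maps typically use PCA, t-SNE, or UMAP. As an illustrative stand-in, the sketch below uses a random linear projection, the simplest scheme that still separates coarse clusters. All embeddings here are randomly generated placeholders.

```python
import random

random.seed(7)
DIM = 512  # CLAP embedding size

# Two random projection axes define the plot's x and y directions.
axes = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(2)]

def project(embedding):
    """Map one DIM-dimensional embedding to an (x, y) point."""
    return tuple(sum(e * a for e, a in zip(embedding, axis))
                 for axis in axes)

# Hypothetical embeddings for three files, projected to 2D.
points = {name: project([random.gauss(0, 1) for _ in range(DIM)])
          for name in ("kick.wav", "snare.wav", "rain.wav")}

for name, (x, y) in points.items():
    print(f"{name}: ({x:.2f}, {y:.2f})")
```

Files that sound alike land near each other, so dense regions read as "what you have a lot of" and empty regions as gaps worth filling.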
06 — Build
CLI and API access
Power users get a full command-line interface and local API. Script your workflows, integrate with your DAW, build on top of Sift.

Technical

  • CLAP neural embeddings (512-dim)
  • Cosine similarity search
  • WAV, AIFF, FLAC, OGG, MP3 support
  • Tauri + Rust desktop shell
  • Python inference backend
  • Local vector database
  • Scales to libraries of 500K+ files

Platform

  • macOS (Apple Silicon + Intel)
  • Windows 10/11
  • Linux (AppImage)
  • One-time purchase — no subscription
  • Offline — no internet required
  • No account or sign-up needed
  • CLI included

Sift is coming soon.

Join the list for early access and launch pricing.