LoopMaker Web
A browser-based AI music generation tool powered by ACE-Step, built to run generation locally on Linux on AMD Strix Halo hardware.
I saw a macOS app called LoopMaker that uses ACE-Step v1.5 to generate music loops and thought it was a great idea — but I’m on Linux. So I built a web-based version as a browser frontend talking to a FastAPI backend, using the same model.
What It Does
You describe a track in natural language — genre, mood, instruments, tempo, key — and ACE-Step generates music from that description. The UI has genre presets, a lyrics editor, and controls for duration, guidance scale, and quality mode (draft/fast/quality tradeoffs on inference steps). There are also cover and extend modes for working with existing audio: generate a cover of a reference track, or extend/repaint a section.
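To make the controls concrete, here is a minimal sketch of how a generation request might be assembled. The mode names, step counts, and field names are illustrative assumptions, not the app's actual schema or values:

```python
# Hypothetical mapping of quality modes to inference steps (assumed values).
QUALITY_MODES = {
    "draft": 20,    # fewest steps: fastest, roughest output
    "fast": 40,     # middle ground
    "quality": 80,  # most steps: slowest, best output
}

def build_generation_request(prompt: str, duration_s: float,
                             mode: str = "fast",
                             guidance_scale: float = 7.5) -> dict:
    """Assemble a request body for the backend (hypothetical schema)."""
    return {
        "prompt": prompt,
        "duration": duration_s,
        "guidance_scale": guidance_scale,
        "inference_steps": QUALITY_MODES[mode],
    }

req = build_generation_request(
    "lo-fi hip hop, mellow Rhodes, 80 BPM, A minor", 30.0, mode="draft"
)
```

The point of the mapping is that "quality mode" is just a preset over the number of diffusion inference steps, which is the main speed/fidelity knob.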
Generated tracks go into a library with metadata, playback, favorites, and the ability to trace lineage — which tracks were derived from which. The whole thing runs over WebSocket so you get real-time progress during generation.
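Lineage tracing can be as simple as each track storing the id of the track it was derived from (its cover/extend source) and walking that chain back to the root. A minimal sketch, with assumed field names rather than the app's real schema:

```python
# Hypothetical track store: parent_id points at the cover/extend source.
tracks = {
    "t1": {"title": "original loop", "parent_id": None},
    "t2": {"title": "cover of t1", "parent_id": "t1"},
    "t3": {"title": "extended t2", "parent_id": "t2"},
}

def lineage(track_id: str) -> list[str]:
    """Walk parent links from a track back to its root ancestor."""
    chain = []
    current = track_id
    while current is not None:
        chain.append(current)
        current = tracks[current]["parent_id"]
    return chain

# lineage("t3") walks t3 -> t2 -> t1.
```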
The Stack
The backend is a single FastAPI service that handles model downloading, generation, track management, and audio serving. The frontend is plain HTML/CSS/JS — no build step, no framework. Just a browser talking to localhost.
Getting ACE-Step running on ROCm was its own adventure. PyTorch with ROCm 7.0 wheels works on the Strix Halo hardware, but the model currently segfaults during init when using GPU mode. So for now it defaults to CPU inference (LOOPMAKER_FORCE_CPU=1), which is slower but stable. GPU mode is there as an opt-in for when the ROCm situation improves.
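The CPU-default behavior can be expressed as a small device-selection check. The env var name comes from the text; treating "unset" as CPU-on is an assumption that matches the stated default:

```python
import os

def use_cpu() -> bool:
    """Return True when CPU inference should be used.

    LOOPMAKER_FORCE_CPU defaults to "1" (CPU) because GPU init currently
    segfaults under ROCm on this hardware; set it to "0" to opt in to GPU.
    """
    return os.environ.get("LOOPMAKER_FORCE_CPU", "1") == "1"
```

Gating on an env var keeps the GPU path in the code, so flipping back when a future ROCm release fixes the segfault needs no code change.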