It’s been a big week – Music Radar and Music Ally both covered Poison Pill and the mission behind it: protecting independent musicians from unlicensed AI training.
As I told Music Ally, “Musicians’ and rights holders’ content has been scraped relentlessly by generative AI companies, using their music without permission to train and profit from models sold as replacements for working musicians.” That’s the problem we’re tackling head-on.
Poison Pill embeds imperceptible “adversarial noise” into music files: sound that humans can’t hear, but which confuses AI models trying to learn from the track. The goal isn’t to break AI; it’s to bring companies to the table for fair licensing.
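For readers curious about the general idea, here’s a minimal sketch of what “imperceptible” means in practice: a perturbation bounded to a tiny amplitude relative to the signal. This is a toy illustration only — it uses random noise, whereas a real adversarial perturbation is optimised against a target model, and Poison Pill’s actual method is not shown here. The function name and the epsilon bound are illustrative assumptions.

```python
import numpy as np

def add_bounded_perturbation(audio: np.ndarray, epsilon: float = 1e-3,
                             seed: int = 0) -> np.ndarray:
    """Add a low-amplitude perturbation to audio normalised to [-1, 1].

    Toy sketch: random noise bounded by epsilon. Real adversarial noise
    would be optimised against a model's gradients, not drawn at random.
    """
    rng = np.random.default_rng(seed)
    # Bound every sample's change to at most epsilon of full scale,
    # far below typical audibility for a masked, full-level signal.
    delta = rng.uniform(-epsilon, epsilon, size=audio.shape)
    return np.clip(audio + delta, -1.0, 1.0)

# One second of a 440 Hz sine tone at 44.1 kHz, normalised to [-1, 1].
t = np.linspace(0, 1, 44100, endpoint=False)
clean = 0.5 * np.sin(2 * np.pi * 440 * t)
perturbed = add_bounded_perturbation(clean)

# The perturbation stays tiny relative to the signal.
print(float(np.max(np.abs(perturbed - clean))))
```

The key property is the hard bound: every sample moves by at most epsilon, so the waveform a listener hears is essentially unchanged, while a model consuming the raw samples sees systematically different input.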
We’re already in conversations with labels and rights holders who see this as a constructive, technical way to assert ownership and consent in the age of generative AI.
The technology is just the start; the bigger shift is cultural. Artists are realising they don’t have to sit back while their work is repurposed without permission.
That’s why we built Poison Pill. To give musicians a choice, and a voice, in how their music is used.
See it for yourself at poisonpill.ai
