UK music industry should adopt five key principles for regulating AI use, industry body says

While its output quality is still hard to judge, something called Stable Audio has now appeared, and it feels like we have reached the point where the industry needed to say something.

Stable Audio represents cutting-edge audio generation research from Stability AI's generative audio research lab, Harmonai. We continue to refine our model architectures, datasets, and training procedures to improve output quality, controllability, inference speed, and output length.

  1. Where licensing deals are negotiated in respect of AI technologies, the explicit consent of individual music-makers must be secured before music is used to train AI models. Such consent cannot be inferred by rights-holders or technology companies.
  2. The publicity, personality and personal data rights of music-makers must be respected. These rights belong to individual music-makers and cannot be exploited – by AI companies or rights-holders – without explicit consent. The UK government should clarify and strengthen these rights, and collaborate internationally to promote a robust global rights regime.
  3. Where permission is granted, music-makers must share fairly in the financial rewards of music AI, including from music generated by AI models trained on their work.
  4. As AI companies and rights-holders develop licensing models, they must proactively consult music-makers and reach agreement on how each stakeholder will share in the revenue from AI products and services.
  5. AI-generated works must be clearly labelled as such and AI companies must be fully transparent about the music that has been used to train their models, keeping and making available complete records of datasets. Rights-holders must be transparent about all licensing deals that have been negotiated with AI companies and what works those deals include.
UK music-makers call for consent, respect and remuneration to be central to all music AI – Council of Music Makers