A new tool lets artists add invisible changes to the pixels in their art before they upload it online so that if it’s scraped into an AI training set, it can cause the resulting model to break in chaotic and unpredictable ways.

The tool, called Nightshade, is intended as a way to fight back against AI companies that use artists’ work to train their models without the creator’s permission.
[…]
Zhao’s team also developed Glaze, a tool that allows artists to “mask” their own personal style to prevent it from being scraped by AI companies. It works in a similar way to Nightshade: by changing the pixels of images in subtle ways that are invisible to the human eye but manipulate machine-learning models to interpret the image as something different from what it actually shows.
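
The article doesn't publish the actual algorithm, but the general idea it describes (small, bounded pixel perturbations that shift how a feature extractor "reads" the image) resembles standard adversarial-perturbation techniques. Below is a minimal, hypothetical sketch of that general idea, not Nightshade or Glaze itself: it nudges an image's features toward a different target concept while keeping every pixel within a small budget. The encoder here is an untrained ResNet-18 stand-in; real tools would target the feature extractors used by generative models.

```python
# Illustrative sketch only -- NOT the Nightshade/Glaze algorithm.
# Perturb an image within +/- epsilon per pixel so an encoder maps it
# closer to a different "decoy" concept in feature space.
import torch
import torch.nn.functional as F
from torchvision.models import resnet18

def poison_image(image, target_features, encoder, epsilon=4 / 255, steps=50, lr=0.01):
    """Nudge `image` toward `target_features` in the encoder's feature space,
    keeping the perturbation small enough to stay visually subtle."""
    delta = torch.zeros_like(image, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        perturbed = (image + delta).clamp(0.0, 1.0)
        features = encoder(perturbed)
        # Pull the perturbed image's features toward the decoy concept.
        loss = 1.0 - F.cosine_similarity(features, target_features).mean()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        # Project back into the allowed per-pixel budget.
        with torch.no_grad():
            delta.clamp_(-epsilon, epsilon)
    return (image + delta).detach().clamp(0.0, 1.0)

if __name__ == "__main__":
    encoder = resnet18(weights=None).eval()   # stand-in feature extractor
    for p in encoder.parameters():
        p.requires_grad_(False)
    original = torch.rand(1, 3, 224, 224)     # placeholder "artwork"
    decoy = torch.rand(1, 3, 224, 224)        # image of the decoy concept
    poisoned = poison_image(original, encoder(decoy), encoder)
    print("max pixel change:", (poisoned - original).abs().max().item())
```

To a human the poisoned copy looks essentially identical to the original (every pixel moved by at most epsilon), but a model trained on many such images can learn a skewed association between the artwork and the decoy concept.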

  • Meowoem@sh.itjust.works · 1 year ago

    It doesn’t even need a workaround; it’s not going to affect anything when training a model.

    It might make style transfer harder when using them as reference images on some models, but even that’s fairly doubtful; it’s just noise on an image, and everything is already full of all sorts of different kinds of noise.