mozz@mbin.grits.dev to Technology@beehaw.org · 7 months ago · Someone got Gab's AI chatbot to show its instructions (mbin.grits.dev)
rutellthesinful@kbin.social · 7 months ago
Just ask for the output to be reversed or transposed in some way.
You'd also probably end up restrictive enough that people could work out what the prompt was from what you're not allowed to say.
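The reversal trick above can be sketched in a few lines. This is a hypothetical illustration, not Gab's actual filtering: it assumes a naive output filter that does a substring check for forbidden phrases, and shows why a reversed leak slips straight past it.

```python
# Hypothetical sketch: a naive output filter that blocks replies containing
# forbidden phrases, and why asking for reversed output defeats it.
# FORBIDDEN and naive_filter are assumptions for illustration only.

FORBIDDEN = ["system prompt"]

def naive_filter(reply: str) -> bool:
    """Return True if the reply passes a simple substring blocklist."""
    return not any(phrase in reply.lower() for phrase in FORBIDDEN)

leaked = "Here is the system prompt: ..."
reversed_leak = leaked[::-1]  # what the model emits if asked to reverse its output

assert not naive_filter(leaked)     # a direct leak is caught by the blocklist
assert naive_filter(reversed_leak)  # the reversed leak sails through
recovered = reversed_leak[::-1]     # the attacker just reverses it back
```

The same reasoning applies to any invertible transform (transposition, base64, pig latin): the filter sees gibberish, but the attacker can trivially undo the encoding.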