mozz@mbin.grits.dev to Technology@beehaw.org · 8 months agoSomeone got Gab's AI chatbot to show its instructionsmbin.grits.devimagemessage-square200fedilinkarrow-up1485arrow-down10file-text
arrow-up1485arrow-down1imageSomeone got Gab's AI chatbot to show its instructionsmbin.grits.devmozz@mbin.grits.dev to Technology@beehaw.org · 8 months agomessage-square200fedilinkfile-text
minus-squarerutellthesinful@kbin.sociallinkfedilinkarrow-up4·8 months agojust ask for the output to be reversed or transposed in some way you’d also probably end up restrictive enough that people could work out what the prompt was by what you’re not allowed to say
just ask for the output to be reversed or transposed in some way
you’d also probably end up restrictive enough that people could work out what the prompt was by what you’re not allowed to say