LainTrain@lemmy.dbzer0.com to memes@lemmy.world · 1 year agoBlursed Botlemmy.dbzer0.comimagemessage-square70fedilinkarrow-up1837arrow-down116
arrow-up1821arrow-down1imageBlursed Botlemmy.dbzer0.comLainTrain@lemmy.dbzer0.com to memes@lemmy.world · 1 year agomessage-square70fedilink
minus-squarenondescripthandle@lemmy.dbzer0.comlinkfedilinkarrow-up5arrow-down10·edit-21 year agoInput sanitation has been a thing for as long as SQL injection attacks have been. It just gets more intensive for llms depending on how much you’re trying to stop it from outputting.
minus-squareInAbsentia@lemmy.worldlinkfedilinkarrow-up8·1 year agoI won’t reiterate the other reply but add onto that sanitizing the input removes the thing they’re aiming for, a human like response.
Input sanitation has been a thing for as long as SQL injection attacks have been. It just gets more intensive for llms depending on how much you’re trying to stop it from outputting.
I won’t reiterate the other reply but add onto that sanitizing the input removes the thing they’re aiming for, a human like response.