Would you like a spicy spaghetti dish? Just use some gasoline.
It’s almost like LLMs aren’t the solution to literally everything, no matter how hard companies keep trying to tell us they are. Weird.
I honestly can’t wait for this to blow up in a company’s face in a very catastrophic way.
Already has. Air Canada was held liable after its AI chatbot gave a passenger wrong information about bereavement fares, which he relied on when buying his ticket. They tried to claim they weren’t responsible for what the chatbot said, but the tribunal found otherwise, and they had to pay damages.
That’s not catastrophic yet. It only cost them money that would otherwise have been margin on a low-priced ticket.
It’s not going to be long before this seriously injures or kills someone.
Yeah. My mother is getting phishing emails and genuinely believes that Nancy Pelosi is emailing her to ask for monetary support. We’re not even American. Like, not even on the same continent.
Not everyone is as critical as they ought to be when reading stuff on the internet. It doesn’t help that LLMs have a tendency to state things confidently or matter-of-factly.
People not familiar with the tech will read it and take it at face value, ignoring the “this is AI generated and might be wrong” disclaimer, because to some people that wording sounds so technical that their brain doesn’t even process it.
This is such a disinfo nightmare. Imagine if a model were trained (prompting would be easier, actually) to spread mostly high-quality information with strategically planted lies, maximizing harmful, confident incorrectness.