> If an AI has passed the Turing test, meaning people cannot tell the
> difference between a human being and a computer by talking to it, then
> how do you know that the AI is not conscious? Look at it the other way,
> a human being failed the Turing test. So human beings are just a load
> of neurons firing, it's just a trick.
>
> "GPT-4.5 could fool people into thinking it was another human 73% of
> the time. "..." And 4.5 was even judged to be human significantly
> *more* often than actual humans!"
Interestingly, *asterisks* have slipped into an HTML article where there
is no need for them.
https://www.livescience.com/technology/artificial-intelligence/open-ai-gpt-4-5-is-the-first-ai-model-to-pass-an-authentic-turing-test-scientists-say
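
For anyone wondering how a figure like that 73% gets tallied: in the
three-party setup the article describes, an interrogator chats with a
hidden human and a hidden AI at the same time and then picks which one is
the human. A minimal sketch of that bookkeeping follows; the Trial class,
win_rate function, and the 8-of-11 trial numbers are all invented here for
illustration and are not from the study itself.

from dataclasses import dataclass

@dataclass
class Trial:
    # True if the interrogator picked the AI as "the human",
    # i.e. the AI was judged human and the real human was not.
    ai_judged_human: bool

def win_rate(trials: list[Trial]) -> float:
    """Fraction of trials in which the AI was judged to be the human."""
    return sum(t.ai_judged_human for t in trials) / len(trials)

# Hypothetical run: 8 of 11 interrogators pick the AI (about 73%).
trials = [Trial(True)] * 8 + [Trial(False)] * 3
print(f"AI judged human in {win_rate(trials):.0%} of trials")

# Note: in a forced-choice three-party design like this sketch, the real
# human's score is just the complement, so any AI above 50% is, by
# construction, judged human more often than the human it was paired with.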