If an AI has passed the Turing test, meaning people cannot tell the difference between a human being and a computer by talking to it, then how do you know that the AI is not conscious? Look at it the other way: a human being failed the Turing test. So human beings are just a load of neurons firing; it's all just a trick.
"GPT-4.5 could fool people into thinking it was another human 73% of the time." ... "And 4.5 was even judged to be human significantly *more* often than actual humans!"
Interestingly, the use of *asterisks* has slipped into an HTML article
where there is no need for them.
https://www.livescience.com/technology/artificial-intelligence/open-ai-gpt-4-5-is-the-first-ai-model-to-pass-an-authentic-turing-test-scientists-say
| Sysop: | DaiTengu |
|---|---|
| Location: | Appleton, WI |
| Users: | 1,075 |
| Nodes: | 10 (0 / 10) |
| Uptime: | 90:34:00 |
| Calls: | 13,798 |
| Calls today: | 1 |
| Files: | 186,989 |
| D/L today: | 5,324 files (1,535M bytes) |
| Messages: | 2,438,211 |