From olcott <polcott333@gmail.com> to comp.theory,comp.lang.c++,comp.lang.c,comp.ai.philosophy on Wed Oct 1 08:40:46 2025
From Newsgroup: comp.ai.philosophy
The way that we can make LLM systems trustworthy,
and thus get rid of AI hallucination, is to
require them to cite their external sources.
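
Below is a minimal sketch, in Python, of what "require them to
cite their external sources" could look like in practice. It
assumes a toy retrieval-augmented setup; the Source record, the
CORPUS list, retrieve(), and answer_with_citations() are all
hypothetical names invented for this illustration, not anyone's
actual implementation. The point it demonstrates is that an
answer is only emitted when it can be grounded in and attributed
to an external source; otherwise the system declines rather than
hallucinating.

from dataclasses import dataclass

@dataclass
class Source:
    ident: str   # e.g. a URL or document id
    text: str

# Hypothetical external corpus the system is required to cite from.
CORPUS = [
    Source("doc:turing-1936",
           "Turing proved the halting problem undecidable in 1936."),
    Source("doc:schopenhauer",
           "Talent hits a target no one else can hit; "
           "genius hits a target no one else can see."),
]

def retrieve(question: str, corpus: list) -> list:
    """Toy retrieval: keep sources sharing at least one word with the question."""
    words = set(question.lower().split())
    return [s for s in corpus if words & set(s.text.lower().split())]

def answer_with_citations(question: str) -> str:
    """Answer only when the answer can cite a retrieved external source."""
    sources = retrieve(question, CORPUS)
    if not sources:
        return "No external source found; declining to answer rather than hallucinate."
    cited = "; ".join(s.ident for s in sources)
    return sources[0].text + "  [sources: " + cited + "]"

if __name__ == "__main__":
    # Grounded question: answered with an explicit citation.
    print(answer_with_citations("Who proved the halting problem undecidable?"))
    # Ungrounded question: the system declines instead of hallucinating.
    print(answer_with_citations("airspeed of an unladen swallow"))

A real system would replace the word-overlap retrieve() with an
actual search index or retrieval model, but the contract stays the
same: no citation, no answer.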
--
Copyright 2025 Olcott "Talent hits a target no one else can hit; Genius
hits a target no one else can see." Arthur Schopenhauer
--- Synchronet 3.21a-Linux NewsLink 1.2