<https://davidgerard.co.uk/blockchain/2024/04/11/pivot-to-ai-hallucinations-worsen-as-the-money-runs-out/>
"A vision came to us in a dream — and certainly not from any nameable person —
on the current state of the venture capital-fueled AI and machine learning
industry. We asked around and several others who work in the field concurred
with this assessment.
Generative AI is famous for “hallucinating” made-up answers with wrong facts.
These are crippling to the credibility of AI-driven products.
The bad news is that the hallucinations are not decreasing. In fact, the
hallucinations are getting worse.
Large language models work by generating output based on what tokens
statistically follow from other tokens. They are extremely capable
autocompletes.
All output from an LLM is a “hallucination” — generated from the latent space
between the training data. LLMs are machines for generating convincing-sounding
nonsense — “facts” are not a type of data in LLMs.
But if your input contains mostly facts, then the output has a better chance of
not being just nonsense.
Unfortunately, the venture-capital-funded AI industry runs on the promise of
replacing humans with a very large shell script — including in areas where
details matter. If the AI’s output is just plausible nonsense, that’s a
problem. So the hallucination issue is causing a slight panic among AI company
leadership."
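The "extremely capable autocomplete" mechanism the quoted article describes can be sketched with a toy bigram model — an illustrative simplification of my own, not how real LLMs work (they use learned neural representations over vast corpora, not raw counts), but it shows the same core idea: output is whatever statistically follows the preceding tokens.

```python
# Toy autocomplete: learn which token follows which, then generate by
# repeatedly emitting the most frequent successor. Purely illustrative.
from collections import Counter, defaultdict


def train_bigrams(corpus: str) -> dict:
    """Count, for each token, how often each next token follows it."""
    tokens = corpus.split()
    successors = defaultdict(Counter)
    for cur, nxt in zip(tokens, tokens[1:]):
        successors[cur][nxt] += 1
    return successors


def generate(successors: dict, start: str, length: int) -> list:
    """Greedily emit the statistically most likely continuation."""
    out = [start]
    for _ in range(length):
        nexts = successors.get(out[-1])
        if not nexts:
            break  # no known successor: the model has nothing to say
        out.append(nexts.most_common(1)[0][0])
    return out


model = train_bigrams("the cat sat on the mat and the cat ran")
print(" ".join(generate(model, "the", 3)))
```

Note that nothing here checks whether the continuation is *true* — the model only knows which tokens tend to follow which, which is the article's point about "facts" not being a type of data in these systems.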
Cheers,
*** Xanni ***
--
mailto:xanni@xanadu.net Andrew Pam
http://xanadu.com.au/ Chief Scientist, Xanadu
https://glasswings.com.au/ Partner, Glass Wings
https://sericyb.com.au/ Manager, Serious Cybernetics