Discussion about this post

Alan:

More than Rube Goldberg machines, LLMs are "humans all the way down," meaning there are always humans in the loop somewhere.

Craig:

I don't even like using the word "hallucination." It seems wrong to anthropomorphize computer errors.

I also like saying "regurgitative" rather than "generative."

4 more comments...