Some of the more advanced LLMs are getting pretty clever. They’re on the level of a temp who talks too much, misses nuance, and takes too much initiative. Also, any time you need them to perform too complex a task, they start forgetting details, and then entire instructions, you already gave them.
I use the “very articulate parrot” analogy.
I use something similar. “Child with enormous vocabulary.”
It can recognize correlations, and it understands the words themselves, but it doesn’t really understand how those connections or words work.