• pulsewidth@lemmy.world
    6 hours ago

    Correct spelling is the fundamental component of words, and without words there is no vocabulary. Since LLMs don’t understand words, they have no real understanding of vocabulary. They can certainly spew out things they’ve tokenized and weighted from ingested inputs, though, like when people trick one into accepting false definitions simply by repeating them as correct, thereby manipulating (or poisoning) the weights. ChatGPT and other LLMs regularly fail to interpret common parts of vocabulary, e.g. idioms, word spellings, and action-reaction consequences within a sentence. They’re fancy autocomplete, filled with stolen (and occasionally licensed) data.
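    To make the tokenization point concrete, here’s a toy sketch (not any real model’s tokenizer, just a greedy longest-match subword scheme with a made-up vocabulary) showing why a model that only ever sees token IDs has no direct view of the individual letters in a word:

    ```python
    # Hypothetical subword vocabulary for illustration only
    VOCAB = ["straw", "berry", "str", "aw", "ber", "ry"]

    def tokenize(word):
        """Greedy longest-match subword split, the rough idea behind BPE-style tokenizers."""
        tokens = []
        i = 0
        while i < len(word):
            for piece in sorted(VOCAB, key=len, reverse=True):
                if word.startswith(piece, i):
                    tokens.append(piece)
                    i += len(piece)
                    break
            else:
                tokens.append(word[i])  # fall back to a single character
                i += 1
        return tokens

    print(tokenize("strawberry"))  # ['straw', 'berry']
    ```

    The model receives two opaque chunks, not ten letters, which is one plausible reason spelling and letter-counting questions trip these systems up.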

    Sure seems like the problem isn’t that me or the other guy ‘don’t know how to use LLMs’, but rather that they keep getting sold as something they’re not.

    Congrats though, you just used a 100-billion-dollar machine array to more or less output the exact content of a Wikipedia article. You really proved your point that it’s very good when you know what to ask it, and us plebs are just dumb at questions, or something 👍 https://en.wikipedia.org/wiki/Platitude