I plugged my local AI into offline Wikipedia, expecting a source of truth to make it way, way better.
It's better, but now I can't tell when it's making up citations, because it uses Wikipedia to support its own worldview from pre-training instead of reality.
So it’s not really much better.
Hallucinations become a bigger problem the more info they have (info that you now have to double-check).
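For context, "plugged into offline Wikipedia" here means roughly this kind of retrieve-then-prompt loop (a minimal sketch; `search_offline_wiki`, the model name, and the Ollama-style local endpoint are stand-ins for whatever you actually run locally):

```python
import requests


def search_offline_wiki(query: str, k: int = 3) -> list[str]:
    """Hypothetical helper: return the top-k matching passages from a local
    Wikipedia dump (a ZIM file, a prebuilt full-text index, etc.).
    Wire this to whatever offline index you actually have."""
    raise NotImplementedError


def answer_from_wiki(question: str) -> str:
    passages = search_offline_wiki(question)
    context = "\n\n".join(passages)
    prompt = (
        "Answer the question using ONLY the passages below. "
        "If the passages don't contain the answer, say you don't know.\n\n"
        f"Passages:\n{context}\n\nQuestion: {question}\nAnswer:"
    )
    # Assumes an Ollama-style local server; swap in whatever API your local model exposes.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3", "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]
```

Even with the context stuffed in like this, nothing stops the model from blending retrieved text with what it already "believes," which is the failure mode described above.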
At my work, we don't allow it to make citations. We instruct it to insert citation placeholders instead, which lets us hunt down the info, make sure it's good, and then add the citations ourselves.
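Concretely, the instruction plus a quick check on the draft look something like this (a rough sketch, not our exact setup; the placeholder format and the patterns are just illustrative):

```python
import re

# The instruction given to the model instead of letting it cite anything itself.
PLACEHOLDER_INSTRUCTION = (
    "Do not cite sources. For any claim that needs support, insert a placeholder "
    "of the form [CITATION NEEDED: one-line summary of the claim] and keep going. "
    "A human will find and verify the real source later."
)

# Things that look like the model produced a 'real' citation anyway:
# URLs, (Author et al., 2020)-style references, or bare [12]-style markers.
SUSPECT_CITATION = re.compile(r"https?://\S+|\(\w+ et al\.,? \d{4}\)|\[\d+\]")


def flag_fabricated_citations(draft: str) -> list[str]:
    """Return the lines of a draft where the model emitted something
    citation-shaped instead of the required placeholder."""
    return [line for line in draft.splitlines() if SUSPECT_CITATION.search(line)]
```

The point is just to make fabricated references easy to spot before a human goes looking for the real source.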
That's still looking for sources that fit a predetermined conclusion, not real research.
Yup.
In some instances that's sufficient, though, depending on how much precision your work needs. Either way, you have to review whatever it produces.
That probably makes sense.
I haven't played around with it since the initial shell shock of "oh god, it's worse now."