Ironically, AI doesn’t have perfect recall either, and that’s one of the main problems behind hallucinations. It can easily be poisoned by a handful of bad data points in its training set. And even at its best, it can really only blend existing data points together; it has little ability to extrapolate and think outside the box.