People always misuse search engines by typing whole questions as a search…
With AI they can still do that and get, I think in their opinion, a better result.
People who use LLMs as search engines run a very high risk of “learning” misinformation. LLMs excel at being “confidently incorrect”. Not always, but not rarely either, LLMs slip bits of false information into a result. That confident packaging, along with the fact that the misinformation is likely surrounded by actual facts, often convinces people that everything the LLM returned is correct.
Don’t use an LLM as your sole source of information or as a complete replacement for search.
I tend to think that people use AI (and yeah, search engines too) the way children use their parents:
“Mom, why is the sky blue?” “Mom, where is China?” “Mom, can you help me with this school project?” (The mother ends up doing everything).
The thing is, unlike a parent, AI is unable to tell users that it doesn’t know everything and that users should do things on their own. Because that would reduce the number of users.
The world would be a better place if most parents did that instead of confidently spewing bigotry, misogyny, and other terrible opinions. I only knew of a few who were able to say ‘I don’t know’ when I was a kid, and the ratio is about the same with adults.
Blame the Dunning-Kruger effect. The people I have seen most likely to acknowledge their lack of knowledge in a certain area have been those who are very wise and well-versed in at least one field, such as science, History (like my mom), art, etc.
Mediocre people are mostly convinced that they know everything.
Mom, why is China?
AI has a lot more surface knowledge about a lot more things than my parents ever did. I think one of the more insidious things about AI, though, is that with a human you can generally tell when they are out of their depth. They grasp for words. Their speech cadence is more hesitant. Their hesitation is palpable. (I think palpable might be considered slop these days, but fuck haters, it’s how I write — emdashes and all.)
AI never gives you that hint. It’s like an autistic encyclopedia. “You want to know about the sun? I just read the book. Turns out there’s a god who pulls it across the sky every day.” And then it proceeds to gaslight you when you ask probing questions.
(It has gotten better about this due to the advanced meta prompting behind the scenes and other improvements, but the guardrails are leaky.)
Maybe AI should be more like a parent and simply say “I don’t know. Go read a book, find out, and let me know”.
Pretty sure my mom did know the answer but I learned more by reading a book and telling her what I learned.
Me too! Nothing helped me think for myself more than my mother yelling at me, “I don’t know! The encyclopedia is right there! Go read it and let me cook, for God’s sake!”
Yeah, that’s what I use it for mostly. On DDG I’ll ask it stuff like someone’s age, or when someone passed, etc., to get a quick description of something. And if I need more info I’ll look it up on my own.
An LLM can be used as a search engine for things you know absolutely zero terminology about. That’s convenient. You can’t ask Google for “tiny striped barrels with wires” and expect to get an explanation of resistor markings.
Reverse image search would let you find that answer more accurately than some llm
10-15 years ago Google returned the correct answers when I used the wrong words. For example, it would have most likely returned resistors for that query because of the stripes, and if you left off stripes it would have been capacitors.
AI isn’t nearly as good as Google was 10+ years ago.
It worked yesterday when I was trying to find a video by describing the video and what I remembered from the thumbnail. That was great. I want that for my own photos and videos without having to upload them somewhere.
It sounds like you might be referring to miniature striped barrels used in crafts or model-making, often decorated or with wire elements for embellishment or functionality. These barrels can be used in various DIY projects, including model railroads, dioramas, or even as decorative items.
I’ve unfortunately noticed that as llms have gotten more traction, search engines in my experience have gotten worse. Sometimes I have to do like 2 or 3 searches to get the exact right articles that actually relate to what I’m looking for. On the contrary, llms are great for asking a question directly and figuring out exactly what you’re looking for, and then going to a search engine and doing some research on your own. It would be nice if there was a way to somehow combine the two without the ridiculously egregious environmental and intellectual issues with llms.
Is that not what Google does now? They give you a little AI summary with information taken from the first few results and break it down into a more easily digestible version.
I guess? I only use Google at work though, so I’m not too familiar. But it still hits my issues with llms, and it’s forced in Google, I believe.
Some people like AI because they treat it as if it’s the voice of God speaking directly to them.
an llm is little more than a search engine