LLMs vs. Search Engines With Little or No Results

Jim Nielsen:

With a search engine, fewer quality results means something. But an LLM is going to spit back a response regardless of the quality of training data. When the data is thin, a search engine will give you nothing. An LLM will give you an inaccurate something.

Rather than saying “I don’t know, there’s not enough on this subject to formulate a working answer” — which is what you could infer from an empty search results page — an LLM will give you something that looks right. Then you have to go shoot yourself in the foot to learn it’s not right, because you didn’t know enough to know it was wrong.

LLMs just making stuff up is one of their main issues. I have to imagine a lot of work is going into improving this situation, and surely the end result will be models that can simply respond “I don’t know.” Until then, you have to take every answer-style response with a healthy amount of scepticism.