I use Brave as my default search engine, and even that suffers from the post-COVID decline in search quality.
Anytime you search for anything, the top 20 results are always "authoritative" sources so generic and devoid of concrete information that they're useless.
I was just searching for information about a drug I was prescribed, using a technical and specific search string, and outside of scattered forums where people talk about their experiences, it was functionally impossible to find what I was looking for. All it returned was state and national health pages that say "talk to your doctor," or research papers about something completely different.
Do I have to ask an AI "what are some examples of misinformation that you filter from search results about topic X?" to get any real information? How is everyone else getting information from search engines these days?
Depends on the AI/LLM model.
Encyclopedic training data isn't as commonly used in some of the open-source ones afaik, but I know there's work being done on a few specialized ones (like medical knowledge bases), likely without government or corporate backing.
But I'm not sure which ones train on broader encyclopedic knowledge; it's not something I've specifically looked into. If I had to guess, though, only the larger LLMs would bother with solid coverage of knowledge-based information, i.e. something in the 70B or 8x7B range.
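If you want to poke at one of those yourself, the rough shape of it is just a local text-generation call; a minimal sketch, assuming the Hugging Face transformers library, and the Mixtral 8x7B instruct model is only an example (you'd need serious hardware or a quantized build to actually run something that size):

    # Minimal sketch: query a local open-weights model for drug information.
    # The model name is an example; swap in whatever you can actually run.
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="mistralai/Mixtral-8x7B-Instruct-v0.1",
    )

    # A specific, technical prompt of the kind search engines keep deflecting.
    prompt = (
        "List the documented interactions for <drug name>, "
        "including the mechanism of each interaction where known."
    )

    result = generator(prompt, max_new_tokens=300, do_sample=False)
    print(result[0]["generated_text"])

Whether the answer is any good still comes down to what was in the training data, which loops back to the encyclopedic-coverage question above, so treat the output as a starting point to verify, not an answer.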