It’s not even a lack of critical thinking skills, necessarily. Companies don’t exactly highlight the fact that their AIs are prone to hallucinating. I’d be willing to bet actual money that a lot of users aren’t even aware that that’s a possibility.
Funny enough, if you question Gemini enough, it will eventually cave and admit that its answers aren’t always right and that it’s still improving. LOL
The problem is, as I suggested, you already need to know what the correct answer is in order to effectively cross-examine what it told you.