Often, when a model acts stupid, it's likely not a capability problem but a safety one.

Otherwise it wouldn't be the case that the moment you mention something in passing, the model says it already knew to use it. Why didn't it say so earlier? When you give it no hints, it plays clueless, waiting for you to guide it and activate its knowledge.
But if it really did speak up early, even when correct, most people would probably find it irritating: the difference between leading and being led matters a great deal.