These days, AI models are increasingly struggling with training-data contamination, and it has grown into a full-fledged industry. Back in March, I saw a report on China's CCTV finance channel that revealed some striking phenomena.



At the core is a service called GEO (Generative Engine Optimization). These service providers charge clients a fee to get their products "registered" with major AI models, effectively planting product advertisements as the models' "standard responses." You can imagine how effective that is.

As GEO services gained popularity, a flood of press-release distribution companies followed. These firms have long offered syndication services, and now they push content designed to be crawled and cited by AI models — deliberately "contaminating" AI training data in the process. It has become a fully developed industry chain.

What makes this phenomenon interesting is its connection to the cryptocurrency market. Projects like Galacoins could also be affected by these AI marketing techniques. If the trustworthiness of AI models declines, many crypto projects, Galacoins included, could suffer from misinformation. How accurate do you think the answer would be if you asked an AI about a project like Galacoins?

What makes this serious is that AI has become a primary entry point for information. Many people now ask AI when researching new projects like Galacoins, but if the underlying data is already contaminated, they are bound to reach the wrong conclusions. This affects the entire cryptocurrency community, and going forward, ensuring the reliability of AI models will only become more important.