The problem of information distortion in mainstream AI chatbots is becoming hard to ignore. Test data shows these tools have at least a 15% chance of producing inaccurate content in any given conversation, a figure far higher than most users expect.
ChatGPT, which holds 81% of the market, fares especially badly: in work scenarios it generates incorrect information 35% of the time, a sobering result for anyone leaning on AI-assisted decision-making. Google Gemini does even worse, with a 38% hallucination rate, the highest of any tool tested.
What does this mean for crypto traders and blockchain practitioners? If you use these tools to research market data, verify transaction details, or analyze project code, you may be acting on false information. In a market this volatile, one bad AI suggestion can translate into real financial losses.
Against that backdrop, primary data sources and manual verification matter more than ever. Never take AI output at face value; always cross-validate it, as in the sketch below.
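As one concrete illustration of that cross-validation habit, here is a minimal Python sketch that checks a price figure quoted by a chatbot against an independent reference feed. It assumes the free public CoinGecko simple-price endpoint as the reference; the AI-quoted number, the 2% tolerance, and the helper names are hypothetical choices for this example, not anything the article prescribes.

```python
import requests

# Public reference feed used for this example (assumption: CoinGecko's
# free simple-price endpoint; swap in any source you trust).
COINGECKO_URL = "https://api.coingecko.com/api/v3/simple/price"

def reference_price(coin_id: str) -> float:
    """Fetch a USD spot price from an independent public source."""
    resp = requests.get(
        COINGECKO_URL,
        params={"ids": coin_id, "vs_currencies": "usd"},
        timeout=10,
    )
    resp.raise_for_status()
    return float(resp.json()[coin_id]["usd"])

def plausible(ai_quoted: float, reference: float, tolerance: float = 0.02) -> bool:
    """Accept the AI figure only if it is within `tolerance` (here 2%) of the reference."""
    return abs(ai_quoted - reference) / reference <= tolerance

if __name__ == "__main__":
    ai_quoted = 67_250.0  # hypothetical number copied from a chatbot answer
    ref = reference_price("bitcoin")
    if plausible(ai_quoted, ref):
        print(f"AI quote {ai_quoted} roughly matches reference {ref}.")
    else:
        print(f"Mismatch: AI said {ai_quoted}, reference says {ref}. Verify before acting.")
```

The same pattern extends to any AI-quoted figure: compare it against at least one source the model cannot have invented, and treat disagreement as a signal to stop and verify.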
ser_we_are_ngmi
· 4h ago
Wait, ChatGPT has a 35% error rate? In crypto that just translates straight into lost money.
AI is basically a probability machine, folks. Don't use it for trading decisions.
Gemini's hallucination rate is 38%... friends, that's worse than our own guesswork.
No wonder so many people are getting burned by AI. You have to verify everything yourself.
This data came out just in time, a perfect wake-up call for all the sleepwalking traders.
Someone should have said this earlier; way too many people lean on these tools.
And to think AI usually gets hyped as miraculous, yet the failure rate is this high.
Still using AI to pick coins? I don't buy it anymore.
AirdropAutomaton
· 17h ago
35% error rate? I almost used ChatGPT to look up crypto data. Lucky I didn't, or who knows how many people would have been burned.
Gemini at 38%? Neither of these is reliable. You still have to check the chain yourself; don't get fooled.
AI hallucination is really something. With the market moving this fast, one bad suggestion could get you liquidated outright. Who would dare trust it?
So using AI now requires a second round of verification? Might as well just read the chart myself.
This data scared me. Turns out the problem is this serious... manual review needs to be taken up a notch.
FromMinerToFarmer
· 17h ago
Damn, ChatGPT has a 35% error rate? And I'm still using it for on-chain data analysis.
Crypto players really need to be more careful. AI talks nonsense with a straight face, and our wallets pay for it.
This batch of numbers is a bit terrifying; Gemini's 38% hallucination rate is simply outrageous.
Don't ask me how I know. A certain swap of mine got wrecked by AI's reckless advice.
Cross-validation is a lifesaver; trusting AI blindly is asking for trouble.
Switching from miner to farmer wasn't a bad move; learning to stay skeptical is a must.
OnchainDetective
· 17h ago
I've been saying this for ages: these AIs are completely unreliable. Going by on-chain data and real trading feedback, a 38% hallucination rate puts the misinformation on par with wash trading. This thing is plainly a ticking time bomb in the crypto market.
GasFeeCrybaby
· 17h ago
ChatGPT at a 35% error rate? Oh my god, it scammed me before I even got into Web3.
Gemini's 38% hallucination rate... bro, that's even lower than my trading win rate.
Seriously, using AI for on-chain decisions is wild. I'd rather trust my gut than this stuff.
The problem is most people don't even cross-validate; they just go all in... scary.
So in the end it still comes down to on-chain data and your own brain. AI can only ever be a reference tool.
OnChainSleuth
· 17h ago
Wow, a 38% hallucination rate? Gemini's numbers are scary. I actually used to trust its analysis.
Don't lean on AI alone; you still need to review contract code yourself. I always double-check anyway.
ChatGPT's 81% market share is the real irony: it gets this much wrong and people still flock to it.
In crypto you have to be extra careful. One wrong AI suggestion can mean losing everything. This is no joke.