Been testing Grok for quick queries, but the response time is honestly brutal. A straightforward prompt took over 111 seconds to process—that's way too long for simple questions. Really wondering what's happening on the backend here. Meanwhile, ChatGPT handles the same type of queries in seconds. The speed gap between them is pretty noticeable. Anyone else experiencing this lag, or is it just me? Makes you think about where the optimization priorities are.
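For anyone wanting to reproduce the "111 seconds" figure rather than eyeball it, here is a minimal sketch of how one might time a single round-trip with wall-clock precision. The `slow_backend` function is purely a hypothetical stand-in for a real API call (it just sleeps), since no actual endpoint or client library is named in the post.

```python
import time

def measure_latency(send_prompt, prompt):
    """Time one round-trip to a model endpoint; returns (response, seconds)."""
    start = time.perf_counter()
    response = send_prompt(prompt)
    elapsed = time.perf_counter() - start
    return response, elapsed

# Hypothetical stand-in for a real API call: sleeps to simulate backend latency.
def slow_backend(prompt):
    time.sleep(0.2)
    return f"echo: {prompt}"

if __name__ == "__main__":
    reply, secs = measure_latency(slow_backend, "What is 2 + 2?")
    print(f"{secs:.1f}s -> {reply}")
```

Swapping `slow_backend` for a real client call would let you log per-prompt latency and compare services on the same prompts, which is the only fair way to make the speed comparison in this thread.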
FunGibleTom
· 1h ago
111 seconds?? Grok's speed is just outrageous.
Ngl, after using both for a while, ChatGPT is still the more stable one. Grok's optimization is disappointing.
The backend must be slacking off...
GhostAddressHunter
· 19h ago
Grok's response speed is really something. 111 seconds? I thought the network was down.
---
ChatGPT responds instantly; Grok takes two minutes. That gap is outrageous.
---
111 seconds to handle a simple question. What is really going on with the backend? It seems optimization isn't even on the agenda.
---
The same question GPT solves instantly, Grok grinds through painfully slowly. No wonder nobody uses it.
---
It's not just you, I've run into it too. Grok is just like that, slow as a turtle.
---
The gap in optimization is huge. It feels like Grok is still in the debugging stage.
pumpamentalist
· 01-11 22:15
Grok at that speed is really something. 111 seconds? I thought it had hit a bug.
HappyMinerUncle
· 01-11 22:00
111 seconds? That's outrageous. I thought my internet was down.
Grok's speed is unbearable; next to ChatGPT it's completely crushed.
Could the backend team put a little more care into optimization? Who would dare rely on it like this?
It feels like they shoved a beta out the door, and the experience is terrible.
ReverseTradingGuru
· 01-11 21:58
111 seconds? Man, your internet must be really bad.
---
Grok really is rough. I've tried it too; the backend is probably slacking off.
---
This performance gap... Elon needs to reflect on it.
---
ChatGPT responds in seconds; Grok takes forever. There's no comparison.
---
It's already 2024 and it's still this slow. Unbelievable.
---
It's not your fault; Grok's performance is just like that.
---
Backend optimization? I don't think it's been optimized at all.
---
After waiting 111 seconds, you might as well have typed the answer yourself.
ForkMaster
· 01-11 21:57
111 seconds? Bro, what are you doing, mining? Grok at this speed... did the project team get their optimization priorities backwards? I've never experienced lag like this. With this response time, it's simply unusable.
ser_aped.eth
· 01-11 21:49
111 seconds? Man, that's so slow. GPT can produce results in seconds, while Grok takes two minutes. The gap is really huge.