Good news! Ollama now works with Anthropic's Claude Code API.
Developers can call Ollama's local and cloud models directly from within the Claude Code environment. The current limitation: only models with a context window larger than 64K are supported.
How do you set it up? The second image walks through the integration instructions and configuration steps. If you want to switch models quickly for local or cloud development and testing, this update is worth a look.
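The setup images aren't reproduced here, but in practice an Anthropic-compatible API means you can point an Anthropic-style client at Ollama's local server. The sketch below is a minimal, unverified example of that idea: it assumes Ollama serves its Anthropic-compatible endpoint on the default port 11434, and the model name and API key are placeholders, not values from the post. (Claude Code itself documents an ANTHROPIC_BASE_URL environment variable for redirecting the client to a different endpoint, which is presumably the mechanism the integration steps rely on.)

```python
# Minimal sketch: calling an Ollama-served model through an
# Anthropic-compatible API, using the official anthropic Python SDK.
# Assumptions (not confirmed by the post):
#   - Ollama is running locally on its default port 11434
#   - the base_url below is where its Anthropic-compatible API lives
#   - "qwen3-coder" is a placeholder for any pulled model with a >64K context window
import anthropic

client = anthropic.Anthropic(
    base_url="http://localhost:11434",  # assumed local Ollama endpoint
    api_key="ollama",                   # placeholder; a local server typically ignores it
)

response = client.messages.create(
    model="qwen3-coder",  # placeholder model name
    max_tokens=1024,
    messages=[{"role": "user", "content": "Summarize this repo's build steps."}],
)

# Messages API responses carry a list of content blocks; print the first text block.
print(response.content[0].text)
```

If this holds up, switching between a local model and one of Ollama's cloud models should come down to changing the model string, which is exactly the quick-switch workflow the post describes.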
LiquidityOracle
· 4h ago
The 64K window limit is a bit annoying, but being able to switch directly between local and online models is still pretty cool.
TopBuyerBottomSeller
· 4h ago
It has to be above 64K, so small models are shut out too. When will this be usable by everyone?
RooftopReserver
· 4h ago
The 64K window is a bit of a hurdle; small and medium models are simply filtered out.
LeverageAddict
· 4h ago
You need a 64K window to use this, which is a fairly high bar. A lot of models are locked out.