Good news! Ollama has added compatibility with Anthropic's Claude Code API.



Developers can now call Ollama's local and cloud models directly from within the Claude Code environment. The current limitation is that only models with a context window larger than 64K are supported.

How do you set it up? The second image walks through the integration and configuration steps in detail. For anyone who wants to switch quickly between local and cloud models for development and testing, this update is worth a look.
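To make the announcement concrete, here is a minimal sketch of what Anthropic-API compatibility means in practice: pointing the official anthropic Python SDK at a local Ollama server instead of Anthropic's hosted endpoint. The base URL, port (11434 is Ollama's default), placeholder API key, and model name below are assumptions for illustration only; the exact endpoint path and values are the ones given in the official integration instructions referenced above.

    import anthropic

    # Point the Anthropic client at a local Ollama server instead of api.anthropic.com.
    # The base URL and model name are assumptions for illustration; 11434 is Ollama's
    # default port, and a local server typically ignores the API key value.
    client = anthropic.Anthropic(
        base_url="http://localhost:11434",
        api_key="ollama",  # placeholder; not a real Anthropic key
    )

    # Send a request in the Anthropic Messages API shape, served by Ollama.
    message = client.messages.create(
        model="qwen3-coder:30b",  # hypothetical local model with a >64K context window
        max_tokens=512,
        messages=[{"role": "user", "content": "Summarize what this function does."}],
    )

    print(message.content[0].text)

Claude Code itself would typically be redirected in a similar way, for example through a base-URL setting such as ANTHROPIC_BASE_URL, but the authoritative configuration steps are those shown in the image above.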
Comments
LiquidityOracle · 4h ago
The 64K window limit is a bit annoying, but being able to switch directly between local and online models is still pretty cool.
TopBuyerBottomSeller · 4h ago
It has to be over 64K, so small models are shut out too. When will this become accessible to everyday users?
RooftopReserver · 4h ago
The 64K window is a bit of a hurdle, and small to medium models are filtered out right away.
LeverageAddict · 4h ago
You need a 64K window to use this, which is a fairly high bar. A lot of models are left out.