ProxyAsLocalModel
Proxy a remote LLM API as a local model. Especially useful for using custom LLMs in JetBrains AI Assistant.
Powered by Ktor and kotlinx.serialization, thanks to their reflection-free design.
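To illustrate the core idea, here is a minimal sketch (not this project's actual code) of a Ktor server that speaks Ollama's local API on Ollama's default port while forwarding work to a remote OpenAI-compatible endpoint. `REMOTE_URL`, `REMOTE_API_KEY`, and the advertised model name are placeholders, and a real proxy would also translate between the Ollama and OpenAI request/response schemas rather than forwarding bodies verbatim:

```kotlin
import io.ktor.client.*
import io.ktor.client.engine.cio.*
import io.ktor.client.request.*
import io.ktor.client.statement.*
import io.ktor.http.*
import io.ktor.server.engine.*
import io.ktor.server.netty.*
import io.ktor.server.request.*
import io.ktor.server.response.*
import io.ktor.server.routing.*

// Placeholder upstream endpoint and credentials.
val REMOTE_URL = "https://api.example.com/v1/chat/completions"
val REMOTE_KEY = System.getenv("REMOTE_API_KEY") ?: ""

fun main() {
    val client = HttpClient(CIO)
    // Ollama listens on 11434 by default, so clients that expect a
    // local Ollama instance (like AI Assistant) find this proxy instead.
    embeddedServer(Netty, port = 11434) {
        routing {
            // Ollama's model-listing endpoint; advertise one proxied "local" model.
            get("/api/tags") {
                call.respondText(
                    """{"models":[{"name":"remote-proxy:latest"}]}""",
                    ContentType.Application.Json
                )
            }
            // Ollama's chat endpoint: a real proxy translates the schemas;
            // this sketch simply forwards the request body upstream.
            post("/api/chat") {
                val body = call.receiveText()
                val upstream = client.post(REMOTE_URL) {
                    header(HttpHeaders.Authorization, "Bearer $REMOTE_KEY")
                    contentType(ContentType.Application.Json)
                    setBody(body)
                }
                call.respondText(upstream.bodyAsText(), ContentType.Application.Json)
            }
        }
    }.start(wait = true)
}
```

With a server like this running, anything that talks to a local Ollama instance at http://localhost:11434 is transparently served by the remote API instead.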
Story of this project
Currently, JetBrains AI Assistant provides a free plan with a very limited quota. I tried it out, and my quota ran out quickly.
I had already bought API tokens for other LLMs, such as Gemini and Qwen, so I started to think about using them in AI Assistant. Unfortunately, only local models served through Ollama and LM Studio are supported.