Add Kimi K2.5 #185914
Replies: 2 comments
If you have enough RAM (I would suggest around 16 GB), you could run Kimi K2 natively on your computer using Ollama. GitHub Copilot Chat allows for Ollama integration and the use of local models. To do this, you would first need to download and run the model in Ollama, and then go to the model selection screen in GitHub Copilot within something like VS Code. Click the "New Model" drop-down and select "Ollama," which will let you connect Kimi K2 to it. I hope that clarifies things! (There is a quick sanity check sketched below.)
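Before wiring this into Copilot, it can help to confirm that the local Ollama server is running and that the model has actually been pulled. Here is a minimal sketch that queries Ollama's model-listing endpoint; it assumes the default local port (11434), and the `kimi-k2` tag is only a placeholder, so substitute whatever tag the Ollama library actually publishes for the model.

```python
# Hedged sketch: ask a local Ollama server which models it has pulled,
# so you know the model will show up in Copilot's Ollama model list.
# Assumptions: Ollama is listening on its default port (11434);
# the "kimi-k2" tag is hypothetical -- replace it with the real tag.
import json
import urllib.request

OLLAMA_TAGS_URL = "http://localhost:11434/api/tags"
WANTED_MODEL = "kimi-k2"  # hypothetical tag; check the Ollama library

def list_local_models(url: str = OLLAMA_TAGS_URL) -> list[str]:
    """Return the names of models currently available in the local Ollama install."""
    with urllib.request.urlopen(url) as resp:
        payload = json.load(resp)
    return [m["name"] for m in payload.get("models", [])]

if __name__ == "__main__":
    models = list_local_models()
    print("Locally available models:", models)
    if any(name.startswith(WANTED_MODEL) for name in models):
        print(f"{WANTED_MODEL} is pulled; pick it from Copilot's Ollama model selector.")
    else:
        print(f"{WANTED_MODEL} not found; pull it with Ollama first.")
```

If the tag shows up in that output, it should also appear when you select "Ollama" in Copilot's model picker.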
Yes, that's true, but I think that new, inexpensive, and efficient models could do some good, especially since not everyone has what it takes to run Kimi K2.5 locally.
Select Topic Area: General
Copilot Feature Area: General
Body
Kimi K2.5 has very good results for a cheap model; it could even be a 0x or 0.33x model. Could you add it to the available models, please?