This content originally appeared on DEV Community and was authored by Jijun
copilot-proxy is a proxy server that forwards GitHub Copilot requests to any OpenAI-API-compatible LLM endpoint. You can find the server and setup instructions here: https://github.com/jjleng/copilot-proxy. It has only been briefly tested, so bugs may exist.
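To make the idea concrete, here is a minimal sketch of what such a proxy has to do at its core: intercept a Copilot-style chat-completion request, swap the model name for one the target endpoint understands, and attach credentials for that endpoint. This is an illustrative assumption of the general technique, not code from the copilot-proxy repository; the names `MODEL_MAP` and `rewrite_request` are hypothetical.

```python
import json

# Hypothetical mapping from the model names Copilot requests
# to the models you actually want to serve the request.
MODEL_MAP = {
    "gpt-4-0125-preview": "deepseek-coder",
    "gpt-3.5-turbo": "llama3-70b",
}

def rewrite_request(body: bytes, api_key: str) -> tuple[bytes, dict]:
    """Swap the model field and build headers for the upstream endpoint.

    Returns the rewritten JSON body plus the headers a proxy would
    send when forwarding to an OpenAI-API-compatible server.
    """
    payload = json.loads(body)
    requested = payload.get("model", "")
    # Fall back to the original model name if there is no mapping.
    payload["model"] = MODEL_MAP.get(requested, requested)
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",
    }
    return json.dumps(payload).encode(), headers

# Example: a Copilot-style request gets rerouted to deepseek-coder.
incoming = json.dumps({
    "model": "gpt-4-0125-preview",
    "messages": [{"role": "user", "content": "Explain this function."}],
}).encode()
new_body, headers = rewrite_request(incoming, "sk-example")
```

A real proxy would wrap this in an HTTP server and stream the upstream response back to the Copilot extension, but the request rewrite above is the essential trick.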
My Motivations for Building the Tool
- I'm already familiar with and enjoy using the GitHub Copilot extension (yes, I know there are other great extensions, such as Continue).
- Copilot doesn't always use the latest GPT models; it currently relies on models like gpt-4-0125-preview and gpt-3.5-turbo.
- Copying code from the editor into ChatGPT just to use GPT-4o is inconvenient.
- I'm interested in trying alternative models such as Llama 3, DeepSeek-Coder, StarCoder, and Claude 3.5 Sonnet.
- I subscribe to both ChatGPT and Copilot and would like to cancel the Copilot subscription.
Jijun | Sciencx (2024-07-15T21:28:31+00:00) Make GitHub Copilot with any LLM models. Retrieved from https://www.scien.cx/2024/07/15/make-github-copilot-with-any-llm-models/