Replies: 3 comments
-
Firstly, thanks for the feedback! Secondly, thanks for the tip about the Claude model, I actually didn't know about it. However, from what little I've researched, the Anthropic API is not compatible with the OpenAI API. That means supporting Anthropic would require implementing compatibility with a second API in the extension and running various tests, since the extension was built with the OpenAI API in mind. This means hours of implementation work that I unfortunately don't have available at the moment. In the future I may think about integrating the Anthropic API, as well as other models such as Meta's Llama. However, if someone is willing to add support for other APIs, I won't oppose it.
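To illustrate why Anthropic support can't be handled by just changing a base URL, here is a minimal sketch of the two request shapes. The endpoint paths, auth headers, and the required `max_tokens` field differ; the model names and placeholder key are example values only.

```python
import json
import urllib.request

# OpenAI-style chat completion request: /chat/completions,
# Bearer-token auth, no required max_tokens field.
openai_req = urllib.request.Request(
    "https://api.openai.com/v1/chat/completions",
    data=json.dumps({
        "model": "gpt-4",
        "messages": [{"role": "user", "content": "Hello"}],
    }).encode("utf-8"),
    headers={
        "Authorization": "Bearer YOUR_KEY",
        "Content-Type": "application/json",
    },
)

# Anthropic Messages API request: /v1/messages, x-api-key auth,
# an anthropic-version header, and a mandatory max_tokens field.
anthropic_req = urllib.request.Request(
    "https://api.anthropic.com/v1/messages",
    data=json.dumps({
        "model": "claude-3-opus-20240229",
        "max_tokens": 1024,  # required by Anthropic, absent in OpenAI
        "messages": [{"role": "user", "content": "Hello"}],
    }).encode("utf-8"),
    headers={
        "x-api-key": "YOUR_KEY",            # different auth header
        "anthropic-version": "2023-06-01",  # required version header
        "Content-Type": "application/json",
    },
)
```

The response JSON also has a different structure, so an extension that parses OpenAI responses would need a second parsing path as well.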
-
What about supporting other APIs, such as Ollama? Then we would have a choice of LLMs to use, all sitting behind the Ollama API. It is my understanding that Ollama's API is supposed to be OpenAI compatible.
-
@kkohler2, it's already supported; I put this in the README: "It's possible to use a service other than the OpenAI or Azure API, as long as that service is OpenAI API compatible. This way, you can use APIs that run locally, such as Meta's Llama, or any other private deployment (local or not). To do this, simply insert the address of the deployment in the Base API URL parameter of the extension. It's worth mentioning that I haven't tested this possibility myself, so it's a matter of trial and error, but I've already received feedback from people who have done this successfully." As I said above, I've never tested it myself, and I don't know if anyone has tested with Llama specifically, but I think it will work.
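As a sketch of what the Base API URL parameter does, the snippet below builds an OpenAI-compatible `/chat/completions` request against a custom base URL. The `http://localhost:11434/v1` address and `llama3` model name are hypothetical examples for a local Ollama server; substitute whatever your deployment uses.

```python
import json
import urllib.request

def build_chat_request(base_url, model, prompt, api_key="none"):
    """Build an OpenAI-compatible chat completion request.

    base_url plays the role of the extension's Base API URL
    parameter; any OpenAI-compatible server should accept the
    same payload shape.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )

# Example: point at a local Ollama instance instead of api.openai.com
req = build_chat_request("http://localhost:11434/v1", "llama3", "Hello")
# urllib.request.urlopen(req)  # response follows the OpenAI JSON shape
```

Because only the base URL changes, the extension's existing request and response handling works unmodified.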
-
First of all, great work guys. I love the VS extension and it's amazing. Here is an idea for an improvement.
Additional model
Claude 3 Opus, which has performance comparable to GPT-4
https://www.anthropic.com/news/claude-3-family
The default version of Claude 3 Opus has a context window of 200,000 tokens. More code to feed in :)
It seems that the API does not have the country restrictions that the GUI version has. When you sign up for the API, you get some free credits to try it out.
https://www.anthropic.com/api