fix: cohere models running
xavidop committed Oct 23, 2024
1 parent 8c65e30 commit 088f230
Showing 2 changed files with 9 additions and 5 deletions.
8 changes: 5 additions & 3 deletions README.md
@@ -141,13 +141,15 @@ For more detailed examples and the explanation of other functionalities, refer t

This plugin supports all currently available **Chat/Completion** and **Embeddings** models from GitHub Models, including image input and multimodal models.

Still in progress:
1. Support for image output models

## API Reference

You can find the full API reference in the [API Reference Documentation](https://xavidop.github.io/genkitx-github/)

## Troubleshooting

1. GPT o1-preview is still in beta. It does not support system roles, and `temperature` and `topP` need to be set to 1 (see the sketch below). See the OpenAI announcement [here](https://openai.com/index/introducing-openai-o1-preview/).
2. Cohere models only support text output for now. Issue opened [here](https://github.com/orgs/community/discussions/142364).
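
A minimal sketch of a call that respects the o1-preview constraints, assuming the pre-1.0 Genkit `configureGenkit`/`generate` API and that the plugin exposes an `openAIO1Preview` model reference and a `githubToken` option (check the API reference above for the exact names):

```typescript
import { generate } from "@genkit-ai/ai";
import { configureGenkit } from "@genkit-ai/core";
// `openAIO1Preview` is an assumed export name; see the API reference for the exact identifier.
import { github, openAIO1Preview } from "genkitx-github";

configureGenkit({
  plugins: [github({ githubToken: process.env.GITHUB_TOKEN })],
});

const response = await generate({
  model: openAIO1Preview,
  // o1-preview does not accept a system role, so everything goes into the user prompt.
  prompt: "Summarize this plugin in one sentence.",
  config: {
    temperature: 1, // o1-preview currently requires temperature = 1
    topP: 1, // and topP = 1
  },
});

console.log(response.text());
```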

## Contributing

Want to contribute to the project? That's awesome! Head over to our [Contribution Guidelines](https://github.com/xavidop/genkitx-github/blob/main/CONTRIBUTING.md).
6 changes: 4 additions & 2 deletions src/github_llms.ts
@@ -756,17 +756,19 @@ export function toGithubRequestBody(
`${response_format} format is not supported for GPT models currently`,
);
}
const modelString = (request.config?.version || model.version || modelName) as string;
const body = {
body: {
messages: githubMessages,
tools: request.tools?.map(toGithubTool),
model: request.config?.version || model.version || modelName,
model: modelString,
max_tokens: request.config?.maxOutputTokens,
temperature: request.config?.temperature,
top_p: request.config?.topP,
n: request.candidates,
stop: request.config?.stopSequences,
response_format: responseFormat,
// FIXME: Cohere models don't support response_format for now
response_format: modelString.includes("cohere") ? "" : responseFormat,
},
} as any;

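The Cohere guard added to `toGithubRequestBody` above boils down to the pattern in this standalone sketch; `buildRequestBody` and its inputs are illustrative stand-ins, not part of the plugin's actual API:

```typescript
// Illustrative stand-in for the change above: drop `response_format` for Cohere models.
// `buildRequestBody` and its parameters are hypothetical, not exported by genkitx-github.
interface IllustrativeOptions {
  version?: string;
  responseFormat?: { type: "text" | "json_object" };
}

function buildRequestBody(modelName: string, options: IllustrativeOptions) {
  // Prefer an explicit version override, otherwise fall back to the model name.
  const modelString = options.version ?? modelName;

  return {
    model: modelString,
    // Cohere models on GitHub Models reject `response_format` for now,
    // so it is cleared for them and passed through for every other model.
    response_format: modelString.includes("cohere") ? "" : options.responseFormat,
  };
}

// The JSON response format is kept for a GPT model but stripped for a Cohere model.
console.log(buildRequestBody("gpt-4o", { responseFormat: { type: "json_object" } }));
console.log(buildRequestBody("cohere-command-r-plus", { responseFormat: { type: "json_object" } }));
```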
