
Support for litellm provider #147

Closed

sa- opened this issue May 18, 2024 · 1 comment

sa- commented May 18, 2024

What feature(s) would you like to see?

https://github.com/BerriAI/litellm is a popular project that is used as a model router within companies. Adding support for it would be great, and I'm happy to contribute!

Additional information

It would also mean that the UI could be used with other models. This could increase Cohere's popularity, but I understand if that means Cohere might not be interested in supporting this feature.

elaineg self-assigned this May 21, 2024

elaineg (Contributor) commented May 22, 2024

Hi, thanks for the note! You are welcome to add custom models to the toolkit: our documentation explains how to add a custom model deployment here, and we've already added models available via llama-cpp as an example.
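
For anyone wanting to wire LiteLLM in via that path, below is a minimal sketch of what a LiteLLM-backed custom deployment could look like. The `LiteLLMDeployment` class and `invoke_chat` method are hypothetical placeholder names (the toolkit's documentation defines the actual interface to implement); only `litellm.completion` is the real LiteLLM API.

```python
# Rough sketch only: LiteLLMDeployment and invoke_chat are hypothetical names,
# not part of the toolkit's documented interface. litellm.completion is the
# real LiteLLM call; provider API keys are read from the environment by LiteLLM.
from typing import Any

import litellm


class LiteLLMDeployment:
    """Routes chat requests through LiteLLM instead of a single fixed provider."""

    def __init__(self, model: str = "cohere/command-r") -> None:
        # Any model string LiteLLM recognizes, e.g. "gpt-4o-mini" or "ollama/llama3".
        self.model = model

    def invoke_chat(self, message: str, **kwargs: Any) -> str:
        # Send a single-turn chat request through LiteLLM's unified interface
        # and return the assistant's text reply.
        response = litellm.completion(
            model=self.model,
            messages=[{"role": "user", "content": message}],
            **kwargs,
        )
        return response.choices[0].message.content


# Usage (requires the API key for the chosen provider in the environment):
# deployment = LiteLLMDeployment(model="gpt-4o-mini")
# print(deployment.invoke_chat("Hello!"))
```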

elaineg closed this as completed May 22, 2024