This repository has been archived by the owner on Jan 2, 2025. It is now read-only.

Support for custom OpenAI-Compatible backends #1231

Open
keinsell opened this issue Feb 11, 2024 · 3 comments
Labels
feature A new feature

Comments


keinsell commented Feb 11, 2024

What's the problem?

Local large language models can be served through LocalAI, LM Studio, and similar tools, all of which provide an OpenAI-compatible API, but the application needs to expose a setting to change the base URL.

What's the outcome?

  • Ability to use Azure OpenAI services, for people who want alternative hosting to OpenAI.
  • Ability to use custom large language models (e.g. local ones) that are compatible with the OpenAI API.

Related Issues: #415 #1094
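To illustrate the request, here is a minimal, stdlib-only sketch of what a configurable base-URL setting could look like. The `DEFAULT_BASE_URL` constant, the `build_chat_request` helper, and the local port are illustrative assumptions, not part of bloop's actual codebase; the only source-backed point is that OpenAI-compatible servers expose the same `chat/completions` endpoint under a different base URL.

```python
import json
from urllib.parse import urljoin

# Assumed default; any OpenAI-compatible server can be substituted here.
DEFAULT_BASE_URL = "https://api.openai.com/v1/"

def build_chat_request(prompt: str,
                       base_url: str = DEFAULT_BASE_URL,
                       model: str = "gpt-4") -> tuple[str, str]:
    """Build the URL and JSON body for an OpenAI-compatible chat completion call."""
    # Normalize the trailing slash so urljoin appends rather than replaces.
    root = base_url if base_url.endswith("/") else base_url + "/"
    url = urljoin(root, "chat/completions")
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, body

# Pointing the same client code at a local OpenAI-compatible server
# (e.g. LocalAI or LM Studio on a hypothetical local port) is just a
# matter of changing the base URL — which is exactly the setting requested.
local_url, _ = build_chat_request("hello", base_url="http://localhost:1234/v1")
```

With a setting like this, no request-building code changes between hosted OpenAI and a local backend; only the configured base URL does.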

@keinsell keinsell added the feature A new feature label Feb 11, 2024

rodion-m commented Feb 23, 2024

The ability to use custom backends is really needed, along with Azure OpenAI support, even as a paid feature.

@recursionbane

Yes, AzureOpenAI support, please.

@ggordonhall
Contributor

We've open-sourced the OpenAI API logic (https://github.com/BloopAI/bloop/tree/oss/server/bleep/src/llm). You can now build and run bloop with a custom OpenAI API key (see: https://github.com/BloopAI/bloop?tab=readme-ov-file#building-from-source).

Feel free to open a PR adding support for Azure OpenAI 😀
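For anyone picking up that PR, a hedged sketch of how the Azure OpenAI request shape differs from the stock OpenAI API: the deployment name is part of the URL path, the API version is a query parameter, and authentication uses an `api-key` header rather than a Bearer token. The resource name, deployment name, and api-version below are placeholders, not values from this thread.

```python
# Sketch only: builds the Azure OpenAI chat-completions URL shape.
# Resource/deployment names and the api-version are placeholder assumptions.
def azure_chat_url(resource: str, deployment: str, api_version: str) -> str:
    """Azure routes by deployment in the path and api-version in the query."""
    return (f"https://{resource}.openai.azure.com/openai/"
            f"deployments/{deployment}/chat/completions"
            f"?api-version={api_version}")

url = azure_chat_url("my-resource", "my-gpt4-deployment", "2024-02-01")
```

Because only the URL construction and auth header differ, Azure support could plausibly share the request/response handling with the existing OpenAI path.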
