
Support configuring base_url #14

Open
hardliner66 opened this issue Jul 24, 2024 · 1 comment
Comments

@hardliner66

It would be nice if we could configure the base URL; then people could use offline models via Ollama or similar tools.
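A minimal sketch of what this could look like, assuming the tool reads an `OPENAI_BASE_URL` environment variable (the setting the official OpenAI clients honor) and falls back to the default endpoint; the function name `resolve_base_url` is hypothetical. Ollama exposes an OpenAI-compatible API under `/v1` on its default port, 11434:

```python
import os

# Default endpoint used when no override is configured.
DEFAULT_BASE_URL = "https://api.openai.com/v1"

def resolve_base_url() -> str:
    """Return the API base URL, preferring the OPENAI_BASE_URL
    environment variable so requests can be redirected to a local
    server (e.g. Ollama) instead of api.openai.com."""
    return os.environ.get("OPENAI_BASE_URL", DEFAULT_BASE_URL).rstrip("/")

# Example: point the tool at a local Ollama instance.
os.environ["OPENAI_BASE_URL"] = "http://localhost:11434/v1"
print(resolve_base_url())  # http://localhost:11434/v1
```

The same override could equally be a CLI flag or config-file key; the environment variable is just the convention most OpenAI-compatible clients already support.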

@ParaplegicRacehorse

ParaplegicRacehorse commented Sep 9, 2024

I agree. I run my own AI inference servers (several LLMs, two Stable Diffusion instances, a TTS with speech recognition, a voice cloner, a music generator) specifically to avoid sending data into the clutches of Sam Altman and other egomaniacs.

I would like to be able to use my own inference servers.

Even the AI Horde would be preferable to OAI.
