How can I switch to my video card? I want the answers to be generated faster.
Answered by LostRuins on Apr 7, 2023
Replies: 1 comment
This tool currently focuses on CPU inference only. For GPU inference with llama, you may want to check out this repo instead: https://github.com/0cc4m/KoboldAI
Answer selected by LostRuins