
I am trying to get this to work on Linux (Ubuntu 24.04, Zotero 7.04) #9

Open
mjeltsch opened this issue Nov 26, 2024 · 3 comments
Comments

@mjeltsch

I am trying to get this to work on Linux (Ubuntu 24.04, Zotero 7.04), but it does not seem to accept the license key I received via email. In any case, whichever model I choose, it complains about an invalid_api_key.

@papersgpt
Owner

If you choose an OpenAI, Anthropic, or Gemini model, you need to set your own OpenAI, Anthropic, or Gemini API key. The local LLMs are all free; however, they can currently only be used on macOS, as Apple chips are best suited for running LLMs locally.
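A way to narrow this down (my suggestion, not an official PapersGPT step): test the API key outside Zotero. The snippet below assumes an OpenAI key; the `https://api.openai.com/v1/models` endpoint and the `Authorization: Bearer` header are OpenAI's documented auth mechanism. If the key you are pasting is the emailed license key rather than a key from the provider's dashboard, the provider will reject it with exactly this invalid_api_key error.

```shell
# Assumption: you created an API key (OpenAI keys start with "sk-")
# in the provider's dashboard; a PapersGPT license key is not an API key.
export OPENAI_API_KEY="sk-..."   # put your real key here

# Sanity-check the key independently of Zotero:
# - a 200 response with a JSON list of models means the key is valid,
#   so the problem is in the plugin configuration;
# - a 401 response mentioning "invalid_api_key" means the key itself
#   is wrong, regardless of the plugin.
curl -s https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"
```

The same pattern works for Anthropic and Gemini, with their respective base URLs and auth headers.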

@mjeltsch
Author

Is any specific Apple chip required? Or would it work on my old Intel Mac (the last model before they switched to their own chips)?

@ljeagle
Contributor

ljeagle commented Nov 26, 2024

Any Apple chip is fine. The difference is that inference performance will be much better on the M-series chips, as Apple made significant improvements in how the GPU and CPU work together.
