
Bump certifi from 2024.2.2 to 2024.7.4 #19

Closed
wants to merge 35 commits into from

Conversation

@dependabot dependabot bot commented on behalf of github Jul 18, 2024

Bumps certifi from 2024.2.2 to 2024.7.4.

Commits

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    You can disable automated security fix PRs for this repo from the Security Alerts page.

cebtenzzre and others added 30 commits July 18, 2024 16:01
This fixes a crash where ggml_vk_allocate fails in llama_kv_cache_init,
but the exception is never caught.
The correct way to indicate an OOM condition is for alloc_buffer to
return NULL. This fixes undefined behavior caused by passing an
exception over the C boundary.

The rest of the changes help fix VRAM leaks in GPT4All when model
loading fails on GPU.
We haven't implemented the necessary GPU kernels yet.

Fixes this crash:

ggml_vk_graph_compute: error: unsupported op 'ARGSORT'
GGML_ASSERT: /home/jared/src/forks/gpt4all/gpt4all-backend/llama.cpp-mainline/ggml-kompute.cpp:1508: !"unsupported op"
These are Baichuan, Bert and Nomic Bert, CodeShell, GPT-2, InternLM,
MiniCPM, Orion, Qwen, and StarCoder.
This trades a late heap-use-after-free for an early abort, which feels
more correct.
This fixes a regression in commit b2db03a ("llama: replace ngl=0 hack
with llama_model_using_gpu").
cebtenzzre and others added 5 commits July 18, 2024 16:01
ggml_vk_get_tensor_aligned() returns a shared_ptr, not a reference, so
we must copy the value.
Eagerly freeing the instance when we are done with it is simple, but
incurs an overhead, and more importantly, causes test-backend-ops
crashes on the current proprietary NVIDIA driver.

Instead, we now only cleanup device resources without freeing the device
unless we actually need to change devices. And even when we free the
device, we do not free the instance. We only free the instance when both
the backend and all buffers have been unreferenced.
test-backend-ops hit assertion failures in ggml_vk_graph_compute because
of ops we do not yet support. Some of the checks have to be made more
restrictive because of features that were added to llama.cpp.

We also claimed to not support no-op operations on certain data types,
even though they are actually supported on all data types. There are now
243 passing tests, instead of 150 without the fixes for false
negatives. This also fixes complaints during LLM inference about
unsupported NONE operations for the output tensor.
Bumps [certifi](https://github.com/certifi/python-certifi) from 2024.2.2 to 2024.7.4.
- [Commits](certifi/python-certifi@2024.02.02...2024.07.04)

---
updated-dependencies:
- dependency-name: certifi
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <[email protected]>
@dependabot dependabot bot added the dependencies Pull requests that update a dependency file label Jul 18, 2024
@cebtenzzre cebtenzzre force-pushed the master branch 2 times, most recently from 87863ac to 2bae44a Compare July 19, 2024 18:39
@cebtenzzre cebtenzzre closed this Aug 14, 2024
dependabot bot commented on behalf of github Aug 14, 2024

OK, I won't notify you again about this release, but will get in touch when a new version is available. If you'd rather skip all updates until the next major or minor version, let me know by commenting @dependabot ignore this major version or @dependabot ignore this minor version.

If you change your mind, just re-open this PR and I'll resolve any conflicts on it.

@dependabot dependabot bot deleted the dependabot/pip/certifi-2024.7.4 branch August 14, 2024 16:06