According to the docs, `uv cache prune --ci` "removes all pre-built wheels and unzipped source distributions from the cache".

Our project uses a few large libraries which take a long time to download from PyPI. For example, torch and its dependencies (several nvidia packages) make up 2 GB of files that need to be downloaded (~5 minutes to download and install with uv). Would it be possible to make the `--ci` flag not clear very large downloaded packages? Alternatively, an argument could be added to `uv cache prune` that doesn't clear packages over a given size?
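For context, our CI caching step looks roughly like the sketch below (a minimal GitHub Actions example; the action version, cache key, and paths here are illustrative assumptions, not copied verbatim from our workflow):

```yaml
# Minimal sketch of a uv cache setup in GitHub Actions.
# The key and path are assumptions; adjust to your runner/layout.
- uses: actions/cache@v4
  with:
    path: ~/.cache/uv          # uv's default cache location on Linux
    key: uv-${{ runner.os }}-${{ hashFiles('uv.lock') }}

- run: uv sync

# Shrink the cache before it is uploaded. This is the step that
# currently also evicts the large torch/nvidia wheels.
- run: uv cache prune --ci
```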
If we do anything here, the CLI flag for a size limit is probably a good idea either way, as an override for any behaviour we pick.
Out of curiosity, did you measure how long it takes to upload and download everything from your cache (e.g. if you don't use `prune --ci`)? That kind of information is the crux of whether doing this would actually improve performance.
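For reference, one rough way to compare the two costs (a sketch only; it assumes uv's default cache location on a Linux runner, and the exact commands are illustrative):

```sh
# 1. Cold install: nothing cached, everything downloaded from PyPI.
uv cache clean
time uv sync

# 2. Warm install: reinstall from a fully populated local cache.
time uv sync --reinstall

# In CI, also compare against the actions/cache restore/save step
# durations reported in the job log -- that is the cache
# upload+download time in question.
```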