As was mentioned on #8:

I think that's a fairly solid point. However, I'm not sure it's that useful to manually compress/optimise them via a PR. It would be better if we could automate this somehow; any ideas?

We could set up a GitHub Actions workflow that compresses any new image, I guess. What we should really be looking at, though, is keeping smaller copies of the same image, e.g. 64x64 and 128x128 variants.
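For concreteness, a rough sketch of what such a workflow could look like is below. The tool choices (optipng for lossless compression, ImageMagick for the downscaled copies) and the `_64`/`_128` naming scheme are illustrative assumptions, not settled decisions:

```yaml
# Hypothetical sketch; optipng, ImageMagick, and the *_64/*_128 naming
# are assumptions for illustration only.
name: optimize-images
on:
  pull_request:
    paths:
      - '**.png'
jobs:
  optimize:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install image tools
        run: sudo apt-get update && sudo apt-get install -y optipng imagemagick
      - name: Compress and generate downscaled copies
        run: |
          for f in $(git ls-files '*.png'); do
            optipng -o5 "$f"                                  # lossless recompression
            convert "$f" -resize 64x64   "${f%.png}_64.png"   # 64x64 copy
            convert "$f" -resize 128x128 "${f%.png}_128.png"  # 128x128 copy
          done
```

Note that this only shows the processing step; a real workflow would also have to commit the results back, or fail the check when files come out modified.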
I've had this in my notifications inbox for three months now and I don't want it to fade into obscurity, so here's a tracking issue to get the ball rolling on the discussion.
After giving it more thought, I think it would be best to have a pre-commit hook that compresses and downscales images, with a GitHub Actions check verifying that the hook has actually been run.
This way we can make sure no unoptimized images, and no images missing their downscaled copies, get committed.
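Something along these lines, perhaps; the local hook, the hook id, and the optipng invocation are illustrative assumptions:

```yaml
# Hypothetical .pre-commit-config.yaml sketch; the optipng tool choice and
# the -o5 level are assumptions.
repos:
  - repo: local
    hooks:
      - id: optimize-png
        name: losslessly compress PNGs
        language: system
        entry: optipng -o5
        files: '\.png$'
```

The GitHub Actions side can then run `pre-commit run --all-files` and fail if that leaves any file modified, which is exactly the "has it been run" check.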
That said, I still want to reiterate that a repository like this will be massive to download; Git isn't made for this kind of data. Suggesting partial (blobless) clones instead, which lazily fetch older blobs so that effectively only the latest version of each file is downloaded ahead of time, would in my opinion be more sensible.
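For reference, assuming a placeholder repository URL, the two flavours of reduced clone look like this:

```sh
# Partial ("blobless") clone: full commit history, but the file contents of
# older revisions are only fetched lazily, on demand.
git clone --filter=blob:none https://github.com/OWNER/REPO.git

# Shallow clone: truncates history to the latest commit instead.
git clone --depth=1 https://github.com/OWNER/REPO.git
```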
Random guy here with absolutely no context: I think one easy option might be imgbot. Yes, it does create a new PR, but it's mostly automatic and only triggers when something relevant changes.