Rongchai Wang
Jan 17, 2026 09:16
GitHub introduces rate limiting for Actions cache entries at 200 uploads per minute per repository, addressing system stability concerns from high-volume uploads.
GitHub has implemented a new rate limit on its Actions cache system, capping uploads at 200 new cache entries per minute for each repository. The change, announced January 16, 2026, targets repositories that were hammering the cache system with rapid-fire uploads and causing stability problems across the platform.
Downloads remain unaffected. If your workflows pull existing cache entries, nothing changes. The limit specifically targets the creation of new entries—a distinction that matters for teams running parallel builds that generate fresh cache data.
Why now? GitHub cited “cache thrash” as the culprit. Repositories uploading massive volumes of cache entries in short bursts were degrading performance for everyone else on the shared infrastructure. The 200-per-minute cap gives heavy users enough headroom for legitimate use cases while preventing the kind of abuse that was destabilizing the system.
Part of a Broader Actions Overhaul
This rate limit arrives amid several significant changes to GitHub Actions economics. Earlier this month, GitHub reduced pricing on hosted runners by 15% to 39% depending on size. But the bigger news hits March 1, 2026, when self-hosted runner usage in private repos starts costing $0.002 per minute—a new charge that’s pushing some teams to reconsider their CI/CD architecture entirely.
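To put the new per-minute charge in rough perspective, here is a minimal cost sketch. The $0.002/minute rate is from the announcement; the monthly usage figures are illustrative assumptions, not GitHub data.

```python
# Rough cost sketch for the self-hosted runner charge starting March 1, 2026.
# The rate is from GitHub's announcement; the usage numbers are hypothetical.
RATE_PER_MINUTE = 0.002  # USD per minute, self-hosted runners in private repos

def monthly_runner_cost(minutes_per_month: float) -> float:
    """Estimated monthly charge for self-hosted runner usage."""
    return minutes_per_month * RATE_PER_MINUTE

# A team running 500 CI minutes a day, 30 days a month:
minutes = 500 * 30  # 15,000 minutes
print(f"${monthly_runner_cost(minutes):.2f}")  # $30.00 per month
```

Small numbers per repo, but across an organization with dozens of busy private repositories the charge adds up, which is why some teams are rethinking their CI/CD architecture.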
The cache system itself got an upgrade in late 2025, with repositories now able to exceed the previous 10 GB limit through pay-as-you-go pricing. Every repo still gets 10 GB free, but heavy users can now buy more rather than constantly fighting eviction policies.
What Teams Should Check
Most workflows won’t notice this limit. But if you’re running matrix builds that generate unique cache keys across dozens of parallel jobs, do the math. A 50-job matrix whose jobs each save four cache entries produces 200 uploads; if those jobs all complete within the same minute, the repository is at the cap.
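That back-of-the-envelope check can be sketched directly. The 200-entries-per-minute cap is from the announcement; the job count, entries per job, and completion window are illustrative assumptions.

```python
# Estimate whether a parallel matrix build could exceed GitHub's cap of
# 200 new cache entries per minute per repository (per the announcement).
CAP_PER_MINUTE = 200

def peak_upload_rate(jobs: int, entries_per_job: int, window_seconds: float) -> float:
    """New cache entries per minute if all jobs finish within the given window."""
    total_entries = jobs * entries_per_job
    return total_entries * 60 / window_seconds

# 50 jobs, each saving 4 cache entries, all finishing within 60 seconds:
print(peak_upload_rate(50, 4, 60))  # 200.0 -- exactly at the cap

# The same matrix completing within 45 seconds pushes past it:
print(peak_upload_rate(50, 4, 45) > CAP_PER_MINUTE)  # True
```

The takeaway: the danger zone is not total cache volume but burst rate, so wide matrices with short, uniform job durations are the workflows to check.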
The fix is straightforward: consolidate cache keys where possible, or stagger job completion if you’re genuinely bumping against the ceiling. GitHub hasn’t announced any monitoring dashboard for cache upload rates, so teams concerned about hitting limits will need to audit their workflow logs manually.
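Short of combing through workflow logs, one place to start an audit is the existing REST endpoint for listing a repository's cache entries (`GET /repos/{owner}/{repo}/actions/caches`). It reports entries, not upload rates, but grouping `created_at` timestamps can approximate burst behavior. A minimal sketch; the owner, repo, and token values are placeholders:

```python
# Sketch: list a repository's Actions cache entries via the GitHub REST API.
# Endpoint: GET /repos/{owner}/{repo}/actions/caches
import json
import urllib.request

API_ROOT = "https://api.github.com"

def list_caches_url(owner: str, repo: str, per_page: int = 100) -> str:
    """Build the URL for listing a repository's Actions cache entries."""
    return f"{API_ROOT}/repos/{owner}/{repo}/actions/caches?per_page={per_page}"

def list_caches(owner: str, repo: str, token: str) -> dict:
    """Fetch cache entries; the payload includes total_count and actions_caches."""
    req = urllib.request.Request(
        list_caches_url(owner, repo),
        headers={
            "Accept": "application/vnd.github+json",
            "Authorization": f"Bearer {token}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example usage (requires a real token with repo access):
# data = list_caches("octocat", "hello-world", token="ghp_...")
# for c in data["actions_caches"]:
#     print(c["key"], c["size_in_bytes"], c["created_at"])
```

Bucketing the `created_at` values by minute gives a crude picture of how close past runs have come to the 200-entry ceiling.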