In light of the recent update about the removal of the rsync option, I was wondering whether providing access over BitTorrent was considered.
I would imagine that the team that handles the infrastructure may have thoughts on the topic, but my immediate reaction is that it's probably not of significant value for the actual OpenWrt images because:
- They are generally fairly small files (especially given that they target devices with as little as 8 MB of flash storage).
- There are thousands of supported devices and they all require bespoke images, so the individual files for a specific piece of hardware may have few or no peers seeding them.
- It doesn't help with customized images that need to be built by an ASU (attended sysupgrade) server.
Compare that against the standard Ubuntu image, which is roughly 700 times larger (5.9 GB for Ubuntu 24.04.3 LTS Desktop x86 vs ~8 MB for a common embedded device default image), and where, in Ubuntu's case, the single install image is shared across the vast majority of x86 hardware.
That said, it could be really useful for the OpenWrt development environment... the SDK is much larger, and there are far fewer target architectures than there are target devices.
Honestly, I don't know. There are mirrors, but maybe they don't absorb enough of the download traffic while still generating load of their own just to stay up to date. The mailing list thread that you linked talked about mirroring only the stable releases (not snapshots) in an effort to reduce the load.
But really, I'm not in a good position to answer the question since I'm not responsible for any of the infrastructure considerations.
As an end user, why would you use rsync for your private images?
- The online imagebuilder can only fetch from the official locations.
- The offline imagebuilder is configured for the official repositories; changing that is possible, but not really a documented approach, and it needs a local webserver on your end (sketch below).
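For completeness, a minimal sketch of what that undocumented approach could look like, assuming a 23.05.3 ath79/generic ImageBuilder and that ~/owrt-mirror mirrors the downloads.openwrt.org path layout (the directory names, port, profile, and package list here are placeholders, not a recommendation):

```
# Hypothetical setup: point the offline ImageBuilder at a local mirror.
cd openwrt-imagebuilder-23.05.3-ath79-generic.Linux-x86_64

# Serve the mirrored feeds over plain HTTP; any static webserver works.
python3 -m http.server 8000 --directory ~/owrt-mirror &

# Rewrite the feed URLs in repositories.conf to the local server.
sed -i 's|https://downloads.openwrt.org|http://127.0.0.1:8000|g' repositories.conf

# Build as usual.
make image PROFILE=tplink_archer-c7-v2 PACKAGES="luci"
```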
Pretty much anything I can think of needs at most 2-5 files (firmware image and maybe some kernel modules to get online), under 20 MB in total.
Keep in mind that the content of the snapshot repositories changes once daily, and even the stable release feeds change regularly, so there's little to be gained from a local cache of the public repositories.
If you build from source yourself, you will have to build 'everything' anyway.
"No one" (famous last words) needs to mirror everything (unless they're offering a public mirror) - and de-/selecting what you do want from a torrent is more work than basically any other imaginable method, all for 2-5 files at <20 MB total.
Yes, rsync is a great method and it's sad that it has to be disabled. Blame the AI companies; that's why we can't have good things.
rsync isn’t being turned off; it’s being limited to several “tier 1” clients. You can then rsync from them.
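In other words, bulk syncing should still be possible, just one hop removed; a sketch, assuming one of those tier 1 hosts exposes an rsync module (the hostname and module path below are placeholders, not a real endpoint):

```
# Hypothetical: sync a release tree from a tier 1 mirror instead of
# downloads.openwrt.org; --delete keeps the local copy exact.
rsync -av --delete \
    rsync://mirror.example.org/openwrt/releases/23.05.3/ \
    ./openwrt-releases/23.05.3/
```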
We did torrents in Gargoyle for a while; it doesn’t work. Images are produced so frequently that you don’t end up with enough peers, so the brunt of the bandwidth ends up falling on the main host anyway.
For releases it might work… but why would I download multiple GBs of images just to get the one that I want? I’m not that generous, honestly.
Maybe I'm overthinking this, but when I read about rsync being disabled, my first thought was: “ok, they were trying to use rsync because the main server is overloaded with downloads, and rsync is also overloading it”.
Is this thinking wrong, and is the download server doing fine with end-user downloads?