Just had this idea pop up in my mind. Instead of relying on volunteers mirroring package repositories all around the world, why not use the BitTorrent protocol to move at least some of the load onto the users, and thus increase download speeds as well as decrease latency?
BitTorrent would likely increase latency, not lower it. The BitTorrent protocol is very inefficient for small files and large numbers of files (https://wiki.debian.org/DebTorrent - see “Problems”).
But I think your question is more “why not use P2P to download files”, to which I think the answer is likely “because they don’t need to.” It would add complication and overhead to maintain. An FTP/HTTP server is pretty simple to set up and maintain, and the tools to do so already exist. You can use round-robin DNS to gain some redundancy and a bit of load spreading without much effort either.
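Round-robin DNS really is as simple as publishing several A/AAAA records for one mirror hostname. Here is a minimal Python sketch of the client-side view (the hostname below is a placeholder, not a real mirror):

```python
import socket

# With round-robin DNS, one hostname maps to several A/AAAA records and
# clients spread across them simply by using whichever address the resolver
# hands back. "mirror.example-distro.org" is a placeholder; substitute the
# alias of a real mirror pool to see its records.
addrs = socket.getaddrinfo("mirror.example-distro.org", 80,
                           type=socket.SOCK_STREAM)
for family, _, _, _, sockaddr in addrs:
    print(sockaddr[0])
```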
Most common/relevant/larger distros do that, at least for the install/live ISOs.
One reason is privacy, and hence security. If you share a package, you also share the information that your system contains the outdated package “xy”, which has a backdoor and can be exploited by an attacker.
I’m not sure if that is a valid argument with atomic image distros since you share the whole image. And the tracker could just disable the old image as soon as the new image arrives.
Nix has an open issue on integrating IPFS support.
There’s also an old tutorial.
Guix supports it now, as does nerdctl for OCI images.
Because HTTP is simpler, faster, easier, and more reliable.
The motivation for a lot of P2P is to make it harder to shut down, but there is no danger of that for Linux distros. The other motivation would be to save money, but Debian/Arch/etc. get more than enough bandwidth/server donations, so they’re not paying for that anyway.
There is an apt variant that can do this, but nobody uses it. BitTorrent isn’t great for lots of small files, overhead-wise.
IPFS is better for this than torrents. The question is always “how much should the client seed before it stops, and how long should it keep trying before giving up?” I agree something like this should exist; I have no problem quickly re-donating any bandwidth I use.
That’s actually a really interesting idea. Windows even does (or at one point did) something similar with system updates.
Peer-to-peer packages would have some privacy and potential security issues, of course, but I like the thought.
Over time I’ve seen several groups tinker with P2P protocols for packages, most recently using GNUnet/IPFS for Guix packages. But I’ve never seen a working, integrated system. Weird…
Metallica ruined it. They made it seem as though torrenting was evil because their content was being downloaded. Poor babies.
Lars ruined Napster. BitTorrent came around some time later, after LimeWire, Soulseek, and DirectConnect. Lars might have had something to say about BitTorrent, but by that point no one was listening.
Besides, back then we really were using BitTorrent mostly for Linux ISOs. At the time it was more reliable than HTTP. It really sucked having to download an entire ISO again because it failed the checksum; BitTorrent alleviated that.
FWIW the “opposite”, namely webseeding, exists (http://bittorrent.org/beps/bep_0019.html), so… maybe some distros already do this and it just isn’t noticed, because the webseeds (the mirrors) handle most of the load?
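To make the webseed idea concrete, here’s a rough Python sketch of what BEP 19 actually adds: the .torrent metainfo simply carries a `url-list` of plain HTTP mirrors next to the usual announce/info fields, so clients that understand the extension can pull pieces from a mirror and from peers at the same time. The file name, tracker, and mirror URLs below are made up, and the bencoder is only a minimal one covering the types used here.

```python
import hashlib
import os

# Rough sketch of a BEP 19 (webseed) torrent: the metainfo carries a
# "url-list" of plain HTTP mirrors alongside the usual announce/info fields.
# All names and URLs are illustrative; it assumes "distro.iso" exists locally.

PIECE_LEN = 256 * 1024  # 256 KiB pieces

def bencode(obj):
    # Minimal bencoder covering only the types used in this metainfo dict.
    if isinstance(obj, int):
        return b"i%de" % obj
    if isinstance(obj, bytes):
        return b"%d:%s" % (len(obj), obj)
    if isinstance(obj, str):
        return bencode(obj.encode())
    if isinstance(obj, list):
        return b"l" + b"".join(bencode(x) for x in obj) + b"e"
    if isinstance(obj, dict):
        out = b"d"
        for k in sorted(obj):  # bencoded dict keys must be sorted
            out += bencode(k) + bencode(obj[k])
        return out + b"e"
    raise TypeError(f"cannot bencode {type(obj)}")

def make_webseed_torrent(path, tracker, webseeds):
    with open(path, "rb") as f:
        data = f.read()
    # SHA-1 hash of every piece, concatenated, as the metainfo format requires.
    pieces = b"".join(
        hashlib.sha1(data[i:i + PIECE_LEN]).digest()
        for i in range(0, len(data), PIECE_LEN)
    )
    meta = {
        "announce": tracker,
        "url-list": webseeds,  # BEP 19: HTTP seeds the client may fetch from
        "info": {
            "name": os.path.basename(path),
            "length": len(data),
            "piece length": PIECE_LEN,
            "pieces": pieces,
        },
    }
    return bencode(meta)

if __name__ == "__main__":
    torrent = make_webseed_torrent(
        "distro.iso",
        "http://tracker.example.org/announce",
        ["https://mirror1.example.org/distro.iso",
         "https://mirror2.example.org/distro.iso"],
    )
    with open("distro.iso.torrent", "wb") as f:
        f.write(torrent)
```

The appeal is that a distro could keep publishing ordinary HTTP mirrors and layer the swarm on top of them, rather than the other way around.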
Another thing not mentioned yet is maintenance overhead. These distros operate around the clock, all over the world, with talent from the likes of RH and co. Far fewer of the people who run your mirrors know how to maintain a torrent tracker (or similar), and on top of that I haven’t really seen any good BitTorrent caching methods. Support would also need to be added to your package manager of choice.
It also comes down to most clients having asymmetric bandwidth, and to the fact that most users do not have every package installed, so each can only distribute a very small slice of the total distro. Those users probably don’t want to be constantly uploading, either. I also can’t imagine torrents are much fun to work with when it comes to distributing constantly changing package-manager metadata.
At least Kali and Arch do
Doesn’t Arch rely on mirrors to distribute packages?
You’re right - I misunderstood the question and thought you meant the distribution images