The problem is that static libraries are actually more likely to break across time in practice, since "the system" is more than just "syscalls".
For example, in places where a filesystem-related "standard" has changed, I have old static binaries that fail to start entirely, whereas same-dated dynamically linked binaries just need me to actually install their dependencies.
I am convinced that every argument in favor of static linking comes from people who don't know how to package-manager.
Nobody knows how to use the package manager. What happens in practice is that every single program uses the package versions the distro happens to ship with.
If you want a newer version, too bad - your OS doesn't ship it, so better luck in the next release. Or you can set up a private repo and either ship a binary with the dependencies included (shipping half the userland with your audio player), or package the newer version of the library, which will unwittingly break half your system - if not today, then surely at the next distro upgrade.
It speaks volumes about Linux package management woes that no vendor ships anything analogous to brew or chocolatey.
- In Homebrew, packages are useful units of software; in apt, they are both end-user apps and dependency libraries (and weird stuff, like separate header packages)
- Homebrew packages are installed for the user, apt is system-wide
- Homebrew packages are updated and maintained usually by people involved with the software they are shipping, apt packages are usually maintained by the distro
- As a result, Homebrew packages almost always work and are almost always up to date, while apt ones often aren't (stuff like go, node, etc.)
Apt is like a weird Frankenstein's monster of npm, the system updater, and an app store - and all of it with global scope.
It's not all-or-nothing, though. If you need a dependency that isn't widely available on distros, then statically link it. It's fine. No big deal. But if you actually care about being a responsible maintainer, make sure you follow new releases of that dependency, and release new versions of your app (with the new version of the dependency) if there's an important bug fix to it.
If you're linking to libX11 or libgtk or something else that's common, rely on the distro providing it.
I really don't get all the anti-shared-library sentiment. I've been using and developing software for Linux for a good 25 years now, and yes, I've certainly had issues with library version mismatches, but a) they were never all that difficult to solve, and b) I think the last time I had an issue was more than a decade ago.
While I think my experience is probably fairly typical, I won't deny that others can have worse experiences than I do. But I'm still not convinced statically linking everything (or even a hybrid approach where static linking is more common) would be an overall improvement.
What also helps is pacing. If your core dependencies update every week with API-breaking changes, something like Debian is a poor choice if you don't statically link. Some people don't need the latest updates and are fine with security and bug fixes.
The sane thing here is to maintain a clear notion of what the "OS" is versus the "app", and use dynamic linking on that boundary, but not elsewhere. Which is more or less how Windows and macOS do things.