
Is it? I'm using LXC containers, but that's mostly because I don't want to run VMs on my devices (not enough cores). I've noted down the steps to configure them, so if I ever have to redo it I can turn them into a shell script. I don't see the coordination problem if you choose one distro as your base and then provision the containers with shell scripts or Ansible. Shipping a container instead of a plain build is like shipping an Electron app instead of a native desktop app: optimizing for developer time instead of user resources.
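Roughly the sort of script I mean (just a sketch, assuming the LXD-style lxc client and a Debian image; the container name and packages are placeholders):

  #!/bin/sh
  # Sketch only: create a container and provision it with a few commands.
  set -eu
  NAME=web01                                    # placeholder container name
  lxc launch images:debian/12 "$NAME"           # create and start the container
  lxc exec "$NAME" -- apt-get update
  lxc exec "$NAME" -- apt-get install -y nginx  # example package
  lxc file push ./nginx.conf "$NAME/etc/nginx/nginx.conf"  # example config file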



> if you choose one distro as your base

Yes, obviously if you control the whole stack then you don't really need containers. If you're distributing software that's intended to run on Linux in general rather than specifically on RHEL/Ubuntu/whatever, then you can't rely on any particular userspace or packaging format, so that's when people go to containers.

And of course if part of your infrastructure is on containers, then there's value in consistency, so people go all the way. It introduces a lot of other problems but you can see why it happens.

Back in around 2005 I wasted a few years of my youth trying to get the Linux community on board with multi-distro thinking and unified software installation formats. It was called autopackage, and developers liked it. It wasn't the same as Docker: it focused on trying to reuse dependencies from the base system, because static linking was badly supported and the kernel didn't yet have the features needed to do containers properly. Distro makers hated it though, and back then the Linux community was way more ideological than it is today. Most desktops ran Windows, MacOS was a weird upstart thing with a nice GUI that nobody used and nobody was going to use, and most servers still ran big-iron UNIX. The community was mostly made up of true believers who had convinced themselves (wrongly) that the way the Linux distro landscape had evolved was a competitive advantage and would lead to inevitable victory for GNU-style freedom. I tried to convince them that nobody wanted to target Debian or Red Hat, they wanted to target Linux, but people just told me that static linking was evil, that Linux was just a kernel, and that I was an idiot.

Yeah, well, funny how that worked out. Now most software ships upstream, targets Linux-the-kernel, and just ships a whole "statically linked" app-specific distro along with itself. And nobody really cares anymore. The community became dominated by people who don't care about Linux as such; it's just a substrate, they just want their stuff to work, so they standardized on Docker. The fight went out of the true believers who had pushed back against such trends.

This is a common pattern when people complain about egregious waste in computing. Look closely and you'll find the waste often has a sort of ideological basis to it. Some powerful group of people became subsidized so they could remain committed to a set of technical ideas regardless of the needs of the user base. Eventually people find a way to hack around them, but in an uncoordinated, undesigned and mostly unfunded fashion. The result is a very MVP set of technologies.



