23
Condor
6y

Arch Linux is so overrated. Just a little while ago I did pacman -Syu and dhcpcd broke. Bleeding edge is all fine with me, but at least MAINTAIN THE FUCKING DISTRO PROPERLY!!!
Well, guess I'll have to redeploy that LXC with a different OS then. Probably Ubuntu Server or something like that.

Comments
  • 12
    I once updated and it broke curl, which then broke pacman, and I was stuck without a package manager.
  • 9
    Bleeding edge without blood would be cutting edge.
  • 6
    @Fast-Nop Whatever you want to call it.

    Cutting edge or bleeding edge, the point is that the idea that the latest software implies instability is wrong. Just like Debian attempting to justify its ancientness with "stability", Arch attempting to justify its instability with cutting-edgeness is a fallacy.

    I'm no stranger to compiling custom kernels. Back when I used the Arch distribution kernels on my laptop, every now and then an upgrade to the distribution kernel would break the system. However, when I compiled the same version of the upstream kernel using my own config, it'd be stable... and remain stable across all releases (see the build sketch at the end of the thread). At some point I even indulged in the rc releases (aka mainline) in order to be even more bleeding edge than Arch. And guess what... it almost never created any issues.

    So bleeding edge is not an excuse. The issue is their inability to compile a piece of software and do basic sanity checks against the rest of their system.
  • 3
    @Condor I think your last paragraph nailed it down. If you run comprehensive testing, it takes time. The more you test, the more time it takes.

    The extremes are Debian on one side, which tests for so long that by the time something is considered passed, it is already ancient. The other extreme is not testing at all.
  • 1
    If you want bleeding edge you have to accept the bleeding. If you're really conservative though, you could go with Slackware.
  • 5
    @Fast-Nop But how long would it take to check whether the compiled version still does what it's supposed to? Especially with command-line software that follows the Unix and KISS principles: do one job and do it well. In the case of dhcpcd, that job is getting a DHCP lease (see the test sketch at the end of the thread). It'd be safe to assume that even on a powerful compilation server, compiling takes far longer than running the result and seeing whether it works before committing it to its respective repository (core, extra, community), wouldn't it?
  • 3
    @d4ng3r0u5 Arch isn't bleeding edge. It compiles the stable releases from upstream; it doesn't build from development branches, those are reserved for the AUR as -git packages. So no, in Arch I do not want the bleeding. I want its maintainers to do it properly, because that's why I chose Arch: distribution packages that have been compiled for me. Failing that, I'd go with LFS and do it myself.
  • 2
    @Haxk20 I don't expect the code to be entirely read and verified or anything like that; that's a job reserved for upstream, and the developers there (especially for crucial components like linux, glibc, grub, systemd, etc.) usually do take care of it. Proof of that is in the many custom kernels I build that never break... and that's without a fancy compilation server. What I do expect from Arch's maintainers is to take a couple of minutes after compilation to see whether it does what it's supposed to, in the most basic sense, by running it in a test environment instead of mindlessly pushing it and breaking others' systems.
  • 4
    You can downgrade packages pretty easily: sudo pacman -U /var/cache/pacman/pkg/package-oldversion.pkg.tar.gz (see the downgrade sketch at the end of the thread).
  • 1
    Honestly, I'm not sure what you expect, because I for sure do not expect Arch maintainers to test all packages in all combinations. Otherwise you would not have a rolling release distro: while someone is testing some package X, there will be a new release of package Y, so what are they supposed to do, retest all the packages against each other?

    My guess is that the main testing is done by the developers in the sense of "works on my configuration/my box, we can push it", and any issues that appear later will be reported by users. Yes, this might and will break things, but is that unexpected when using Arch? So you either have to live with it or switch distros.
  • 3
    @RantSomeWhere yes and yes, but I don't want to mindlessly evangelise Arch. So when I find an issue like this, I rant about it.
  • 1
    @Condor If you're drunk, please keep that account alive :P

    I think I remember seeing that you deleted it in a moment of anger towards technology, sooo...

    Of course I'm just kidding, I hope you'll be able to fix your issue :)
  • 4
    Funny how the hardcore Arch fans are so quick to defend their masturbatory exercise in IT support; it's almost like they have nothing better to do with their time, so they waste it trying to fix broken, untested shit.
    What the fuck is the point of continuous integration then? What is the point of testing? What if someone started releasing packages with malicious code swept under the rug?
    I bet 90% of them wouldn't ever notice until someone else told them, because they're so used to automatically updating everything without checking that they don't even know what an update is about.
    Wtf is the point of all this? Don't you have other work to do with your PC?
  • 0
    Am I such a unicorn that my Arch never broke? If it did, I'd switch immediately, but it has been perfectly stable for me.
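
Build sketch for the custom-kernel comment above — a minimal, hedged example of compiling an upstream stable kernel while reusing an existing config. The version number is illustrative, reading /proc/config.gz requires CONFIG_IKCONFIG_PROC, and the bootloader update step varies per distro:

    # fetch and unpack an upstream stable release (version is illustrative)
    curl -O https://cdn.kernel.org/pub/linux/kernel/v6.x/linux-6.6.tar.xz
    tar xf linux-6.6.tar.xz && cd linux-6.6

    # reuse the running kernel's config and default any new options
    zcat /proc/config.gz > .config        # needs CONFIG_IKCONFIG_PROC
    make olddefconfig

    # compile, install modules and the kernel image, then update the bootloader
    make -j"$(nproc)"
    sudo make modules_install
    sudo make install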
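
Test sketch for the dhcpcd comment above — a rough way to smoke-test a freshly built dhcpcd without touching the real network configuration, using a scratch network namespace and dnsmasq as a throwaway DHCP server. Interface names, addresses and dnsmasq flags are illustrative; dhcpcd's -T (test) mode should print the offered lease without applying it:

    # scratch namespace plus a veth pair to run DHCP over
    sudo ip netns add dhcptest
    sudo ip link add veth0 type veth peer name veth1
    sudo ip link set veth1 netns dhcptest
    sudo ip addr add 10.0.0.1/24 dev veth0
    sudo ip link set veth0 up
    sudo ip netns exec dhcptest ip link set veth1 up

    # throwaway DHCP server on the host end (DNS disabled via --port=0)
    sudo dnsmasq --no-daemon --port=0 --interface=veth0 --bind-interfaces \
         --dhcp-range=10.0.0.10,10.0.0.20,12h &

    # ask the freshly built binary for a lease in test mode
    sudo ip netns exec dhcptest ./dhcpcd -T veth1

If the build works, the last command should print an offer from 10.0.0.1 within a second or two; clean up by killing dnsmasq and running sudo ip netns del dhcptest.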
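
Downgrade sketch for the pacman -U comment above — assuming the previous build is still in the local package cache; the exact file name and extension depend on when the package was built:

    # see which dhcpcd builds are still cached locally
    ls /var/cache/pacman/pkg/dhcpcd-*

    # reinstall the previous one (file name is illustrative)
    sudo pacman -U /var/cache/pacman/pkg/dhcpcd-9.4.1-1-x86_64.pkg.tar.zst

    # optionally hold it back until the regression is fixed,
    # by adding it to the IgnorePkg line in /etc/pacman.conf:
    # IgnorePkg = dhcpcd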