Linux sucks, and there’s nothing you can do about it

Recently I started working on a startup tech business to implement some of my tech ideas. I got a workspace on campus at the University of Memphis: a desk and a computer in a shared area where other entrepreneurs work. The computer they provided is an iMac, configured for the campus network, and it forces me to sign in with my University credentials.

In order to do my work, however, I need complete control over the computer so that I can install software and develop programs. My idea was to bring an external drive with me and install a Linux distribution onto it. I pulled a solid-state drive out of my desktop at home, grabbed an SSD-to-USB adapter, and installed a copy of Linux Mint (which is based on Ubuntu, which in turn is based on Debian).

I used the iMac to install the OS, which under normal circumstances should be no problem. Ideally, the installer (running off of a flash drive) runs in RAM and does not modify the iMac’s internal drives. Installing the OS would put the bootloader onto the external drive and the OS installation in a separate partition after the bootloader.

Unfortunately, this did not go as planned. The installer gives users two options: the easy way, where you tell Mint which drive to install to, or the hard way, where you repartition the drive yourself according to the needs of the OS, which requires knowing what the OS is going to do before it does it, a catch-22. So I picked the easy option and selected LVM (Logical Volume Management) as the partition type, so that I could resize partitions easily after installing. LVM lets the user create logical volumes inside of it, which act like partitions, can be resized easily, and can be moved to other drives that are also set up with LVM.

In the advanced mode, it is impossible to set the system up with LVM, as it does not allow creating partitions inside of an LVM partition.
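
For reference, the kind of layout the easy option builds can also be created by hand with the LVM tools. Here is a rough sketch, where the device name /dev/sdb2 and the volume group and logical volume names are placeholders rather than whatever the installer actually uses:

    # Turn the external drive's main partition into an LVM physical volume.
    sudo pvcreate /dev/sdb2
    # Group physical volumes into a volume group.
    sudo vgcreate mint-vg /dev/sdb2
    # Carve logical volumes out of the group (these behave like resizable partitions).
    sudo lvcreate -L 4G -n swap mint-vg
    sudo lvcreate -l 100%FREE -n root mint-vg
    # Resizing later is a one-liner, which is the whole point of choosing LVM:
    sudo lvresize --resizefs -L +20G mint-vg/root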

In the easy installation mode, I selected the external drive, and Mint’s installer put the OS on the selected drive, but then it put the bootloader on the internal hard drive. It does this because the internal drives are marked as “bootable” (the boot flag is set), so the installer assumes the user must want to dual-boot and replaces the internal bootloader. Since the iMac boots using EFI (or UEFI, the newer revision), the firmware decides what to boot from entries stored in NVRAM, a small piece of non-volatile memory on the motherboard that records the available boot devices and the default one. The Mint installer deleted these NVRAM entries and replaced them with its own entry pointing at the OS installation on my external drive. That OS installation in turn pointed at the bootloader partition on the internal drive, and that bootloader would then let me choose which OS to boot.

Installer actions:

  • Overwrote the NVRAM boot entries
  • Overwrote the bootloader on the internal drive
  • Pointed the NVRAM at the external drive
  • Pointed the external drive back to the internal drive’s bootloader

The dysfunctional configuration:

NVRAM --> external drive --> internal drive
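
For anyone in the same situation: the boot entries that EFI stores in NVRAM can be inspected, reordered, and pruned from a Linux live session with efibootmgr. A minimal sketch, with the entry numbers as placeholders:

    # List the current EFI boot entries and the boot order stored in NVRAM.
    sudo efibootmgr -v
    # Put a specific entry (e.g. 0001) first in the boot order.
    sudo efibootmgr -o 0001,0000
    # Delete a bogus entry left behind by an installer (0002 is a placeholder).
    sudo efibootmgr -b 0002 -B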

This is *incredibly* stupid as a default. I reported it to Linux Mint on their forums, but Mint can’t fix it, because they rely on Ubuntu to provide the installer code. So it has to be reported again to Ubuntu, and they might not even fix it, even though it is an obvious logic error with an easy fix.

Boot flag work-around

The internal drive can be set to “unbootable” during installation as a work-around for this installer bug. To do this, open up GParted and change the flags before installing. After installing, reboot into the Linux Live CD (or installed OS) and change the flags back.
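
The same flag change can be made from a terminal in the live session using parted instead of the GParted GUI. A rough sketch, where /dev/sda and partition 1 stand in for whichever internal drive and boot partition the machine actually has:

    # Show drives and partition flags so you can identify the internal boot partition.
    sudo parted -l
    # Clear the boot flag on the internal drive's boot partition before installing...
    sudo parted /dev/sda set 1 boot off
    # ...and restore it afterwards from the live session.
    sudo parted /dev/sda set 1 boot on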

Fixing the University’s Computer

I was unable to fix the University’s computer on my own. After several hours of research, the only fix that would restore the NVRAM and bootloader was to log in as an administrator on the macOS installation, run the “Startup Disk” selection program, and click on the internal drive to repair it, which requires administrator privileges I do not have. The only other option was to re-install the operating system, and that meant giving the machine back to the tech people, who would take two weeks to figure out a two-minute problem and then probably still re-install macOS.

Most operating systems allow users to fix bootloaders by running a tool from an installer CD or USB drive. There is no such tool for macOS.
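
On the Linux side, for example, a broken GRUB installation can usually be repaired from any live USB by mounting the installed system and re-running the bootloader installer. A sketch of the usual procedure, with /dev/sdb1 (EFI partition) and /dev/sdb2 (root) as placeholder device names:

    # Mount the installed system and its EFI partition from the live session.
    sudo mount /dev/sdb2 /mnt
    sudo mount /dev/sdb1 /mnt/boot/efi
    # Bind-mount the virtual filesystems so tools inside the chroot can see the hardware.
    for d in /dev /proc /sys; do sudo mount --bind $d /mnt$d; done
    # Re-install GRUB and regenerate its configuration from inside the installed system.
    sudo chroot /mnt grub-install /dev/sdb
    sudo chroot /mnt update-grub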

Luckily, I managed to get someone who knew the admin password to help me fix the computer.

After Installation

After installation, Linux Mint seemed pretty good. I have tried many other distros and run into plenty of issues after installing them; Mint avoided most of those, but a few major ones showed up.

First, Mint was unable to install certain programs from the app store (Software Manager), including Discord and one other program, because of a missing software package that both of them relied on. The problem eventually went away (I can’t remember why), but it was a problem out of the box.
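
For anyone hitting the same wall, the missing dependency can usually be tracked down from a terminal. A minimal sketch, assuming the application comes from the regular APT repositories; the package name some-app is a placeholder, since I no longer remember which package was actually at fault:

    # Simulate the install to see which dependency APT cannot satisfy.
    apt-get install -s some-app
    # Show the package's declared dependencies.
    apt-cache depends some-app
    # Repair a half-configured or broken dependency state.
    sudo apt-get install -f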

The other major problem is drivers. I intended to use this SSD on a few different computers, including my home desktop and my laptop, so that I could maintain a consistent development environment across different machines. Unfortunately, the drivers for hardware on several of those machines are missing or broken.

WiFi on MacBook Air (mid-2013)

The first problem I noticed was the missing WiFi driver on my laptop (MacBook Air, mid-2013). Because of this, I cannot use Mint on the laptop at all. The internet is so integral to programming that this is a real problem.
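
For what it is worth, the wireless chip can at least be identified from the live session, and the proprietary Broadcom driver can be installed once the package is reachable through some other connection (USB tethering, a USB Ethernet adapter) or a locally copied .deb. A sketch under those assumptions:

    # Identify the wireless chipset (mid-2013 MacBook Airs typically have a Broadcom BCM43xx).
    lspci -nn | grep -i -E 'network|wireless'
    # Install the proprietary Broadcom driver, assuming the package is available
    # through a temporary wired/tethered connection or a downloaded .deb.
    sudo apt-get install bcmwl-kernel-source
    # Load the driver without rebooting.
    sudo modprobe wl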

Sound card on iMac

The sound card on the iMac also was (and still is) not working. After some research into the issue, I found that it has no known fix. Other reports of the same problem show different symptoms in the system programs used to diagnose it.

https://www.linuxquestions.org/questions/linux-hardware-18/no-sound-modules-after-installing-mint-mate-18-a-4175593442/
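
The diagnostics traded back and forth in those threads boil down to a handful of commands. A sketch of the usual checks (the output, of course, varies from machine to machine):

    # List the sound cards ALSA can see; an empty list means the kernel module never loaded.
    aplay -l
    # Identify the audio controller on the PCI bus.
    lspci -nn | grep -i audio
    # Check whether the sound modules loaded and whether they logged errors.
    lsmod | grep snd
    sudo dmesg | grep -i -E 'snd|hda'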

From the research, it becomes apparent that nobody knows what they’re doing, and fixing the problem is a mixture of political red tape, lack of responsibility, and technical incompetence.

What the real problem is

There are two real problems here:

Too many options!

First, there are too many options for how to debug the problem, and none of them are simple. Linux breaks the UNIX philosophy: “Do one thing, do it well, and make things work together.” Debugging a problem in Linux requires far too much research, and the commands relevant to debugging do not do one thing and do it well; there are too many ways to approach solving any given problem. This is a fundamental design flaw in modern UNIXy operating systems and should be a warning to future OS designers about what not to do.

Too much reading!

The other problem is that Linux’s debugging process is command-based. Tools on the command line are terribly inconsistent in their interfaces. The syntax for using the tools is unintuitive and finicky, and the presentation of information is typically organized according to the whims of the developer and overloaded with details that are irrelevant to the user. This forces users to memorize commands, instead of providing a way to debug configuration problems by exploring and inspecting a visual representation of the system’s configuration. While the terminal is a uniform interface, the languages and syntaxes of the programs within it are wildly inconsistent and require too much reading to be used efficiently.
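
To illustrate the inconsistency, here are four everyday commands, none of them exotic, each speaking a different option dialect:

    ps aux                                     # BSD-style options: no dashes at all
    ls -la --human-readable                    # short dashed flags mixed with GNU long options
    dd if=/dev/zero of=/tmp/x bs=1M count=1    # key=value arguments, no dashes
    tar xf archive.tar                         # old-style bundled letters (archive.tar is a placeholder)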

General principle: Linux is not brain-friendly

The general principle behind “Too many options” is that Linux is not compatible with how the brain learns to do things. Likewise, the general principle behind “Too much reading” is a combination of “too much memorization” and “too many irrelevant details”. The UNIXy command lines are hard on the brain’s auditory cortex, temporal lobes, hippocampus, and prefrontal cortex (PFC), and they do not make use of the visual cortex efficiently.

What do these pieces of the brain do? From Wikipedia:

The auditory cortex is the part of the temporal lobe that processes auditory information in humans and many other vertebrates. It is a part of the auditory system, performing basic and higher functions in hearing, such as possible relations to language switching.

Auditory Cortex

The temporal lobe consists of structures that are vital for declarative or long-term memory. The temporal lobe is involved in processing sensory input into derived meanings for the appropriate retention of visual memory, language comprehension, and emotion association.

Temporal Lobe

The hippocampus (from the Greek ἱππόκαμπος, “seahorse”) is a major component of the brain of humans and other vertebrates. Humans and other mammals have two hippocampi, one in each side of the brain. The hippocampus is part of the limbic system, and plays important roles in the consolidation of information from short-term memory to long-term memory, and in spatial memory that enables navigation.

Hippocampus

The Visual cortex is powerful

The visual cortex is the strongest part of the brain for processing data. For example, videos, demonstrations, and diagrams are very effective teaching tools, and the world around us is full of visual stimuli that must be processed quickly and efficiently so that we can respond and orient ourselves in a complex environment. The only way to do this is with vision. A simple test: walk outside, close your eyes, and try to find your way around for a few hours. It is incredibly slow and error-prone.

Vision is much more powerful than other senses:

In the brain itself, neurons devoted to visual processing number in the hundreds of millions and take up about 30 percent of the cortex, as compared with 8 percent for touch and just 3 percent for hearing.

The Vision Thing: Mainly in the Brain, Discover Magazine

The UNIX command line and most programming languages are ineffective and slow tools, because they do not make use of the brain’s ability to process visual input quickly and in parallel, and to contextualize large amounts of detail into a cohesive idea. The visual system is the brain’s biggest consumer of cortex, with visual processing occupying almost a third of it. Humans evolved to process visual stimuli with massive parallel processing, so it is inefficient to have to sit and read one character, or even one word, at a time. This bottleneck is partly why GUIs were invented.

So, the next time you pop open your command line or a text editor, ask yourself, “do I want to use 3% of my brain or 30%?”.

Linux programmers are stupid

Because of the Linux community’s inability to grasp these basic psychological concepts, Linux will forever be crap. Linux programmers are so smart that they are in fact stupid. While they sit and worry about every detail of their software’s architecture, they ignore the brain’s architecture and its limitations and abilities.

Details

  • Operating System: Linux Mint 19.2 Cinnamon
  • Cinnamon Version: 4.2.4
  • Linux Kernel: 4.15.0-66-generic

28 thoughts on “Linux sucks, and there’s nothing you can do about it”

  1. Every problem in this post is entirely your fault. Carrying around a bootable SSD that you plug into multiple machines is a terrible idea, and if you’re trying to do something fancy, you shouldn’t rely on handholding-tier Linux distros. You very clearly don’t know what you’re doing (you can’t get wifi working on a Macbook Air 2013 — just install the broadcom drivers lmao) and you Google random details to look like an expert on a topic you don’t know anything about. Then you bring in some nonsense about the brain (completely irrelevant — more grandstanding) to claim GUI supremacy over a command line (because you’ve never found a decent use case for bash scripting).

    Stick to video games and schoolwork. Leave the discussion on operating system engineering to people who aren’t just tourists.

    1. OK, let’s put you back through school, since you have no idea what you’re saying: carrying around a bootable SSD to use on multiple machines is actually an excellent idea. This is the ideal use case for a portable storage medium, because it carries all of your decisions with you (your software choices, your file outputs, your exploration) and presents those files and that exploration in their original context: the OS you used to find them. There is a reason networking hardware has generic drivers: it is supposed to follow a standard set of operations that can get it to a point where it can download more optimized drivers for the system.

      I tried Mint based on a suggestion. I was considering Arch Linux but decided not to use it because it is rolling release, and my previous experience with it was less than favorable. The wiki was great, but their handling of pacman and system upgrades was miserable. I got sick of reinstalling when it broke my system’s state with its updates.

      In response to “You clearly don’t know what you’re doing”: you clearly don’t know what you’re saying. I have plenty of general-purpose computer science and sysadmin knowledge, and as much Linux knowledge as a seasoned datacenter administrator, if not more. The only problem is mapping it onto a distro’s specific implementation. It is a massive headache to memorize, or try to guess, which random standard is going to be broken by any given distribution. The SUS (Single Unix Specification) is a complete farce today. Nobody uses it, because it was unable to scale up with the needs of system designers. And because of that, we have a reincarnation of the Bible’s Tower of Babel: none of the specific knowledge is interchangeable or re-usable. And there are hundreds of distributions out there.

      In response to “Can’t get wifi working on a Macbook Air just install blah blah shit blah lmao”: no, you need WiFi on a device with no Ethernet in order to install a driver for WiFi. Cyclical dependency much? You’re suggesting I should probe the devices while booted on the MacBook Air to find the device model, then shut down, reboot on a different machine with hard-wired access to the internet, and look up the device’s driver kernel module name online, which could be wrong or outdated, or even undocumented because the devs assumed you knew their distro was forked from another one and expect you to magically know to go look in the other distro’s instructions. That suggestion has multiple points of failure and is garbage, because it is not a reproducible option: people who are learning Linux and have access to exactly one machine, like a younger version of me, would not have that option. The particulars of that distribution’s process for probing devices and researching which kernel module to install are not a procedure I want to memorize, as it will inevitably become outdated. Again, you and others who think like you don’t get it: memorization of text commands is a waste of time and an unsustainable practice, and therefore a dumb solution.

      Food for thought: the next time you pick up a rock and want to throw it, I want you to think what command you were supposed to give it. Computers are supposed to be indistinguishable from the objective world. They present an interface, just like that rock, however unsophisticated it is. If you want a nicer rock, you change its shape and structure to make it fit your human needs. With a computer, it is exactly the same idea. You should not have to memorize arbitrary strings to interface with a computer, because we live in the 21st century. The interface on a computer should be self-explanatory and easily acquired, like a language. Inside the brain, strings are declarative explicit details, not procedural implicit concepts. As a computer is inevitably going to have to be interfaced with by a human, we should cater to the limitations of human cognition and learning. So, a well-made interface makes it easy for your brain to extrapolate the details based on previous experience with similar details. For example, this is why your brain still recognizes the elements of a GUI or game after you apply nightmode or a skin.

      ex·trap·o·late
      verb

      • extend the application of (a method or conclusion, especially one based on statistics) to an unknown situation by assuming that existing trends will continue or similar methods will be applicable.
        “the results cannot be extrapolated to other patient groups”
      • estimate or conclude (something) by extrapolating.
        “attempts to extrapolate likely human cancers from laboratory studies”

      Mathematics

      • extend (a graph, curve, or range of values) by inferring unknown values from trends in the known data.
        “a set of extrapolated values”

      from the Oxford English Dictionary

      In summary, as a programmer, you are designing an interface for users (end-users or downstream programmers), so quit trying to avoid taking responsibility for the user experience, and don’t shunt the responsibility from the user’s procedural memory to their explicit declarative memory. It’s lazy, and irritating.

      Remedial learning: https://www.livescience.com/43713-memory.html

  2. Welcome to the Linux world.

    You have to accept that what you get will not be perfect, and contribute to fixing it.

    If you don’t do that, you’ll be considered a stupid Linux guy who does not fix other people’s problems… as you’ve just claimed! 😀

    1. But you are missing the point. You can’t fix something when everything you depend on drags you back down into the problems and complexity of the underlying platform’s design. In this case, the platform is the C language. We are continually forced to come face to face with C via the foreign function interfaces (FFIs) in languages that rely on system functions only provided in C. C has major problems: too much complexity, too many lines of code required, and a lack of intuitive coding patterns, which results in pernicious bugs that go undiscovered for decades. The unnatural practices required to develop in C also create technical debt down the road, when yet another human has to figure out what strange, unintuitive strategy was used in the code. Creating a new language doesn’t solve the problem; it multiplies it, because you are then required to know both C and the new language in order to make use of high-performance libraries and existing work done in C.

      We are stuck with C because it is popular. It is not popular because it is good. The problems with C permeate every other language built on top of an OS that uses it. In the case of Linux, that is because Linus Torvalds doesn’t like the idea of moving to a more digestible, expressive, and discoverable language or toolchain. We are constantly being dragged back down into the problems associated with C. It’s infectious. There is an economics concept that describes this perfectly: the network effect. It is the same effect that keeps Facebook in business. People use Facebook because other people use Facebook. If other people didn’t use Facebook, Facebook’s perceived value would disappear.

      It may be naive to assume that you as a programmer are going to contribute every line of code, but you can make it so that transitioning your expertise to another domain of knowledge is efficient, by choosing better systems as the base of a technology stack.

  3. Quite useless read. Life sucks. So what?
    You think this complaining will fix one thing?
    Have you considered contributing?
    How much did you pay to get your specific use case supported?

    1. Have I considered contributing? Yeah plenty of times, to plenty of projects. Do I have the patience to acclimate myself to their preferred combination of tools/languages and sit and decrypt their particular style of coding and navigate through all of the research tangents that it will inevitably send me on to memorize and validate their code? No. For free? Certainly not. There are easier ways to inflict pain on myself 😂 Here’s a thought: if the development process were simpler to navigate and become oriented to, people would be more likely to contribute for free to making it better and more navigable. It’s a positive feedback loop. Nice begets nice. Shit begets shit.

  4. So you are complaining that you do not know how to install Linux. Quite ignorant. I have installed Linux myriad times with different setups and never had problems like yours. If you don’t want these ‘issues’ (namely, to learn how the tools work), then stick to Mac or Windows; you pay for those systems to do everything for you. Linux is in large part a community effort; either you can appreciate this or go elsewhere.

    1. Actually, you’re misreading my blog post and obviously did not read it for understanding. Quite ignorant. I’ve installed Linux many times before this. Perhaps the word “bug” did not tip you off enough. Need I be more obvious?

      Like the other troll commenters, you are a coward hiding behind a fake email and name:
      meli7@wp.pl invalid. checked on https://email-checker.net/validate
      IP: 91.189.61.206 Your location: Warsaw, Poland https://www.speedguide.net/ip/91.189.61.206. More like Trolland

  5. That’s kind of weird that you’re complaining about Linux not working on the most closed-source platform in existence.

    1. I’m too stupid to use an installer and it’s all the developers’ fault.
      If you needed more control over the OS, instead of absolutely borking the company’s equipment with your ineptitude, why didn’t you request an administrative account?
      I genuinely hope they took notice of what you did and took the appropriate action of terminating your employment.

      1. Paul: There is this thing…called reading–which you clearly did not do; and you didn’t punctuate correctly, either. And you posted a reply to another person’s comment, when you clearly meant that comment for me, not the other person. You’re just an all-around idiot.

        I cannot request an administrative account. I clearly stated I am a student at a University. This error lies squarely on the programmers’ shoulders. They hard-coded a bad assumption into the installer, which installed it in a nonsensical default location, and they failed to give me the option to manually create an LVM-based configuration. This is why I do not trust Linux programmers: their obvious lack of concern for reasonable defaults.

        Case in point: NixOS attracts no community because of their ideological refusal to set reasonable defaults and create a facile environment. They make it harder to use the OS just so they can push their agenda of using FOSS software. It is an inappropriate place to push that agenda. I support FOSS but I don’t want my OS obscuring non-free drivers that are absolutely required for the OS to function properly. Users need non-free software. It is not within the user’s control to change the state of the industry. Forcing the user to spend inordinate amounts of time to circumvent red tape is very disrespectful, and only makes the user hate your community! Their policy is extremely egotistical, which is precisely why their ideas are not being adopted by the mainstream. They have some good ideas, but they’ve polluted the message with their bullshit behavior.

    2. OK there is an ambiguity in your sentence structure, but obviously you don’t know what platform I am writing this on.

      Yes Apple is closed-source. I’m not complaining about Linux not working on a specific closed platform. I’m complaining about the programming profession’s problem with keeping things simple for people to learn and acquire. Languages are acquired, just like natural languages. After you learn it, you think in it. I use the term discoverable alot, and none of these languages are discoverable. Linux just happens to be ground zero for this shitstorm. Linux also portrays itself as an escape path from the closed-source jungle of Windows and Apple, but it presents more complexity and uncertainty than it is worth, and its developers provide no reasonable way to manage that complexity.

