Recently I stumbled across Matthew Moore’s Mythbusting Linux video on YouTube. Normally I find his videos entertaining and informative; on this occasion, however, he got quite a few points wrong or simply misunderstood them. I would have commented below the video, but comments were disabled. This isn’t an attack on the video; rather, it’s an attempt to provide constructive and hopefully informative feedback on some of the issues raised.
I shall address the points in the same order that they appear in the video. Firstly the site that Matthew refers to, Why Linux Is Better, which he then goes on to debunk, is rather old. A quick look at the site gives a date of March 2003! Wow, that’s ancient history. However a lot of what it says still holds true.
Interestingly, if we look at Matthew’s example we see that Sophos has in fact detected malware in files named tekdefense.dll. These are MS-Windows files! What they are doing on his system is anyone’s guess. Perhaps that program that wouldn’t run isn’t as reputable as he thought.
I’ve blogged about this in the past. No computer system is immune from malware unless all the software on it comes from a reputable, verifiable source, the machine isn’t connected to a network, and users don’t have access to it. None of which is terribly useful. All computer systems are potentially vulnerable, some more so than others. Some points to consider:
- Programming APIs. These are the tools that programmers use to get the job done when writing software. On MS-Windows these APIs provide many flexible features. Too many, in fact, as some can be put to bad use; for example, injecting code into other processes and then running it. Yes, you can do the same under Linux (assuming it hasn’t been locked down), but it’s much harder to do. This Kool-Aid problem, as I call it, also extends to application and desktop facilities. For example, who thought it a good idea to allow Outlook to execute Basic script code in emails by default? Yes, it’s an old vulnerability, but it shouldn’t have happened in the first place. Linux doesn’t suffer from this anywhere near as much.
- Use of administrator accounts. Historically people always ran everything as Administrator under MS-Windows because of legacy and badly written software. Under Linux you are taught to run things as an ordinary user, and software is designed to support this; most distros won’t even allow you to run a desktop as root. Microsoft are doing their best to address this with the introduction of User Account Control (UAC).
- Currently there are very few viruses specifically targeting Linux, not least because it’s harder for them to spread proactively from one machine to another. Most of the Linux malware out there consists of bots, trojans, the odd rootkit and exploits. Of course Linux has its fair share of vulnerabilities, as does any operating system, so it’s always a good idea to keep up to date with patches. Remember that virus checkers only catch what’s already known about (whether by behavioural analysis or plain signatures).
- Invariably, malware detected under Linux is actually designed for MS-Windows and is picked up by virus checkers whose signature databases are predominantly for MS-Windows viruses.
So let’s see what Sophos have to say on the subject of Linux. If we have a look at this blog page, they raise a number of points:
- Linux Is Not Invulnerable Nor Virus Free – OK, Red Hat and Ubuntu have removed claims about being virus free. But do remember that Red Hat has been security accredited by the NSA, and the UK’s CESG rated Ubuntu as the most secure end-user OS. Vulnerability comes down to how promptly you apply patches: when Heartbleed and Shellshock became public, my systems were already patched against them. As for those two items of malware, is that the best they can come up with? Stuff that is years old. In reality there has been very little fuss over this malware, on account of its rather feeble infection and spreading techniques; indeed, it usually comes as an unintended extra in hacking and bot-net kits. However, no system is invulnerable. As for phishing emails and the like, that’s more about user education than anything else. When faced with a scam, not much will protect you other than your wits.
- Low Market Share – See my points above. Android, whilst based on Linux, is a very different ecosystem, so it’s a bit unfair to lump the two together.
- Cross Platform Malware – Agreed. See my points above. Odd that they left off Wine though.
- Official Software Repos – This is a bit misleading. Yes, one can add unofficial repos, but that is not a fault with the system, just a consequence of how one chooses to use it. Official repos are cryptographically signed, along with the hashes of all the packages they contain. Initiatives like Debian’s reproducible builds will further strengthen the integrity of official repos.
All in all a rather unalarmist summary from Sophos, more credit to them.
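As a concrete sketch of what that repo signing buys you on a Debian-based system (the paths below are the usual apt locations, but the exact file names on your machine will differ):

```shell
# The repo's InRelease index is clear-signed; apt rejects the whole repo
# if the signature doesn't verify against a trusted key.
gpg --verify /var/lib/apt/lists/*InRelease 2>&1 | head -n 3

# That signed index lists a SHA256 hash for every package, so any cached
# .deb can also be checked by hand against the published value:
sha256sum /var/cache/apt/archives/*.deb
```

Because every package hash is covered by the index signature, tampering with a mirror or a download can only produce a hash mismatch, which apt refuses.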
So, at the moment, if you keep up to date with security patches and use plugins like NoScript to stop internet sites having unfettered access to your web browser, there is little to no credible virus threat aimed specifically at Linux. However, this will change. For now, good systems administration practice and common sense are an effective defence against the threats.
Should you run a virus checker? It depends. Personally, like Matthew, I also run Sophos on Linux. Not because I’m worried about getting infected with some Linux malware, but because I don’t want to infect MS-Windows machines with viruses that may be residing harmlessly on some Linux storage somewhere. It’s about being responsible and thoughtful towards MS-Windows users. Sophos supports automatic updates, cloud-based protection techniques and on-access scanning (all of which can be changed or switched off if need be).
Part of Matthew’s point seems rather confused here. At one point he asks why these programs exist if you don’t need to defragment Linux file systems, and then goes on to explain how they report fragmentation. filefrag and e2freefrag can report on the state of fragmentation of files and of the free space on a file system respectively, but they can’t do anything to change it.
He is correct about e4defrag. Ext4 file systems differ from their predecessors in that they support a mechanism called extents. These let the file system manage its housekeeping data more efficiently by storing just the starting block number and length of each contiguous run of disk blocks belonging to a file, instead of indexing each block individually or in small clusters. Thus one needs to keep most files unfragmented in order to maintain the efficiency gains, and this is why Ext4 has a defragmentation tool.
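For the curious, the reporting side is easy to try yourself. A sketch, with example paths and a device name you would substitute for your own (e2freefrag and e4defrag need root):

```shell
# How many extents does a single file occupy?
filefrag -v /var/log/syslog

# How fragmented is the free space on an ext4 file system? (read-only report)
sudo e2freefrag /dev/sda1

# e4defrag's -c flag only *checks*: it prints a fragmentation score, so you
# can see whether defragmenting would achieve anything before doing it.
sudo e4defrag -c /home
```

In practice the scores on a healthy, not-too-full file system are usually so low that running the actual defragmentation pass gains nothing.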
Obviously bad fragmentation can occur, especially when a file system gets full, but generally Linux will do its best to keep fragmentation to a minimum. In the twenty years that I’ve been using Linux I have never had the need, nor the inclination, to defragment any Linux file system, Ext4 included. They have never needed it, whether on machines at home or on the large servers at work. So as for saying that you need to defragment your file servers once a month, that just isn’t the case.
Of course, when you patch a system with updates the chances are you will have to reboot; that goes for any system. However, unlike MS-Windows, which requires several reboots for a batch of updates, and quite often for software installations too, Linux needs only one reboot after updating and hardly ever needs one after a software installation.
The real point being made here is about stability, not that you will never have to reboot when doing an update. If Linux servers are left to their own devices they will happily keep on running for years without needing a reboot. Obviously, for the sake of security, periodically you should schedule an outage and apply updates. This is certainly not the same thing as a system crashing or having to reboot it to free up system resources.
Some core Linux systems at work haven’t been rebooted in years. In fact the hardware and power supplies that they run on are less reliable than the OS. I hasten to add that these systems aren’t connected to the internet though, so patching isn’t such a necessity.
Also, Linux updates come as one set via a single system update mechanism. On MS-Windows, each third-party application has its own installer and update mechanism. Yuck.
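On a Debian-based distro, for example, the whole lot is two commands, and the system even records whether any of it needs a restart:

```shell
sudo apt update          # refresh the package indexes for everything installed
sudo apt full-upgrade    # upgrade OS, libraries and applications in one pass

# Debian/Ubuntu create this flag file only when an update requires a restart:
if [ -f /var/run/reboot-required ]; then
    echo "reboot needed"
else
    echo "no reboot needed"
fi
```

One pass covers the kernel, system libraries and every packaged application alike, which is precisely the contrast with per-application updaters being made above.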
Linux Won’t Boot When Switching Disks
Err, yes it does! But you need to use a generic kernel configuration.
Firstly, if you’re switching disks, presumably you’re moving the entire disk, with the boot loader and operating system together on the one disk? If so, then the UUIDs referenced in the Grub configuration will match those of the file systems that you wish to boot from. If you’re mixing and matching different boot loader configurations with different file systems, then you’re going to have a fun time of it.
Incidentally, using UUID file system identifiers is simply the default option. You can turn that feature off and use plain /dev/sda1 partition designators instead; the choice is yours. The only time you actually need UUID designators is when your system maps disk drives to different device names on each boot (the Raspberry Pi, for instance, can suffer from this if you’re using external USB disks).
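To see the two forms side by side (the UUID below is made up for illustration; blkid shows the real ones on your system):

```shell
# blkid shows the UUID the installer recorded for each partition
sudo blkid /dev/sda1

# These two fstab lines mean the same thing, provided the disk always
# appears as sda; only the UUID form survives a device-name reshuffle:
#   UUID=0a1b2c3d-4e5f-6789-abcd-ef0123456789  /  ext4  errors=remount-ro  0  1
#   /dev/sda1                                  /  ext4  errors=remount-ro  0  1
```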
Whilst I don’t go swapping disks around from system to system, I have backed up one system and restored its image onto a completely different machine with different hardware. It worked fine. Admittedly, because I use kernels with just the drivers needed to boot the system, on first boot I had to use Ubuntu’s rescue disk and update the kernel’s bootstrap drivers with update-initramfs -u, after which it booted normally. About the only similarity between the laptop and desktop systems in question was their 64-bit processors.
Looking at Matthew’s video, I’d say the most likely reason for the boot failure was that his kernel’s initrd.img file didn’t contain the necessary drivers to access the disk itself (laptop and desktop chipsets are often subtly different, even when they look similar). Remember that Grub loads the initrd.img files independently and doesn’t rely on any kernel drivers to access the disk.
You fix this issue, as described above, with the update-initramfs -u command. However, if you’re going to move the disk from system to system, then it’s worth rebuilding the initrd.img file with most of the available drivers rather than just those needed for one particular machine. The downside is that the kernel will take slightly longer to boot.
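On Debian-style systems, that “most drivers” rebuild is a one-line change to initramfs-tools’ configuration. A sketch, assuming your config currently has the hardware-specific MODULES=dep setting:

```shell
# /etc/initramfs-tools/initramfs.conf controls what goes into initrd.img:
#   MODULES=dep    only the modules this machine needs (small, fast, not portable)
#   MODULES=most   most available modules (portable across different hardware)
sudo sed -i 's/^MODULES=dep/MODULES=most/' /etc/initramfs-tools/initramfs.conf
sudo update-initramfs -u    # rebuild initrd.img with the new setting
```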
When installing Linux, some installers give you the option of a generic kernel or one tailored to your specific hardware. This is the choice the video was getting at.
Not only have I transferred systems from laptop to desktop, but also from inside a VM to real hardware and back again, with no issues beyond running update-grub and sometimes grub-install (depending upon how I transferred the boot loader).
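For anyone wanting to repeat the trick, the usual recipe from a live or rescue environment looks something like this (the device names are examples; substitute those of your transplanted disk):

```shell
# Mount the transplanted root, then give the chroot the kernel interfaces
# that the grub tools expect to find
sudo mount /dev/sda1 /mnt
for d in dev proc sys; do sudo mount --bind /$d /mnt/$d; done

sudo chroot /mnt update-grub            # regenerate grub.cfg with the new UUIDs
sudo chroot /mnt grub-install /dev/sda  # only needed if the boot loader moved too
```

grub-install is only required when the boot loader itself has to be rewritten to the new disk; if you moved the whole disk intact, update-grub alone usually suffices.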
Anyway I hope this has helped to shed some light on the issues raised in the video.