Linux vs Windows

Disclaimer

  • This article gives my personal opinion on the Windows and Linux operating systems. The information in it is based on my own experience and on how I understand things: I grew up with DOS and Windows, but I use Linux systems in my job as a system administrator at the CS department of our university
  • This text is not meant as a complete comparison of Windows and Linux; it describes what I think is done better in Windows than in Linux
  • If you’re reading this and think “Come on, please wake up!”, that’s fine, but please let me know exactly where I went wrong
  • I hope I didn’t state anything untrue, but if I did, let me know
  • When referring to Linux in this article I usually mean GNU/Linux, or Unix systems in general

Introduction

Linux is often looked at as if it’s the future: it’s stable, scalable and cheap. Besides that, it’s also a simple OS. There are basically two things: files and processes. You can look at almost everything in your system as a file, and if it’s not a file, it’s a process. Tape drives, your mouse: they can all be accessed as if they were files. Windows is much more complex: there are drivers, files, directories, services, a registry and who knows what else. The obvious question is: which kind of system is better? There are pros and cons for both. The Linux/Unix way is obviously simpler, but the Windows way standardizes the system better.

Services

In Linux, the convention is to put all scripts that start and stop daemons (programs that run in the background) in /etc/init.d, but they could equally well be put anywhere else on the system; you have to keep this convention intact yourself. In Windows there are services: if you want to use a service, you (or, usually, the setup program) have to register it with the system, after which you can manage it from the Services management console.
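
To show how thin the init.d convention really is, here is a minimal sketch in Python of what such a control script boils down to: a plain script that happens to understand “start” and “stop”. The daemon binary and pid file path are hypothetical; a real init script is normally a shell script with much more error handling.

    #!/usr/bin/env python3
    # Minimal sketch of an init.d-style control script (hypothetical daemon).
    import os, signal, subprocess, sys

    DAEMON = "/usr/sbin/exampled"        # hypothetical daemon binary
    PIDFILE = "/var/run/exampled.pid"    # conventional place for its pid file

    def start():
        proc = subprocess.Popen([DAEMON])    # launch the daemon
        with open(PIDFILE, "w") as f:
            f.write(str(proc.pid))           # remember its process id

    def stop():
        with open(PIDFILE) as f:
            pid = int(f.read())
        os.kill(pid, signal.SIGTERM)         # politely ask it to exit
        os.remove(PIDFILE)

    if __name__ == "__main__":
        action = sys.argv[1] if len(sys.argv) > 1 else ""
        {"start": start, "stop": stop}.get(action, lambda: sys.exit("usage: start|stop"))()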

Configuration

Configuration files? On Linux systems they can be put anywhere, but usually they reside in /etc. They are plain text files, each with a different syntax, which is very inconsistent. In Windows there’s the registry, the central place to store all your application settings in a structured way. Usually those settings are exposed through some GUI (graphical user interface) to make editing easier. Windows also has the MMC, the Microsoft Management Console, a uniform way to edit system settings. Any application developer can write so-called “snap-ins”, which provide an easy and consistent UI for managing everything from services, disks and users to SQL databases and more. In newer technologies such as ASP.NET, configuration can also be stored in XML configuration files. This makes application deployment easier (it’s no fun to have a sysadmin run a setup program every time you change your web application).
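
For contrast with parsing ad-hoc text files, here is a small sketch of reading one setting from the registry using Python’s standard winreg module; the key path and value name are made-up examples, not the settings of any real application.

    # Sketch: reading one setting from the Windows registry (hypothetical key).
    import winreg

    key = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, r"SOFTWARE\ExampleApp")
    value, value_type = winreg.QueryValueEx(key, "DataDirectory")
    winreg.CloseKey(key)
    print("DataDirectory =", value)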

Scripting

Linux offers numerous great scripting languages: bash, Perl, Python, you name it. They can be very valuable for making it easier to create users, add websites, back up data and so on. But in my experience this is still quite primitive: to edit configuration files from a script you have to know their format, apply regular expressions to extract the right information from them, and so on. This is quite error prone. Windows offers the Windows Scripting Host. Applications can expose parts of themselves as COM objects (COM is, sort of, the predecessor of .NET). Using WSH you can gain access to user accounts, IIS, Word and any other application that exposes COM objects. This means you can simply instantiate, for example, an IIS component and use it as if it were a regular library. By default scripts can be written in VBScript or JScript, but I believe you can also use Perl and some other languages.
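
As a sketch of the fragile, regex-based editing described above, the following Python snippet bumps a setting in a hypothetical /etc/exampled.conf. It only works as long as the file’s layout matches what the regular expression assumes, which is exactly the problem.

    # Sketch: editing a plain-text config file with a regular expression.
    import re

    path = "/etc/exampled.conf"          # hypothetical configuration file
    with open(path) as f:
        text = f.read()

    # Assumes lines of the form "MaxClients 150"; any other layout is missed.
    new_text, count = re.subn(r"^(MaxClients\s+)\d+", r"\g<1>256", text, flags=re.M)
    if count == 0:
        raise SystemExit("MaxClients not found -- is the format what we assumed?")

    with open(path, "w") as f:
        f.write(new_text)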

Installing/uninstalling software

In Linux, each distribution has its own package manager; most use RPM, but others use DEB or TGZ. If you’re lucky, the software you’re looking for is distributed in your package manager’s format; if not, you’ll have to compile it yourself. If you’re lucky, all goes well; if you’re not, you’ll probably have to find out why it didn’t compile: are all libraries in place, or do you have to symlink some libraries to get it to compile? When you install software, pieces of it go everywhere: configuration files go to /etc, binaries to /usr/bin, data files to /var. This makes uninstalling software pretty hard; if you didn’t install it through a package manager that tracked where each file went, you have to find the files yourself. In Windows, software is usually distributed as a self-extracting setup executable. Just run it, choose your installation options and you’re set. Uninstalling software is done through Add/Remove Programs in the Control Panel.
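
When software is installed through a package manager, the manager does track where every file went. As a small sketch, this asks RPM (via Python) to list the files belonging to an installed package; the package name is just an example.

    # Sketch: asking RPM which files belong to an installed package.
    import subprocess

    result = subprocess.run(["rpm", "-ql", "openssh-server"],
                            capture_output=True, text=True)
    for path in result.stdout.splitlines():
        print(path)    # e.g. /etc/ssh/sshd_config, /usr/sbin/sshd, ...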

File system

In Linux you have many choices of filesystem: there’s ext2, ext3, ReiserFS, JFS, XFS and many more. Ext2 and ext3 (which is just ext2 with journaling) are the most used. In Linux I have always liked the single-root idea: everything starts at / and different disks and devices are mounted somewhere underneath it. In Windows there are effectively only two filesystems to choose from, FAT32 and NTFS, of which the latter is preferred. Windows uses multiple roots, usually one root per device. There’s no real drawback to this, nor does it have many advantages that I’m aware of. Only recently I found out that it’s actually possible to mount a drive into another NTFS drive as a directory, i.e. I could mount my D: drive (which contains media) at C:\media, so it would look as if D: were a subdirectory of C:. This way you can emulate the single root Linux has, which may be useful from time to time.
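
A small read-only sketch of the single-root idea: on Linux, every mounted device appears somewhere under /, and /proc/mounts lists where.

    # Sketch: listing which device is mounted where under the single root.
    with open("/proc/mounts") as f:
        for line in f:
            device, mountpoint, fstype = line.split()[:3]
            print(f"{device:<25} mounted at {mountpoint:<20} ({fstype})")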

Linux offers hard links and symbolic links. Windows NTFS only offers hard links, and as far as I know they are barely used. Both Linux and Windows use users and groups for permission management. Linux has one root account which has permission to do anything; Windows has an Administrators group, and all users in this group can do anything. On both systems, permissions can be set per file or directory. Windows offers a bit more flexibility in this area: it allows you to set permissions for several different groups and users on a single file or directory. Files and directories in Linux belong to one user and one group, and you can only set permissions for that user, that group and the rest of the world. In Windows, permissions on directories are automatically inherited by child files and directories unless a child explicitly opts out of inheriting them; in Linux, if you want to give a whole directory and everything within it different permissions, you have to change them recursively yourself.
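
A sketch of that recursive change: since Linux permissions are not inherited automatically, giving a whole tree new permissions means walking it yourself (roughly what chmod -R does from the shell). The directory and modes below are just examples.

    # Sketch: recursively setting permissions on a hypothetical directory tree.
    import os

    root = "/srv/www/example"                              # hypothetical tree
    for dirpath, dirnames, filenames in os.walk(root):
        os.chmod(dirpath, 0o755)                           # rwxr-xr-x for dirs
        for name in filenames:
            os.chmod(os.path.join(dirpath, name), 0o644)   # rw-r--r-- for files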

Performance monitoring

In Linux it’s very easy to see how loaded the server is and which processes cause that load, and it’s also pretty easy to kill those processes. In Windows that’s pretty easy too. Windows, however, goes a bit further: in the MMC it offers the Performance Monitor. You can set up counters you’re interested in (memory usage, disk I/O, queries per second for SQL Server) and it will show them as graphs or reports, which can be very useful.
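
On Linux that load information is exposed very directly: the load averages, for example, are just a text file under /proc, as this tiny sketch shows.

    # Sketch: reading the system load averages straight from /proc/loadavg.
    with open("/proc/loadavg") as f:
        one, five, fifteen = f.read().split()[:3]
    print(f"load average: {one} (1 min), {five} (5 min), {fifteen} (15 min)")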

Graphical User Interfaces

To me, GUIs are the next step in computer evolution. Computers started with just a lot of switches; then, when terminals became available, keyboards and screens were introduced. Commands were given by typing in cryptic instructions (rename blafile.txt differentfile.txt). Then came graphical user interfaces, which made life so much easier. I’m still not sure whether GUIs are always a better solution than typing in commands. For novice users they are, of course, but I can imagine there are system administration tasks that are better accomplished through a console. In Linux, however, you must know how to work from the console; if you can’t, you’re practically dead. Although there is a graphical windowing system available for Linux, X, most distributions only let you change part of the available settings through GUIs; the rest has to be done either by editing text files or by running commands. But undoubtedly this will improve over time.

Speaking of X: it might be a nice system, and having all those window managers and desktop environments such as GNOME, KDE, WindowMaker and BlackBox surely emphasizes the freedom of choice, but it causes applications developed for one specific environment to look really bad in the others. If you run KDE and start GNOME software such as GAIM or the Gimp, you’ll notice that the menus look different, the buttons look different, the scroll bars look different. This makes Linux look pretty bad on the desktop, unless you banish all software not specifically written for your environment, which isn’t feasible in most cases. In Windows everybody uses the same interface and every application looks about the same.

Freedom of choice

Basically, Linux is just a kernel, the bare minimum your system needs to run. To use it successfully you need a bunch of tools, and because Linux does not dictate a file structure or which tools you should or shouldn’t use, you can use whatever you want. Isn’t that great? Personally, I say no, it isn’t. Most people don’t want to choose between 20 Linux distributions, 50 e-mail clients, 10 browsers, 6 e-mail servers, 2 boot managers and 20 window managers. If I learn how to use one Linux system, there is no guarantee whatsoever that I can apply what I’ve learned to another PC running Linux. Sure, most Linux distributions are roughly the same, but the way configuration files and package management are handled differs quite a bit between, for example, Red Hat, SuSE and Debian. Because there is practically just one Windows distribution these days (apart from the different older and newer versions), they all work the same. And because there are no policies on how UIs should be organized and how the screen should look, many applications look very different from one another: the Gimp, for example, uses separate windows for all its components, and Blender3D has quite an alternative UI too. Combine that with the different ways controls look in different applications and you’re sure to make novice computer users run away screaming.

Security

Security, particularly in Windows, has been a big issue since the growth of the internet. Security, however, is one of the most important things for Microsoft at the moment (or so they say): if you look at Windows Server 2003, it’s much more secure (everything is disabled by default) and has far fewer security bugs than its predecessors. But it still takes some effort to secure Windows. Linux isn’t always safe either: only recently a bug was found in the 2.4 kernel that allowed an attacker to gain root access to the server. When a security bug is found in Windows it’s big news; if the same happens to Linux you hear nothing about it unless it’s very bad.

Many problems in Windows are caused by the fact that most users log into Windows under the Administrator account, which means they have access to all files and can change everything. That also means that if a virus arrives in your mailbox, it has access to all files and can change everything. In Linux the policy is never to log in to the root account; you only briefly switch to the root user when you have to do administrative tasks such as installing software or editing settings. Usually you’re logged in under a normal user account which has only very limited access to important files. This is of course much more secure, and if Windows users did the same, it would solve a lot of problems.

Windows as a server

In the server market, and the web server market in particular, Windows is still a relatively small player. Recently, Rackshack has begun to offer Windows dedicated servers too, even starting at lower prices than its Linux machines. Personally I see this as a good sign for Windows: people have faith in running Windows as a web server.

I downloaded the Windows Server 2003 Evaluation CD, which you can test for free for 6 months, and installed it on my laptop. After installation it doesn’t do much: very little software is installed, no web server, no terminal services, no FTP server, no file server. To use the server for those things you have to assign roles to it, such as “Application server”, which will then install IIS 6. I think this is a good approach: servers should not have things installed that you don’t need and that could potentially open security holes in the system.

An issue with Windows Server is its licensing. There are four editions: Web Edition, Standard Edition, Enterprise Edition and Datacenter Edition. The Web Edition is the cheapest, at about €390, but it may only be used for serving websites and other web-related services such as e-mail and FTP. The license forbids you to run, for example, IRC servers or Quake servers, and you can’t install SQL Server on it (other database servers are allowed). If you want to do all that, you have to buy the Standard Edition, which I’ve seen for around €670. On the Standard and higher editions you can run whatever you want, but you have to deal with CALs (Client Access Licenses): only a certain number of people may be connected to certain services on the machine (such as Terminal Services, used to log in to the machine remotely). I believe the cheapest version comes with 5 CALs, so 5 users can connect simultaneously (this does not include web traffic, by the way, thank god). If you want to let more people connect, you have to buy additional CALs. In a web server scenario I don’t think this is a problem, but it’s something you have to think about. You don’t have these issues with Linux: there you can install whatever you want and connect as many clients as the server can handle.

Is there hope for Linux?

Can Linux improve over time? Can things become more consistent, can applications be made to look the same, will there be one UI library that everybody uses? Theoretically that’s possible, but in practice it’s quite unlikely. Linux distributions that chose to offer less choice in software, as a first step towards consistency, were yelled at. Linux is all about freedom of choice; everybody should be able to choose whatever he or she wants.

And the configuration files? Do you really think you can rewrite all software, or persuade all its authors to rewrite it, so that it uses the same syntax (XML, for example)? I highly doubt it. Because of its distributed development, it’s very hard to keep consistency in how similar problems are solved. The only way I can see is to centralise development in one single company, but that is exactly what the Linux people wanted to get away from in the first place. It is also very hard to introduce new technologies. Take the .NET CLR, for example: because Microsoft thinks it’s the way to go, you can be sure that many applications on Windows will be using or written in managed code. Mono is now porting .NET to Linux; do you think it will make a huge impact there? Do you think Mono will be supported from the core of the OS? I’m sure it won’t be; nothing like that happened with Java. You can argue that this is because both are not open source technologies, but I highly doubt it would have worked if they were. If a new file system or way of storing data were developed for Linux, like WinFS for Windows, how long do you think it would take to have it accepted and supported throughout the system, if ever?

So Linux bad, Windows good, right?

No, Windows is not always the best solution. If you want to use an old PC as a router, you’re probably better off with a one-disk Linux distribution that does just that. In many cases Linux performance is better too, and there are probably many other scenarios where Windows is not a good choice. On a huge server cluster with dozens of processors you probably don’t want to run Windows because of its licensing costs; Linux doesn’t have any of that.

A problem with Windows is that it’s closed source: if you find a problem in the OS you can only rely on documentation (from TechNet, for example); you can’t dive into the code to see what goes wrong. It’s a black box. You don’t have this problem in Linux, and this is seen as a big advantage of Linux. You could wonder how many Linux users would actually ever read the source code, or could even understand it, but anyway, it’s possible.

A very good thing about Linux is that it competes with Windows, something Windows users benefit from too. Windows 2000 Server costs (or used to cost) over €1000; the comparable edition of Windows Server 2003 is around €670. Prices are dropping, which is a good thing.