2.2. Introduction to Linux#

A Linux distribution, such as Ubuntu, provides a complete OS. Operating systems are the system software that manage the way applications and users interact with the computer hardware. In large part, this means they provide the abstractions necessary for applications to run smoothly: basic things, such as making a file available under a name (instead of as a block of data starting at sector number 187,374), but also running multiple applications at the same time on a single processor, or even serving multiple users without them hindering each other.

Probably the operating system you are most familiar with is some flavor of Microsoft Windows, but many more exist. While current versions of Windows are very capable as a desktop OS, many believe it is lacking in other areas, particularly for use in embedded systems such as robots. As you can imagine, an OS for word processing and playing games has rather different requirements than one for running a robot, which often doesn’t even have a display!

For robotics, Linux is the most popular OS, but it didn’t arise from nothing. Its origins date back more than forty years to the Unix operating system, and even now many things that you will learn in this course can be applied to other members of the (rather large) Unix family.

[Figure: Tux.svg]

Fig. 2.1 The official Linux mascot is the penguin Tux.#

Image credits: Larry Ewing (lewing@isc.tamu.edu) and The GIMP, CC0, via Wikimedia Commons (source).

2.2.1. Brief history#

In the 1960s, computers couldn’t really talk to each other. Even computers by the same manufacturer were often incompatible. Furthermore, the operating systems that ran on these machines often only did limited things, such as reading and writing paper punch cards. All programming had to be done in machine language, which is a tedious job.

In 1969, a few of the researchers at AT&T Bell Labs in the US (Dennis Ritchie, Ken Thompson and some others) created an operating system for the DEC PDP-7, a cheap computer (at the time) with just 4 kilobytes of memory[1]. What they put together was a very simple basis for an operating system[2]: little more than a shell for typing in commands, a mechanism for starting and stopping programs, and a mechanism for the programs to talk to the shell. The researchers called their toy operating system Unix as a pun on MULTICS, a predecessor that was too big and complex to run on the PDP-7.

In 1973, Dennis Ritchie developed a new computer programming language called C (as a successor of B), later popularized by the book he wrote with fellow researcher Brian Kernighan. Unix was rewritten in C, and after that it really took off. Before C, if you wanted to run Unix on a new computer, you had to completely re-program it in the machine language of that computer. Now, all you had to do was port the C compiler to the new machine, re-compile Unix, and you had a working operating system. This property is called portability.

As Unix became more popular, researchers at universities around the world started using it. Unix was distributed in source form, the original C code. This meant that the researchers could repair bugs[3], change programs or even add functions to the operating system themselves. They would send back these changes, and over the years Unix quickly grew to contain a large set of commands. One especially important branch of Unix was made at the University of California, Berkeley; they put in the important TCP/IP networking code, the protocol suite computers use to talk to each other on the Internet. This branch is still alive today in the FreeBSD, NetBSD and OpenBSD operating systems. The networking code even ended up in Windows 2000.

While the code of Unix was available to its users, it still required an expensive license. In 1983, Richard Stallman, a computer scientist at MIT, announced the GNU project (for “GNU’s Not Unix”), which would provide a complete reimplementation of Unix as free software. Although the project created a C compiler, text editor and many of the required utilities and libraries, it lacked a stable kernel, the part of the OS that manages how different programs work together. This prompted the Finnish computer science student Linus Torvalds to create Linux in 1991: a free operating system kernel, originally written for the popular Intel 386 processor and built using the GNU C compiler and utilities. Many people contributed, and the combined GNU/Linux[4] OS is now the most popular free operating system available, coming in a wealth of different versions (called distributions).

2.2.2. Free Software#

Linux started as a free alternative to Unix. The Free Software Foundation (FSF, founded by Richard Stallman) defines Free Software as follows:

Free software is a matter of the users’ freedom to run, copy, distribute, study, change and improve the software. More precisely, it means that the program’s users have the four essential freedoms:

  • The freedom to run the program, for any purpose (freedom 0).

  • The freedom to study how the program works, and change it to make it do what you wish (freedom 1). Access to the source code is a precondition for this.

  • The freedom to redistribute copies so you can help your neighbor (freedom 2).

  • The freedom to distribute copies of your modified versions to others (freedom 3). By doing this you can give the whole community a chance to benefit from your changes. Access to the source code is a precondition for this.

This kind of freedom is often called free-as-in-speech (where free refers to your rights as an individual), in contrast to free-as-in-beer (where free just means it costs you no money, as when someone offers you a drink), because the user is free to use the software however they want.

Ed Felten, a professor at Princeton, effectively coined the phrase for it: the Freedom to Tinker. While many people simply like the fact that they don’t have to pay for their software, the FSF sees the use of free software as

… to make a political and ethical choice asserting the right to learn, and share what we learn with others.

Essentially, this makes free software a lot like academic research: building on the work of others to create new innovations. Bigger free software projects, such as the Linux kernel, can have many thousands of contributors working together, often in a “Bazaar” style of loosely organized chaos[5].

2.2.3. Architecture#

As hinted at before, an OS is not a single piece of software. The kernel is the central part, determining when each program or other system process can run, and making sure they can communicate while not corrupting each other’s memory. Drivers talk to the hardware, such as a keyboard, display or hard drive. They can be part of the kernel (which is then called monolithic), or be separate system processes, in which case the kernel is a microkernel[6]. System utilities are the programs necessary for the most basic functionalities, such as a shell, commands to move about the file system, create new files, compile programs, etc.
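If you are curious, you can already see these pieces on your own machine. Below is a small sketch, assuming an Ubuntu (or other Linux) system with a terminal open; the commands themselves are ordinary system utilities:

    uname -sr     # print the name and release of the running kernel
    lsmod         # list the driver modules currently loaded into the kernel
    ls /usr/bin   # list (many of) the installed system utilities and other programs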

Most Linux distributions also provide a Graphical User Interface (GUI), which itself consists of several subsystems, such as a windowing system and a window manager. A windowing system allows programs to draw in rectangular regions on the screen (or screens, if you have multiple monitors). A classic windowing system that has been around for decades is the X Window System, or X for short, though in recent years Ubuntu has switched to a more modern windowing system called Wayland. The windowing system only provides drawing and pointing facilities, but does not specify how things such as window borders should be drawn. That is the task of the window manager, of which there are many, some of which are even designed especially to be used without a mouse![7] A desktop environment then integrates a window manager with a suite of applications that share a common interface, such as a calculator, text editor, email client, etc.
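On a recent Ubuntu desktop you can check which of these subsystems your own session uses. The sketch below assumes the session sets the usual XDG environment variables, which is typical for Ubuntu but not guaranteed on every distribution:

    echo $XDG_SESSION_TYPE      # prints "wayland" or "x11": the windowing system in use
    echo $XDG_CURRENT_DESKTOP   # prints the desktop environment, e.g. "ubuntu:GNOME"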

2.2.4. The Unix philosophy#

As the description of the GUI exemplifies, Linux is actually a combination of many different programs working together. This is not specific to the GUI, but permeates many Linux systems, and originates from the early minimalist design concepts of Unix.

The Unix philosophy[8] captures these design principles:

  1. Write programs that do one thing and do it well.

  2. Write programs to work together.

  3. Write programs to handle text streams, because that is a universal interface.

The idea is that each program acts as a tool, and the user doesn’t have to understand how it works internally (it can be considered a “black box”), as long as the behaviour of the tool is as straightforward and transparent as possible. This gives rise to small modular programs, each doing a well-defined task, which can be combined in many different ways to produce more complex behavior.

The Unix philosophy will become especially clear once we explore the shell, a non-graphical, text-based interface which is at the core of all Unix and Linux systems. The shell interface and most Linux system utilities handle text-based input and output, in line with the last principle of the philosophy.
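As a small preview of what this looks like in practice, here is a sketch of two shell pipelines. The | symbol connects the text output of one program to the input of the next; book.txt is a hypothetical plain-text file, and the individual commands are not important yet:

    # Count how many programs live in /usr/bin by combining two simple tools:
    ls /usr/bin | wc -l

    # Print the ten most frequently used words in book.txt (hypothetical file):
    # tr splits the text into one lower-case word per line, sort and uniq -c
    # count how often each word occurs, and sort -rn | head keeps the top ten.
    tr -cs 'A-Za-z' '\n' < book.txt | tr 'A-Z' 'a-z' | sort | uniq -c | sort -rn | head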

2.2.5. Distributions#

Linux runs on many different hardware platforms, from dishwashers to DVD players, mobile phones[9], webservers and PCs.

Each platform has its own requirements, and even PC users all have different wants and needs. Since Linux is free open source software, volunteers and companies alike started creating their own selections of the kernel, system utilities, desktop environment, default programs, optimizations, etc., specific to their needs. Such a complete configuration of a Linux system, provided as a complete OS, is called a Linux distribution, and nowadays many distributions are available.

Popular distributions include:

  • Ubuntu, which stresses the desktop user experience.

  • Debian, which stresses stability and is popular for servers.

  • Red Hat Enterprise Linux, a commercial variant that offers service and support.

  • Elementary OS, which tries to emulate the macOS desktop design.

  • Slackware, for advanced users and those who want to learn the inner workings rather than just the graphical veneer.

  • Buildroot, for embedded systems where memory space is limited.

Often, distributions are based on other distributions, building upon them by adding specific features, different optimizations, other default programs, and/or by providing commercial support. For example, Ubuntu is based on the Debian distribution, but is itself the basis for other distributions such as Elementary OS. In practice, this means that a lot of online help for Debian and other Debian-based distributions also applies to Ubuntu, which can be useful to know when searching online for support and information.

At the TU Delft Cognitive Robotics department (CoR), most systems run Ubuntu because of its user-friendliness, good community support and advanced package management (which it shares with Debian, on which it is based). Where this manual is distribution-specific, it will therefore be based on Ubuntu.
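As a small taste of that package management, installing extra software on Ubuntu usually boils down to a few apt commands in the shell. This is a sketch only: the tree package is just an arbitrary example, and apt will ask for confirmation and your password along the way:

    sudo apt update        # refresh the list of packages known to the system
    apt search tree        # search the package lists by name and description
    sudo apt install tree  # download and install the package plus any dependencies
    tree --version         # the newly installed program is now available in the shell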