The operating system is the medium through which user-level applications interface with the hardware of a computer. This section looks at the differences between the Windows OS and Ubuntu (a Linux-based OS).
Essential Elements of Operating Systems
An operating system is judged on its convenience (its ease of use), its efficiency
(how well it exploits the available hardware resources) and its ability to
evolve (its adaptability to frequently changing hardware technology).
An OS’s ability to evolve is arguably its most important feature. For an OS to be
upgraded regularly without being reprogrammed completely, and without changes being
made where no change is required, it must be:
- Modular, with clearly defined interfaces between the modules
- Well Documented
The need for detailed documentation arises from the fact that most stable OSs have tens of millions of lines of code, with many programmers working on them. Without proper documentation it becomes extremely difficult to make changes and to debug unforeseen problems in an OS.

Both Windows and Linux have a modular design, in which parts of the system core reside in independent files, called modules, that can be added to the system at runtime. Ubuntu has a monolithic kernel (Linux), where all services (virtual file systems, device drivers, etc.) and core functionalities (memory allocation, scheduling, etc.) share the same space; in Linux’s modular design, the modules are therefore inserted into, and run from, the same space that handles the core functionalities (i.e. kernel space). Windows, on the other hand, derives from a microkernel-influenced architecture in which core functionality is separated from system services and device drivers; in such an architecture, each service runs as an isolated process.
The user convenience provided by an OS depends on:
- Program development services (e.g. editors and debuggers)
- Program execution (fetching instructions and data from memory, carrying out the instructions in the scheduled order, etc.)
- Access to I/O devices (recognising the type of I/O device, running the appropriate device driver, etc.)
- Controlled access to files (the OS must provide access to files while accounting for both the type of I/O device the data is stored on and the structure of the data)
- System access (access to the system must be secured, and conflicts over resource contention must be resolved by the OS, especially in shared systems)
- Error detection (the different kinds of errors occurring in the system must be detected and resolved by the OS with appropriate priority)
- Accounting (the OS must record system performance statistics, which can be used to determine future enhancements and to manage the processing power given to overtaxing components).
OS evolution over the years has progressed from:
- Serial processing (directly interfacing with the hardware) to --
- Simple batch systems (relying on monitor software that schedules and assigns different jobs to the processor) to --
- Multiprogrammed batch systems (with upgraded monitor software that can run multiple jobs at the same time when the queued jobs need different hardware resources) to --
- Time-sharing systems (a form of multiprogrammed batch system where multiple interactive jobs can be run at the ‘seemingly’ same time)
Ethos for code development (Windows vs Ubuntu)
With the launch of the first personal computer by IBM, Microsoft developed the OS
MS-DOS to run on it. For over a decade DOS was the underlying software for all of
Microsoft’s OSs, but it was inefficient at exploiting the hardware resources made
available by continuously improving technology. In 1993, Microsoft released a new
OS developed from scratch, Windows NT, which could exploit the capabilities of
contemporary microprocessors and provided multitasking in both single-user and
multi-user environments.
Linux was created by a Finnish CS student, Linus Torvalds, as a variant of UNIX that
ran on the IBM PC (Intel 80386); he posted it on the internet in 1991. Linux was made
available as open-source software under the endorsement of the Free Software
Foundation (FSF), whose GNU project provided the tools Torvalds used to develop the
Linux kernel. Over the years, many programmers have collaborated on the development
of Linux, making it a full-featured UNIX-like system that runs on most platforms.
In 2005, Torvalds created a distributed version-control system, Git, for tracking
changes made to source code during software development. This boosted the
Linux kernel development process, in which thousands of developers make changes
and additions to the source code.
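The workflow Git introduced can be illustrated with a short, self-contained session. (The repository location, file name, commit message, and identity below are placeholders for illustration, not anything from kernel development itself.)

```shell
# Create a throwaway repository and record one change in it.
repo=$(mktemp -d)
cd "$repo"
git init -q .
git config user.email "dev@example.com"   # placeholder identity
git config user.name  "Example Dev"

echo "first draft" > notes.txt
git add notes.txt                  # stage the change
git commit -q -m "Add notes"       # record it permanently in the history

# The full history of the source tree is now queryable.
git log --oneline
```

Because every developer holds a complete copy of this history, thousands of contributors can work on the kernel in parallel and merge their changes later, which is what makes the system "distributed".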
The source code for Ubuntu originates from an older Linux distro, Debian, which had
become less popular over the years because of infrequent updates. Created in 2004
by the company Canonical Ltd., founded by Mark Shuttleworth, Ubuntu has frequently
topped the Linux distro charts.
Most Linux-based distros (including Ubuntu) are available for free, with their source
code available on the internet (changes subject to the GNU GPL), whereas
Windows operating systems are commercial software and must be paid for.
The GNU Project
The GNU project, GNU being a recursive acronym for ‘GNU’s Not Unix’, was started by
Richard Stallman. The project supports the idea of free software, where ‘free’
refers to freedom, not necessarily to the price of the software.
A program is considered free if:
- It can be run for any purpose by the user
- The user has the freedom to modify the program to suit their needs (i.e. has access to the program’s source code)
- The user has the freedom to redistribute copies, without charge or for a fee
- The user can distribute modified copies of the program, so that the community benefits from the improved version