Friday, February 27, 2009

Project: Building An All-Text Linux Workstation - Part 2

In this episode, we will choose the Linux distribution and perform the installation.

Choosing A Distribution

After some consideration, I selected Debian 5.0 for our workstation project for three reasons:
  1. Debian is fairly simple on a conceptual level, with a good, straightforward design and good documentation. It's also capable of being installed on very small machines; its minimum memory requirement is only 44 MB.
  2. Its package repository is huge. With over 22,000 packages, we are likely to find a good selection of text-based applications for our project.
  3. It just came out and I wanted to play with it ;-)

Creating Installation Media

With that decision out of the way, we next decide on the installation media. I chose the "netinst" (also called the "minimal CD") option. This is a downloadable CD image that contains a minimal base system. Additional packages are installed from the Internet. The install image is 150 MB, much smaller than the full install CDs. To use this option you must have a working network connection. If you need other installation options, take a look at the Debian Installation Guide for guidance.

You can download the installation image from here.

The last step is creating the installation CD. I will assume that your present system, Linux or otherwise, is capable of that and that you know how to do it.

Installation

With our machine ready, it's time to boot up with our install CD. After the CD boots you will receive an attractive Debian splash screen listing various install options. For our purposes, select "Install", not "Graphical Install".

The installer is pretty easy to use and in most cases the default selections are fine. The arrow keys move from selection to selection as do the tab/shift-tab keys. The space bar is used to toggle the contents of check boxes.

  • When the prompt appears for disk partitioning, select "Guided - Use entire disk" and "All files in one partition".
  • At the "Set up users and passwords" screen, you will be prompted to set the password for the root account. If you have been using Ubuntu up to this point, I should explain that Debian, like most Linux distributions, has a distinct root account rather than using sudo for everything as Ubuntu does. Much of the early work we will perform on the system will require root access. Choose a root password that is both strong and easy to remember.
  • Next, you will be prompted to create your personal account. In keeping with LinuxCommand tradition, I named my machine "Linuxbox" and created a user account named "me". You, of course, can use any name you like.
  • At the "Select and install software" screen you will be presented with a group of check boxes for different sets of packages to install. Using the space bar, select the "Print server" group and unselect the "Desktop environment" group.
  • If you have installed the system using the entire disk as suggested above, answer "yes" to the prompt on the "Install the GRUB boot loader on the hard disk" screen to install GRUB in the disk's master boot record (MBR).
The install should complete and prompt you to remove the install CD. Next, the machine will reboot and we should see the fruits of our labors.

A lot of boot messages should scroll by after the reboot and you will see at the very bottom a login prompt like this:

Debian GNU/Linux 5.0 linuxbox tty1

linuxbox login:

Enter the name root and then the root password and we will see a prompt like this:

linuxbox:~#

Enter the command shutdown -h now and the machine will shut down.

We're done for today.

Further Reading

Other installments in this series: 1 2 3 4 5 6 7 8 9 10 11 12 13 14

Wednesday, February 25, 2009

Project: Building An All-Text Linux Workstation - Part 1

When I see an old PC in the trash, I have a strong urge to rescue and adopt it like a stray puppy. My wife, of course, has very effective ways of constraining my behavior in this regard, so I don't have nearly as many computers as I want. I hate seeing computers go to waste. I figure if the processor and the power supply are still working, the computer should be doing something. So what if it can't run the latest version of Windows? It can still run Linux!

Over the next few weeks, I will show you how to take an old, slow computer and make it into a text-only Linux workstation with surprising capabilities, including document production, email, instant messaging, audio playback, USENET news, calendaring, and, yes, even web browsing.

Why would anyone want to build a text-only workstation? I don't know. Because we can! And besides, it's a great way to learn a bunch of command line stuff and that's why you're here, right?

So if you want to play along, find yourself a computer with at least the following:
  • Pentium processor or above
  • 64 MB or more of RAM
  • 2 GB or larger hard disk
  • PS/2 or USB mouse
  • A PCI network card (no ISA or wireless, please)
We're going to wipe the existing software from the disk, so don't use a valuable machine for this project. Many machines from the Windows 98 era should be good candidates. I will be using an HP Pavilion (circa 1996) with a 600 MHz Celeron, 320 MB of RAM and a 20 GB disk.

Good luck and I will see you again soon!

Further Reading

Other installments in this series: 1 2 3 4 5 6 7 8 9 10 11 12 13 14

Monday, February 23, 2009

Bash 4.0 Released Today

Version 4.0 of everyone's favorite shell program was released today. I imagine this will be showing up in new Linux distributions soon. A copy of the release announcement is here.

Friday, February 20, 2009

The Unix (Linux) Frame Of Mind

If you are migrating from Windows to Linux, one of the barriers you will encounter (besides the myriad superficial differences in the user interface) is the whole "culture thing."

Using a Unix-like operating system involves more than learning where the buttons are located on the desktop. It requires a different way of thinking about your computer. On top of that, Linux requires that you learn about your relationship with the Free and open source software communities. That will be the subject, no doubt, of many future postings, but for now we'll talk about the Unix frame of mind.

To understand Unix, you have to understand its historical context. Unix certainly has its faults, but if you understand its history, you can see that many of its "features" are due to the state of computing during its formative years. But it's also important to realize that just because something is old does not mean that it is necessarily bad. Take, for instance, the command line interface. If you listen to pundits and the PC magazine writers, you would think that the command line is the equivalent of waterboarding to the modern computer user. I liken use of a graphical user interface to watching television and the command line to reading a book. They are very different, but both are valid (and valuable) if used in the proper way.

So what is "the Unix frame of mind?" It is based on some of the following principles:

The computer is multi-user. Unix was designed for "big" computers. Big, expensive computers that supported many users at the same time. This has a number of implications. For one, it meant a different security model. On a multi-user computer, it is essential that one user cannot trash another user, or the entire system. It also implies that communities can form among the computer's users. If you have Fedora or any of the other Red Hat family of Linux distributions, you may have noticed that it runs sendmail by default. This is part of the Unix tradition. Users on a system send email to each other. The first form of instant messaging was the write command which sent a text message to another user's terminal session.

Automate everything. Back in the 1960s, IBM used a marketing slogan: "Machines should work. People should think." Yet today, we have millions of office workers sitting in front of PCs pumping their mice all day. Why? It's because the computer is not doing the work. The human is. The Unix way is to instruct the computer how to do the work and then forget about it. Back in the 1990s, I marveled at the cron daemon, which could perform tasks on a scheduled basis. I remember leaving my PC on all night, letting it do the work I needed, and having a summary of the results waiting each morning. It did the work so I didn't have to.
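As a concrete sketch of how cron schedules that kind of nightly work (the report script and its paths here are my own illustration, not from any particular setup), a crontab entry looks like this:

```shell
#!/bin/sh
# A sample crontab entry. The fields are, in order:
#   minute  hour  day-of-month  month  day-of-week  command
# This one would run a (hypothetical) report script at 2:30 AM every
# day, saving its output; you would add it with "crontab -e".
entry='30 2 * * * /home/me/bin/nightly-report.sh > /home/me/report.txt 2>&1'
echo "$entry"
```

The asterisks mean "every value" for that field, so the job fires once a day, every day, whether or not anyone is sitting at the keyboard.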

Everyone programs. The computer is a tool, or rather, it is a huge collection of tools that can be arranged to perform work. To do this, you must program. You have to break your tasks into simple steps and arrange the tools to perform them. Unix actually makes this fairly simple. You can create pipelines of commands or write shell scripts. Have you noticed how, over the years, Windows has removed all the programming tools from its system? This keeps the users helpless. If they want a solution, they have to go out and buy it. By contrast, every major Linux distribution includes the shell, perl, python, text editors and thousands of tools for building solutions. If you need more, there is just about every programming tool under the sun available for immediate download.
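For a taste of that tool-arranging style, here is a small, self-contained pipeline sketch (the sample text is my own) that reports the most common words in its input; each stage is a separate tool doing one simple job:

```shell
#!/bin/sh
# Count the most frequent words in a sample string by chaining
# small, single-purpose tools together with pipes.
echo "the quick brown fox jumps over the lazy dog the fox" |
  tr ' ' '\n' |   # put one word on each line
  sort |          # group identical words together
  uniq -c |       # count each group
  sort -rn |      # most frequent first
  head -n 5       # show the top five
```

No single program here "counts words"; the solution emerges from arranging the tools.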

A computer is not "easy to use." That is a marketing myth. A computer is no more easy to use than a violin, but with much study and practice, both can produce beautiful music.

It all depends on your frame of mind.

Every Linux user should learn something about the history of Unix. Wikipedia is a good place to start:

http://en.wikipedia.org/wiki/Unix

Also check out Eric Raymond's The Art Of Unix Programming:

http://www.faqs.org/docs/artu/

Wednesday, February 18, 2009

A Brief History Of Printing

The following is an excerpt from my upcoming book, "The Linux Command Line" due out later this year.

To fully understand the printing features found in Unix-like operating systems, we first have to learn some history. Printing on Unix-like systems goes back to the very beginning of the operating system itself. In those days, printers, and the way they were used, were much different from today.

Printing In The Dim Times
Like the computers themselves, printers in the pre-PC era tended to be large, expensive and centralized. The typical computer user of 1980 worked at a terminal connected to a computer some distance away. The printer was located near the computer and was under the watchful eyes of the computer’s operators.

When printers were expensive and centralized, as they often were in the early days of Unix, it was common practice for many users to share a printer. To identify print jobs belonging to a particular user, a banner page was often printed at the beginning of each print job displaying the name of the user. The computer support staff would then load up a cart containing the day’s print jobs and deliver them to the individual users.

Character-based Printers
The printer technology of that period was very different from today's in two respects. First, printers of the period were almost always impact printers. An impact printer uses a mechanism that strikes a ribbon against the paper to form character impressions on the page. Two of the popular technologies of that time were daisy-wheel printing and dot-matrix printing.

The second, and more important, characteristic of early printers was that they used a fixed set of characters intrinsic to the device itself. For example, a daisy-wheel printer could only print the characters actually molded into the petals of the daisy wheel. This made printers of the period much like high-speed typewriters. As with most typewriters, they printed using monospaced (fixed-width) fonts, meaning that each character has the same width. Printing was done at fixed positions on the page, and the printable area of a page contained a fixed number of characters. Printers, depending on the model, typically printed ten characters per inch (CPI) horizontally and six lines per inch (LPI) vertically. Using this scheme, a US-letter sheet of paper is eighty-five characters wide and sixty-six lines high. Taking into account a small margin on each side, eighty characters was considered the maximum width of a print line. This explains why terminal displays (and our terminal emulators) are normally eighty characters wide: it provides a WYSIWYG (What You See Is What You Get) view of printed output using a monospaced font.

Data is sent to a typewriter-like printer as a simple stream of bytes containing the characters to be printed. For example, to print an “a”, the ASCII character code 97 is sent. In addition, the low-numbered ASCII control codes provide a means of moving the printer’s carriage and paper, with codes for carriage return, line feed, form feed, and the like. Using the control codes, it’s possible to achieve some limited font effects, such as boldface, by having the printer print a character, backspace, and print the character again to get a darker impression on the page.
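You can see both ideas from a shell prompt. This sketch prints the ASCII code for "a", then emits the print-backspace-print byte sequence; a terminal simply overwrites the character in place, while an impact printer would strike it twice:

```shell
#!/bin/sh
# The ASCII code for "a": printf with a leading quote on the argument
# prints the character's numeric value.
printf '%d\n' "'a"

# A character, a backspace (ASCII 8), then the character again. Sent
# to an impact printer, this double-strikes the "H" for a bold effect.
printf 'H\bHello\n'
```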

Graphical Printers
The development of GUIs led to major changes in printer technology. As computers moved to more picture-based displays, so too did printing move from character-based to graphical techniques. This was facilitated by the advent of the low-cost laser printer which, instead of printing fixed characters, could print tiny dots anywhere in the printable area of the page. This made printing proportional fonts (like those used by typesetters), and even photographs and high-quality diagrams, possible.

However, moving from a character-based scheme to a graphical scheme presented a formidable technical challenge. Here’s why: the number of bytes needed to fill a page using a character-based printer can be calculated this way (assuming sixty lines per page each containing eighty characters):

60 X 80 = 4800 bytes

whereas a three-hundred-dot-per-inch (DPI) laser printer (assuming an eight-by-ten-inch print area per page) requires:

(8 X 300) X (10 X 300) / 8 = 900000 bytes

The need to send nearly one megabyte of data per page to fully utilize a laser printer was more than many of the slow PC networks could handle, so it was clear that a clever invention was needed.
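Both calculations above are easy to check with shell arithmetic:

```shell
#!/bin/sh
# Bytes to fill a page on a character-based printer
# (sixty lines of eighty characters, one byte per character):
echo $(( 60 * 80 ))                       # prints 4800

# Bytes for a 300 DPI laser printer with an 8 x 10 inch printable
# area, packing 8 dots (one bit each) into every byte:
echo $(( (8 * 300) * (10 * 300) / 8 ))    # prints 900000
```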

That invention turned out to be the page description language (PDL). A page description language is a programming language that describes the contents of a page. Basically it says, “go to this position, draw the character ‘a’ in ten point Helvetica, go to this position...” until everything on the page is described. The first major PDL was PostScript from Adobe Systems which is still in wide use today. The PostScript language is a complete programming language tailored for typography and other kinds of graphics and imaging. It includes built-in support for thirty-five standard high quality fonts and the ability to accept additional font definitions at run time. At first, support for PostScript was built into printers themselves. This solved the data transmission problem. While the typical PostScript program was very verbose in comparison to the simple byte stream of character-based printers, it was much smaller than the number of bytes required to represent the entire printed page.
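As a rough illustration (this little program is my own sketch, not from the excerpt), here is what a tiny PostScript page description looks like, written to a file from the shell:

```shell
#!/bin/sh
# Write a minimal PostScript program to a file: select a font, move
# to a position (in points, from the page's lower-left corner), and
# draw some text.
cat > hello.ps << 'EOF'
%!PS
/Helvetica findfont 12 scalefont setfont
72 720 moveto
(Hello from a page description language) show
showpage
EOF
head -n 1 hello.ps    # the first line is the PostScript header: %!PS
```

Those few dozen bytes describe an entire page, which is exactly why a PDL solved the data transmission problem.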

A PostScript printer accepted a PostScript program as input. The printer contained its own processor and memory (often making the printer a more powerful computer than the computer it was attached to) and executed a special program called a PostScript interpreter, which read the incoming PostScript program and rendered the results into the printer’s internal memory, thus forming the pattern of bits (dots) that would be transferred to the paper. The generic name for this process of rendering something into a large bit pattern (called a bitmap) is raster image processing, or RIP.

As the years went by, both computers and networks became much faster, allowing the RIP to move from the printer to the host computer. This permitted high-quality printers to be much less expensive.

Many printers today still accept character-based streams, but many low-cost printers do not. They rely on the host computer’s RIP to provide a stream of bits to print as dots. There are still some PostScript printers too.

Today's Site Updates

  • Thanks to sharp-eyed readers Dmitry Zhuravlev-Nevsky and Alexander Wireen, I have corrected a bunch of typos in the "Writing Shell Scripts" section of the tutorials.

Saturday, February 14, 2009

Peanuts And Software

I really enjoy factory tours. I've always had a fascination with how things work and how things are made. Over the years, I've been on some great tours: Syracuse China, Corning Glass, Ben & Jerry's Ice Cream, and best of all, the (now defunct) General Motors assembly plant in Baltimore.

In the past, it seemed like many companies offered tours of their factories to the public, and why not? The great companies were truly proud of what they were doing and were happy to show you. This even applied to products whose production processes are better left unseen.

In recent years, factory tours have gotten harder to find. I don't know the exact reason for this. Maybe it's the fear of litigation, or a desire to keep everything proprietary, I just don't know.

The alert among you may have noticed the story in yesterday's New York Times announcing that the Peanut Corporation of America has filed for bankruptcy and will be going out of business. As you may know, Peanut Corporation of America recently gained notoriety for knowingly shipping salmonella-contaminated peanut butter to its customers, resulting in over 600 reported illnesses and 9 deaths. As the scandal unfolded, reports surfaced of unsanitary conditions in its plants including cockroaches, leaking roofs, and a report of a dead rat in one of its peanut roasters.

Somehow, I don't think they offered a factory tour.

What does this have to do with software? Plenty. When you make something in the full light of day, it's a lot less likely that something horrible is going on under cover of secrecy. Open source code is like a factory tour for software. You can wander around and feel the pride of those who created it.

Proprietary software is different. What are they hiding? Does their program have "rats in the roasters?"

You'll never know.

Friday, February 13, 2009

Followup On My Dell Inspiron 530N

Had a problem with the monitor that came with my recently purchased Dell 530N with Ubuntu. After a couple of hours, the monitor would get brighter and brighter until it reached maximum brightness and then after a few minutes of that, the lamp in the monitor would go out leaving me with a black screen. I could turn the monitor off for a few minutes (and presumably, let it cool) and it would work for another couple of hours.

I got tired of that (Hey! I'm trying to write a book here!) so I called Dell's Ubuntu support number (1-866-622-1947) and got connected to an overseas call center. The representative had me perform a simple test, reboot the computer and press F2 to get to the setup screen, then just let it sit and see if the monitor would fail again. This test would eliminate the possibility of a video driver issue. Sure enough, after about 30 minutes, it went out again. I called back and talked to another representative who was able to see what had transpired during the first call. She arranged for a replacement to be shipped overnight to me. A refurbished monitor arrived the next afternoon.

I'm pleased with the service. Their support staff was polite and professional, but I didn't have a chance to test their Linux skills.

Maybe next time.