Contrary to popular belief, Unix users aren’t masochists. They don’t want their lives to be unnecessarily difficult, so they’ve devised many ways to simplify their work. Unix users (particularly those who use Linux) are especially blessed, because they have access to a wealth of useful, highly configurable, and free programs. Whether you’re a programmer, a sysadmin, or just an everyday user, there’s something out there for you.
Let’s take a look at some of these powerful tools and see why you should be using them. Maybe one of them will solve a nagging problem you’ve been having, or maybe you’ll discover a new way to do things. There’s a lot of ground to cover, so let’s get started.
Upon logging in, the first thing you’re confronted with is a shell prompt. The shell is a fundamental part of the Unix experience. Even when running the X Window System, your desktop is still likely to contain a screen full of terminal emulators, each with a shell to interact with. If you’re a Linux user, the shell you’re most familiar with is probably bash, because it’s the default for most distributions.
However, there is another shell called zsh (http://zsh.sunsite.dk/) that deserves your attention, because it’s simply amazing. It’s so optimized for interactive use that most people would be quite shocked at the efficiency of a fully customized zsh environment.
Zsh’s strongest point is its advanced “tab completion” system. Most users expect filenames to be completed when they press [TAB], but zsh goes much further. Type cvs [TAB], and you get a list of cvs sub-commands to choose from. Type mutt -f=[TAB], and zsh looks in your ~/Mail directory (or wherever you’ve stored your mailboxes) and offers a list of them to choose from. Type ls -[TAB], and you get a complete list of ls’s options. And that’s just the tip of the iceberg as far as this completion system goes.
You might be wondering, “Doesn’t it take a lot of configuration know-how to make zsh do all this?” The answer is, “Yes and no.” Yes, you’d have to be fairly knowledgeable about zsh to design your own completion schemes.
However, the zsh community has already done a lot of the hard work of teaching zsh how to do all these crazy things. In fact, this vast library of “completions” is often completely ignored. All you need to do to enable all this magic is put the following lines in your ~/.zshrc file:
autoload -U compinit
compinit
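The completion system can be tuned further with zstyle. The two settings below are optional but commonly recommended starting points (both are standard zsh completion styles):

```shell
# Optional additions to ~/.zshrc, after the completion system is loaded:
zstyle ':completion:*' menu select                    # pick completions with the arrow keys
zstyle ':completion:*' matcher-list 'm:{a-z}={A-Z}'   # lowercase input matches uppercase names
```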
For more information on how this works, consult the zshcompsys man page, or visit the Web site at http://zsh.sunsite.dk/, which has even more documentation.
Besides running commands from within a shell, the other major activity in the life of a Unix user is editing text files. Whether they’re tweaking configuration files, hacking away at source code, or composing email messages, Unix users spend an inordinate amount of time in their text editors.
One of the most famous editors in the Unix community is Emacs (http://www.gnu.org/software/emacs), but to call it a mere “text editor” would not do it justice. Neal Stephenson (author of In the Beginning was the Command Line) likened Emacs to a “thermonuclear word processor,” and that gets a bit closer to the truth.
Emacs’ power comes from the Lisp interpreter that is integrated into it. With it, people have added extensions to Emacs that take it far beyond text editing (see Table One). Although they’re called “extensions,” they’re really more like full-blown applications; this means Emacs is also a platform for application development. This has given Emacs the ability to gracefully adapt to the changing needs and expectations of its users.
Table One: Lisp Extensions To Emacs
Browse the Web ([Meta]-x w3)
Be an RPN calculator ([Meta]-x calc)
Read Usenet news and email ([Meta]-x gnus)
Be your therapist ([Meta]-x doctor)
Play a game of gomoku against you ([Meta]-x gomoku)
NOTE: [Meta]-x means pressing the Meta and x keys at the same time (called a “key chord”). The Meta key is usually the Alt or Option key. If that doesn’t work, you can emulate the key chord by pressing and releasing the Escape key, then pressing and releasing the x key.
About a decade after GNU Emacs hit the scene, the GIMP (GNU Image Manipulation Program, http://www.gimp.org) was born, and it approached extensibility in a similar way. It started out with a basic feature set comparable to the kinds of things Adobe Photoshop can do. Then a Scheme interpreter was embedded and given hooks into the graphics manipulation engine, so that the GIMP could be controlled by scripts. Some said it was like Photoshop, but better, because it could be scripted and it was free. Word spread quickly about this amazing program that broke all the stereotypes about Unix systems being graphically primitive. Every time the GIMP came a step closer to its production release, a story would be posted on Slashdot, and there was a real sense of excitement and hope. That was about three years ago.
The pace of development has calmed down somewhat, and it’s become a more mature system. Some of the features that have been added to the more recent 1.2 series include:
More tools (Ink Pen, Dodge and Burn, Smudge, and Measure). Also, at the bottom of the tool palette, there is now a convenient way to get to the brush, gradient, and pattern selection dialogs.
A new plugin called GFig provides the GIMP with support for simple line-based drawing. If you ever wanted to draw simple shapes like rectangles and circles with the GIMP, navigate to the “Filters -> Render -> GFig” menu.
Another useful plugin, ImageMap, can be found in “Filters -> Web -> ImageMap”. It’s a tool for graphically defining client-side image maps, which can be helpful in simplifying your Web site.
In addition to the Scheme interpreter, you can now embed a Perl, Python, and/or Tcl interpreter, taking the GIMP’s extensibility to new levels.
The benefit here is that the knowledge of how to perform a given graphics manipulation technique can be codified into a script. For instance, you might not know exactly how to create a drop-shadow effect, but picking “Script-Fu -> Shadow -> Drop Shadow” from the right-click menu isn’t hard at all. And there’s a lot more where that came from; check out http://www.gimp.org/scripts.html.
The real winners here are the graphically-challenged folk out there who are overwhelmed by the complexity of fancy graphics programs, but at the same time would love having a nice (GIMP-generated) logo for their home page.
One of the nicest side effects of the GIMP project was the creation of GTK+ (http://www.gtk.org). It’s a C library for GUI development, but it also has bindings for practically any language you’d want to program in. This alone makes developing GUIs for the X Window System far easier than it had ever been before.
However, hand-coding GTK+ user interfaces was still tedious. This is where Glade (http://glade.gnome.org) comes in to rescue programmers from that chore. Glade is a tool for designing GUIs that use the GTK+ toolkit. You interactively arrange widgets (buttons, labels, text boxes, etc.) in a window to build your GUI, and when you save your work, Glade stores the interface you’ve built as an XML document.
Once you have this XML document, there are a number of things you can do with it. Glade can transform this data into C, C++, Ada95, Perl, or Eiffel code. Also, there’s a C library called libglade that knows how to parse this XML document and create a GTK+ GUI at run-time. This is especially nice, because it means you don’t have to recompile your code to update the interface for your program: you just change the XML file and restart your application.
When you’re programming, you can sometimes make a real mess of things, and being able to go back to an older version (that works) is an absolute necessity. To facilitate this task, people have devised various revision control systems, but there is one that stands out above the rest.
That program is CVS (Concurrent Versions System, http://www.cvshome.org/), and it deserves a lot of the credit for enabling the current Open Source development environment. The innovation that set CVS apart from the rest was its ability to function over the network. Suddenly, it became much easier for developers who were spread out geographically to work with each other on the same project.
Before CVS, people had to email patches back and forth. If you were receiving a patch, you had to hope you had the right version of the files so that the patch would apply correctly. If you were sending a patch, you had to hope that you were using the most up-to-date versions, and that nobody anywhere else was trying to send out a patch to any of the files you had just changed.
Nowadays, CVS is an integral part of the whole Open Source experience. There are CVS repositories hosted everywhere, the most prominent site being SourceForge.
Also, in true Unix fashion, a lot of tools have been built around the CVS system:
CVSweb is a Web application that lets people browse through the contents of a CVS repository.
CvsGraph is a tool for generating a graphical representation of all the revisions and branches of a given file in a CVS repository.
Cvstat (there’s only one “s”) is a tool for generating reports on things like the number of commits made by a person, how many lines of code a person has changed, average number of lines changed per commit, etc.
If you consider yourself a developer, you really need to know how to use CVS, because there are thousands of projects (Open Source or otherwise) out there, and a large number of them keep their code in CVS. To use it at a basic level, you only need to know five CVS commands: checkout, update, add, remove, and commit.
Almost all of the source code you can download has been written to be compiled by a program called GCC (GNU Compiler Collection, http://gcc.gnu.org). It’s easily the most important piece of Free Software ever written. Back in 1984, when Richard Stallman started on his quest to implement a totally free system called GNU (GNU’s Not Unix), he realized that a C compiler would be absolutely essential. To make a long story short, he rolled up his sleeves and wrote the whole C compiler by himself. Then what did he do? He put his money where his mouth was and gave it away for free.
It can be hard to understand why he did this. A product like GCC could sell for thousands of dollars, but try to imagine a world without GCC and then Stallman’s actions become easy to understand. Without GCC there would be no Linux, and without Linux, many of today’s tools and projects would not exist. It’s even possible that without GCC, there would be no World Wide Web as we know it today. Tim Berners-Lee first implemented the HTTP protocol on a NeXTstep system where GCC was the native compiler. It’s amazing that one program could have such a drastic effect on the course of world events.
GCC is an impressive product. The acronym used to stand for “GNU C Compiler,” but since it can also compile C++, Objective-C, Java, Fortran, Ada, and CHILL, it has been aptly renamed the “GNU Compiler Collection.” On top of that, it can generate code for many different CPU architectures, which makes it a favorite among the embedded crowd, where GCC serves as a cross-compiler.
It’s versatile, and it’s crucial. Every day, all around the world, GCC gets invoked millions of times. We often take it for granted, but if it were to disappear, the computing world would come to a grinding halt.

Sometimes, no matter how good GCC is, things still come to a grinding halt (or should that be a grinding segfault?). It’s discouraging when your programs crash, but you can always turn to your debugger to help you get to the root of the problem and fix it. Unfortunately, Unix debuggers have traditionally had command-line interfaces with often-terse syntax.
Figure One: DDD displaying a linked list.
However, there is a graphical debugger for Unix called DDD (Data Display Debugger, http://www.gnu.org/software/ddd) that remedies this situation quite nicely. It started out as a graphical front-end for GDB (the GNU Debugger), but support has since been added for DBX (used by various proprietary Unix systems), WDB (HP-UX), Ladebug (Tru64 and Linux/Alpha), JDB (Java Debugger), XDB (an older HP-UX product), and the Perl and Python debuggers. The primary benefit of this approach is that it wraps a single user interface around many different command line interfaces, so you only need to learn one way to debug and be done with it. It works just like you would expect from any debugger, letting you step through the source code line by line, setting break points, and so on.
DDD can also be used to examine variables, and it does so in a very slick way. If you have data structures that make heavy use of pointers or references, DDD can graphically represent the connections between them (see Figure One).
Being able to debug a program is nice, but sometimes you may not even get that far. For example, compiling Apache with third-party modules can be very difficult. There are many Apache modules out there, and it can seem that every module builds itself differently from every other module. Even worse, in order to get a module to build, patches sometimes need to be applied and extra libraries may need to be installed. With a non-trivial Apache installation, this can quickly escalate into a nightmare.
That’s precisely where Apache Toolbox (http://www.apachetoolbox.com) comes in to save the day. It’s a build tool for Apache that distills the procedure for downloading, compiling, and installing practically every module for Apache that’s out there. At the time of this writing, there are 36 standard and 63 third-party modules you can mix and match in any combination you want.
Using it is very straightforward, as it is completely menu-driven. It’s just a matter of picking the pieces you want, and getting some caffeine while Apache Toolbox does its magic. It frees you to take a more adventurous approach to your Apache builds.
Whenever there’s any talk about Apache, don’t be surprised if Perl (http://www.perl.com) is mentioned soon after. The two have worked together ever since the early days of the Web. Perl happened to be very effective for CGI programming, and it didn’t take long before Perl filled that ecological niche.
Since then, the Web has come a long way, and many technologies now give Perl a run for its money when it comes to Web development. This makes some people in the Perl community worry that Perl is falling behind, but perhaps they haven’t noticed that Perl has moved on. While no one was looking, Perl got into things like ASIC verification and bioinformatics, and was well received in both fields. Perl always seems to find these niches to fill.
This happens because Perl is a lot more powerful than people give it credit for, but its power comes in an interesting way. Perl has an unusual knack for bringing things together. It does this better than anything else out there, and it does so on many different levels. Here are some examples:
Perl brings Web sites and databases together via the CGI and DBI modules.
The CPAN (Comprehensive Perl Archive Network) gathers Perl modules from around the world and makes them easy to search.
The Inline modules allow Perl to execute code from other programming languages as if it were native Perl. For example, Inline::Python lets programmers use Python objects just as if they were Perl objects.
Perl borrows concepts and syntax from many different programming languages, and it’s always nice to see something familiar when you’re learning something new. Even first-time programmers can benefit from this if they recognize Perl’s likeness to spoken English.
Perl is all about making connections, building bridges, and coaxing things that weren’t designed to work together to work together anyway.
Lessons to Be Learned
Looking at these programs, there’s a common theme to be found in them all. Perl is not the only tool that brings things together; every single tool here does it. Considering the Unix heritage of most of these programs, that’s not surprising. Unix users have long known that the whole is often greater than the sum of its parts, and the tools mentioned here are excellent examples of that kind of synergy.
patch is a humble little program, but it revolutionized open source development by accelerating the pace at which it could take place. Back when 1200 baud was considered fast (that’s the equivalent of 0.0012 Mbps), it wasn’t really feasible for developers to send source code back and forth.
However, when you modify source code, you tend to change only small portions of each file. One day, Larry Wall thought: what if people could send just the changes instead of whole files? Sending just the changes was already possible, because diff already existed; the problem was that there was no tool to take diff’s output and use it to update the files. To solve this, Wall wrote patch, and the world became a better place.
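The diff/patch round trip is easy to try yourself (the file names below are arbitrary):

```shell
# Record a change with diff, then replay it with patch.
printf 'hello\nworld\n' > original.txt
sed 's/world/there/' original.txt > changed.txt
diff -u original.txt changed.txt > fix.diff   # fix.diff holds only the changes
patch original.txt < fix.diff                 # bring original.txt up to date
# original.txt now matches changed.txt
```

The fix.diff file is typically a tiny fraction of the size of the source it modifies, which is exactly what made it practical to mail changes around at 1200 baud.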
“Surfraw provides a fast Unix command-line interface to a variety of popular WWW search engines and other artifacts of power. It reclaims google, altavista, babelfish, dejanews, freshmeat, research index, slashdot and many others…
“Surfraw abstracts the browser away from input. Doing so lets it get on with what it’s good at — browsing.”
Surfraw lets you do the following kinds of things:
$ ask who is jeeves?
$ google -results=100 RMS, GNU, which is sinner, which is sin?
$ austlii -method=phrase dog like
$ rhyme -method=perfect Julian
John Beppu encourages you to use your computer skillfully and creatively. He can be reached at firstname.lastname@example.org.