Eat. Pray. Script.

Lazy … er, smart … administrators are hard at work in the background via scripts, cascading scripts, and cron jobs.

I don’t know if I’ll ever be as famous as Steve Jobs, but if you ask me how to perform just about any repetitive administrative task, I’ll tell you, “There’s a script for that,” and its corollary is: If there isn’t a script for that, write one.

If you look at any system administrator job description, there’s a listing for scripting somewhere in it. Sys admins must script. And, if you’re a good administrator, you’ve Googled to see whether some clever sys admin has already created the script you need. There’s no use reinventing the wheel or writing and debugging your own script if someone else has gone to the trouble for you. After all, you couldn’t rightfully earn the “lazy administrator” label if you started from scratch every time. And, by lazy, I mean smart.

Smart administrators attempt to automate every possible task with scripts: file transfers, file copies, temporary file removals, idle account lockouts, backups. On the surface, smart administrators appear lazy. The truth is, they’re hard at work in the background via scripts, cascading scripts, and cron jobs, and they’re hard at work in the foreground creating even more scripts to do their bidding.

A computer is the perfect minion: It’s dumb but ready to help; it never complains about the hour when you ask it to do something; it does exactly what you tell it to do; and it doesn’t take breaks, vacations, or sick days. It’s always there, ready to execute your every command.

Automation doesn’t necessarily mean firing off a script to restart a service, copy a file, or start a backup, although it certainly can mean that. For example, suppose you have 300 servers that require a change to a particular configuration file. It might take days to complete that task manually, but if you write a script to connect to each remote system, copy the file with the new information, restart the service, disconnect, and proceed to the next system, the entire process might take less than an hour.
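A Bash sketch of that loop might look like the following. Everything here is a placeholder assumption, not a real environment: the host list (hosts.txt), the config path, and the service name are all made up.

```shell
#!/bin/bash
# Hypothetical sketch: push an updated config file to every host in a
# list, restart the affected service, and move on to the next host.
# hosts.txt, the config path, and "myservice" are all assumptions.
CONFIG=${CONFIG:-/etc/myservice/myservice.conf}
HOSTS=${HOSTS:-hosts.txt}

push_config() {
  local host=$1
  scp "$CONFIG" "root@$host:$CONFIG" &&           # copy the new file out
    ssh "root@$host" "service myservice restart"  # make it take effect
}

push_all() {
  while read -r host; do
    push_config "$host" || echo "FAILED: $host" >&2  # log and keep going
  done < "$HOSTS"
}

if [ -f "$HOSTS" ]; then   # run only when a host list is present
  push_all
fi
```

With passwordless SSH keys in place (covered later in the article), a loop like this runs unattended against hundreds of hosts.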

Consider a second example in which you must routinely change the Administrator password on hundreds of Windows server systems. How would you do it? Again, connecting to each one manually would be an overwhelming and lengthy task. Writing a simple script to handle the task takes a fraction of the time, and there’s no chance of a misspelling or accidental system omission along the way. A script performs the same tasks on each system it touches.
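From a Cygwin Bash prompt, that script can be a short loop around PsExec and the built-in `net user` command. This is an illustrative sketch only: the hostnames are borrowed from examples later in this article, and passing the new password as the first argument (rather than hard-coding it) is my assumption.

```shell
#!/bin/bash
# Hypothetical sketch: reset the local Administrator password on a list
# of Windows hosts from Cygwin. Hostnames are illustrative; supply the
# new password as the first argument instead of embedding it.
NEWPASS=$1

change_password() {
  # doubled-up backslashes: Bash consumes one from each pair before
  # PsExec ever sees the \\host argument
  psexec "\\\\$1" net user Administrator "$NEWPASS"
}

for host in DAL01 DAL02 DALV01; do
  change_password "$host" || echo "FAILED: $host" >&2
done
```

Because the same commands run against every host, there is no chance of a typo on server 47 of 300.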

Which Language?

I’ve found no scripting language to be superior to any other for automation. In fact, for any particularly complex task, I might employ multiple languages. For example, Perl is an extremely mature and powerful language for extracting information from systems and databases. Modules exist for just about every possible parameter, registry entry, and system information bit that you can imagine. It also has a variety of database modules that allow you to connect to and query most database systems.

Bash is very good at handling lists and loops. Plus, you can call other scripts from it easily. For automation scripting, I always start with Bash and work my way through the other languages as needed per task. Although not written in stone, my standard language preference priority is: Bash, Perl, PHP, Expect. If I can’t do something in Bash (e.g., efficiently query a database), then I’ll use Perl or PHP. I turn to Expect for those times when I need the magic of automated shell interaction.

The difficult part of Expect is that you have to be able to predict and emulate shell responses. Autoexpect helps you sort through unexpected shell responses and prompts. It isn’t perfect, but it cuts down on your script-debugging time.

PHP is an all-around good language like Perl. Often, when I find that a particular task is too difficult to do (for me) in Perl, I’ll create a PHP script to handle the processing. For certain jobs, such as creating dynamic web pages, its syntax and flexibility make quick work of an otherwise lengthy project. But, it’s also a matter of preference. You might feel the same way about Perl or Ruby.

But, the takeaway for language selection is to pick the one or ones with which you feel most comfortable. Since Perl, PHP, and Bash have a common C language heritage, it’s easy for me to cross-program in them all.

Dedicated Automation Systems

My personal preference is to create what I call “dedicated automation systems” to use as the pivot point for all automation. If I have a mixed environment comprising Windows, Linux, and Unix, I’ll spin up a virtual machine (VM) or a decommissioned physical machine for my automation systems. Creating dedicated systems separates them from those in production. If you maintain your automation systems outside of production, you can reboot and reimage them as necessary, install and uninstall applications as needed, and test at will.

Virtual machines are excellent choices to use as dedicated automation systems for all of the reasons given above. VMs also give you the opportunity to test and debug your scripts without harm to any production system.

Your requirements will vary, but on my automation systems, I always install the following components for Windows: Cygwin (complete), PHP, and PsTools. Cygwin is a huge subset of Linux/Unix tools for Windows systems, including Perl, Expect, and many others. Using it allows you to create cross-platform scripts for automation without having to use Windows tools in addition to Linux/Unix tools. You can leverage one set of skills for all operating environments.

PsTools, created by Mark Russinovich of Sysinternals (now owned by Microsoft), is a set of absolutely essential tools that are useful even in an all-Microsoft environment.

For Linux and Unix systems, I install Webmin, PHP, MySQL, Apache, and Expect. Perl is now a standard part of most Linux distributions. If it isn’t part of yours, install it. Use Webmin to add modules to your Perl installation. Its interface is far easier to use than CPAN’s command-line interface. Additionally, Webmin provides you with a web-based graphical interface for managing almost every aspect of your Linux or Unix system.

Windows Automation

Windows administrators have native automation tools such as command scripting (think DOS batch files), PowerShell, and VBScript. If you administer heterogeneous systems (Windows, Linux, Unix), then you can, and should, create automation scripts using a single, cross-platform language as previously discussed. You might not be able to leverage Windows scripts on Linux systems, or vice versa, without some changes to them, but you only have to become expert in one language, instead of several.

Although projects are underway that attempt to port PowerShell to Linux, none are complete or completely successful. Perl, PHP, Expect, and Bash are examples of cross-platform language possibilities that operate similarly on Windows, Linux, and Unix systems.

If you install Cygwin, you have the advantage of Windows and Linux commands on the same system. This creates a powerful hybrid automation host that allows you to issue Windows command-line programs in a Bash shell operating environment. So, you have the best of both worlds: Windows and Linux. You can use Windows commands mixed with Linux commands. For example, if you want to see a list of all systems on your network that begin with the letters DAL, issue the command (in Cygwin):

bash-3.2$ net view | grep -i DAL

\\DAL01
\\DAL02
\\DALAUTO
\\DALV01

The only thing you have to remember when dealing with Windows systems in Cygwin’s Bash environment is the Windows whack whack wackiness. You have to add an extra whack for each whack in a command involving hostnames.

Using the standard Windows syntax throws an error:

bash-3.2$ psexec \\DAL01 ipconfig

PsExec could not start \DAL01:
The system cannot find the file specified.

Using a Bash-compatible syntax yields the desired information:

bash-3.2$ psexec \\\\DAL01 ipconfig

Windows IP Configuration

Ethernet adapter Local Area Connection 5:

   Connection-specific DNS Suffix  . :
   IP Address. . . . . . . . . . . . : 192.168.1.235
   Subnet Mask . . . . . . . . . . . : 255.255.255.0
   Default Gateway . . . . . . . . . : 192.168.1.254
ipconfig exited on DAL01 with error code 0.

bash-3.2$

In Linux and Unix, the whack, or backslash (\), means that you want the system to ignore (leave as is) the next character that follows the whack. Therefore, when you use the standard Windows designation \\DAL01, you get an error because Windows expects two whacks and you’ve given it one. You have to use four whacks to provide the system with the required two.
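You can watch Bash eating the whacks with a plain echo:

```shell
# Each pair of backslashes collapses to a single backslash:
echo \\DAL01      # prints \DAL01  -- what PsExec saw in the failing example
echo \\\\DAL01    # prints \\DAL01 -- the two whacks Windows actually needs
```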

Another advantage of Cygwin on Windows is that you can also incorporate its executables into your command (cmd) batch files with expected results.

C:\Temp> DIR | grep resume.txt

10/07/2011  04:05 PM  52,824 resume.txt 

C:\Temp>

Remember to enter C:\Cygwin\bin into your PATH environment variable so your system can find the executables. The same rule applies to PsTools executables.

Linux and Unix Automation

In a mixed environment, it’s possible to administer Linux and Unix systems from a Windows automation system, but not the other way around. Linux lacks a complete complement of Windows tools; there’s no Cygwin equivalent for Linux to make it possible. A few cross-platform tools exist on Linux, but nothing as extensive as Cygwin is for Windows.

The trend has always been in the Windows-to-Linux direction, which has left a gap for those who use Linux extensively and want to manage Windows systems with it. Until that situation changes, you’ll have to use a Windows system to manage other Windows systems.

Linux to Linux management and automation is easy. Hundreds of tools, both command line and GUI, exist to help you on your quest. However, like Windows, it is the power of the command line that rules in the automation world. GUI environments are optional on server systems, including your dedicated automation system.

Scripts that contain embedded passwords are unsafe, and scripts that prompt for passwords interactively defeat the purpose of automation. Fortunately, Linux lends itself to password-free connectivity and automation through the magic of SSH key files. Numerous references exist to teach you how to set up passwordless SSH (including SFTP and SCP) between hosts. This feature of SSH is compelling and useful to the administrator. The alternatives are shell scripts that pipe in passwords from text files or Expect scripts that contain passwords.
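The setup itself amounts to two commands. The sketch below generates the keypair into a temporary directory purely for demonstration; in practice you would accept ssh-keygen’s default ~/.ssh path, and the copy step needs a real remote host, so it’s shown as a comment with a made-up hostname.

```shell
#!/bin/bash
# Sketch of passwordless SSH setup. The temp directory is only for this
# demo; normally you accept ssh-keygen's default ~/.ssh location.
KEYDIR=$(mktemp -d)
ssh-keygen -q -t rsa -N "" -f "$KEYDIR/id_rsa"   # -N "" = empty passphrase

# On a real network, install the public key on each remote host
# (user@remotehost is a placeholder):
#   ssh-copy-id -i "$KEYDIR/id_rsa.pub" user@remotehost
# then verify the passwordless login:
#   ssh -i "$KEYDIR/id_rsa" user@remotehost uptime
ls -l "$KEYDIR"
```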

Another problem with storing passwords in scripts or files isn’t obvious at first: when passwords change, every script that contains one must change, too. That quickly becomes a time-consuming, laborious, and often frustrating task.

Timing

To schedule your scripts for automated execution, use cron (Linux) or Task Scheduler (Windows). Often, you’ll need to time your scripts so that they execute sequentially to make a complex operation occur in the correct order. For this to work reliably, your systems’ clocks must be synchronized (e.g., via NTP).
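On Linux, the schedule is a pair of crontab entries. The script paths below are assumptions; the sketch writes the entries to a temporary file so you can review them before loading them with `crontab`.

```shell
#!/bin/bash
# Illustrative cron schedule (script paths are assumptions): script1
# fires at 07:00, script2 at 07:10, leaving slack for script1 to finish.
cat <<'EOF' > /tmp/automation.cron
0  7 * * * /usr/local/scripts/script1.sh
10 7 * * * /usr/local/scripts/script2.sh
EOF
# Review the file, then install it with: crontab /tmp/automation.cron
cat /tmp/automation.cron
```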

For example, if you have an initial script that executes at 07:00 AM and completes at 07:05 AM, the next script in the sequence needs to fire after 07:05 AM and so on down the script sequence. You’ll have to allow your scripts to run automatically a few times so that you can time them precisely.

A clever method of keeping scripts in sequence, in case one takes a bit longer than originally planned, is to have each script touch a file as its last step and then have the next script in the sequence check for the existence of that file. Scripts farther down the list won’t fire until the file exists. To prevent accidental firing, remove the file at the beginning of the script that creates it.

Script1.sh

#!/bin/bash

rm -f /tmp/file1.txt
# script actions...
touch /tmp/file1.txt

Script2.sh

#!/bin/bash

rm -f /tmp/file2.txt

if [ -f "/tmp/file1.txt" ]
then
    # script actions...
    touch /tmp/file2.txt
else
    echo "file1.txt not found" | mailx -s "Script failed at Step 1" admin@blah.com
fi

This script removes its own check file and then checks for the check file from the previous script. If file1.txt exists, step two continues. If it doesn’t, the script emails the site administrator with an error message and a location to check (Step 1).

Anytime you create a complex script cascade, add mail notifications for successes and failures; they make debugging far easier. Without such break notifications built into your scripts, debugging becomes a very difficult process.

When debugging failed scripts, always check user permissions first. Also provide your scripts with explicit paths to every executable they reference, even ones already in the PATH environment variable. Doing so will save you many frustrating hours of troubleshooting.
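Explicit paths matter because cron launches jobs with a minimal PATH, so a script that works at your interactive prompt can fail silently at 2 AM. The sketch below discovers each binary’s location at runtime just for illustration; in a production script you would hard-code the resolved path.

```shell
#!/bin/bash
# cron provides a minimal PATH, so call external binaries by explicit
# pathname. `command -v` reveals the path to hard-code in the script.
TAR=$(command -v tar)       # e.g. /usr/bin/tar on many distributions
echo "tar resolved to: $TAR"
{ "$TAR" --version >/dev/null 2>&1 && echo "OK: tar runs by explicit path"; } || true
```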

Summary

Scripting is as much of an art as it is a technical procedure. Keep it simple and don’t lose yourself in the process. Though it seems a bit ‘old school’ to do so, draw out a flow chart for the script. It helps. Remember to comment your scripts or keep documentation on what each step in the process does. Six months from now, you won’t remember the details.

References

Writing WMI Scripts
WMI Scripting Primer
ADSI Scripting Primer
ADSI Scriptomatic
Scriptomatic 2.0 Download
Microsoft’s TechNet Script Center Downloads
PsTools
Cygwin
Using Windows Administrator Commands
PHP Main Site
Comprehensive Perl Archive Network
Passwordless SSH
BASH Scripting How-To