Eat. Pray. Script.

Timing

To schedule your scripts for automated execution, use cron (Linux) or Task Scheduler (Windows). Often, you’ll need to time your scripts so that they execute sequentially, making a complex operation occur in the correct order. The only reliable way to make this work is to make sure the clocks on your systems are synchronized.

For example, if you have an initial script that executes at 07:00 AM and completes at 07:05 AM, the next script in the sequence needs to fire after 07:05 AM and so on down the script sequence. You’ll have to allow your scripts to run automatically a few times so that you can time them precisely.
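
For example, a minimal crontab sketch of that sequence might look like the following (the /usr/local/bin paths and the ten-minute gap are assumptions; adjust them to your own scripts and measured run times):

# Run the first script at 07:00 and the second at 07:10,
# leaving a small buffer after the first script's usual 07:05 finish
0 7 * * *  /usr/local/bin/script1.sh
10 7 * * * /usr/local/bin/script2.sh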

A clever method of keeping scripts in sequence, in case one takes a bit longer than originally planned, is to have each script touch a file as its last step and then have the next script in the sequence check for the existence of that file. Scripts farther down the list won’t fire until the file exists. To prevent accidental firing, remove the file at the beginning of the script that creates it.

Script1.sh

#!/bin/bash

rm -f /tmp/file1.txt    # clear any leftover check file; -f avoids an error if it's missing
# script actions...
touch /tmp/file1.txt    # signal completion so the next script in the sequence can fire

Script2.sh

#!/bin/bash

rm -f /tmp/file2.txt    # clear this script's own check file

if [ -f "/tmp/file1.txt" ]
then
    # script actions...
    touch /tmp/file2.txt    # signal completion for the next script in the sequence
else
    echo "file1.txt not found" | mailx -s "Script failed at Step 1" admin@blah.com
fi

This script removes its own check file and then checks for the check file from the previous script. If file1.txt exists, step two continues; if it doesn’t, the script emails the site’s administrator an error message and a location to check (Step 1).

Anytime you create a complex script cascade, add mail notifications for successes and failures. Without break notifications built into your scripts, debugging becomes a very difficult process.
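
A minimal sketch of that idea, assuming mailx is installed and admin@blah.com is your notification address, wraps each step in a small helper:

# notify <subject> <message body> -- hypothetical helper, not part of the scripts above
notify() {
    echo "$2" | mailx -s "$1" admin@blah.com
}

# Example use around a step (replace your_step_command with the real action)
if your_step_command; then
    notify "Script2 succeeded" "file2.txt touched at $(date)"
else
    notify "Script2 failed at Step 2" "check /tmp/file1.txt and /tmp/file2.txt"
fi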

When debugging a failed script, the first thing to check is user permissions. The second is whether the script uses explicit paths to the executables it references. Always provide explicit pathnames for every executable you call, even if it’s included in the PATH environment variable; cron jobs typically run with a minimal environment, and full paths will save you many frustrating hours of troubleshooting.
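
For example, a hardened version of Script2.sh calls every binary by its full path (the locations below are typical on Linux but are assumptions; confirm them on your system with which):

#!/bin/bash
# Full paths are examples; verify with 'which rm', 'which touch', 'which mailx'
/bin/rm -f /tmp/file2.txt
if [ -f "/tmp/file1.txt" ]
then
    # script actions...
    /usr/bin/touch /tmp/file2.txt
else
    echo "file1.txt not found" | /usr/bin/mailx -s "Script failed at Step 1" admin@blah.com
fi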

Summary

Scripting is as much of an art as it is a technical procedure. Keep it simple and don’t lose yourself in the process. Though it seems a bit ‘old school’ to do so, draw out a flow chart for the script. It helps. Remember to comment your scripts or keep documentation on what each step in the process does. Six months from now, you won’t remember the details.

Related content

  • Win-Win with Cygwin
    Windows administrators: Expand your horizons and your opportunities with Unix commands via Cygwin. Use Cygwin’s extensive list of Unix utilities for scripts, maintenance, compatibility, and automation.
  • Hands-on test of Windows Subsystem for Linux
    If you don't want to do without the main advantages of Linux on the Windows platform, the Windows Subsystem for Linux offers another option. We delve into the depths of the Linux underworld and explain how you can optimize the subsystem.
  • PowerShell Part 3: Keeping PowerShell in the Loop
    PowerShell’s ability to use loops extends its reach to remote systems and performs repetitive operations.
  • GNU tools under Windows
    Many admins are responsible for heterogeneous IT environments. Those who want to use the popular Bash and GNU tools under Windows can either install the Cygwin compatibility layer or try Gow, the more slender alternative.
  • Debugging Bash scripts automatically
    We look at various extension frameworks that make the life of developers and administrators easier when debugging a script.