Chapter 9 of 20 — Linux Administration

Shell Scripting — Bash Automation for System Administrators

By Vikas Swami, CCIE #22239 | Updated Mar 2026 | Free Course

Shell Scripting Basics — Shebang, Execution & Permissions

Understanding the foundational elements of bash shell scripting is essential for system administrators aiming to automate tasks efficiently. Shell scripting allows automating repetitive commands, managing system configurations, and streamlining complex workflows. The starting point is mastering the shebang line, execution methods, and file permissions, which collectively determine how scripts run and their security posture.

The shebang line, typically #!/bin/bash, appears at the top of a script and indicates which interpreter should execute the script. This line ensures that the script runs in the correct environment, especially important when multiple shells or interpreters are available. For example:

#!/bin/bash
echo "Hello, Linux World!"

To make a script executable, you must set the appropriate permissions. Using the chmod command, you can grant execute rights:

chmod +x script.sh

Once executable, you can run the script directly with:

./script.sh

Alternatively, invoking it explicitly with the interpreter works regardless of the execute bit (the file still needs to be readable):

bash script.sh

File permissions are critical for security. The typical permission set for scripts is 755 (owner can read/write/execute; group and others can read/execute), which can be set via:

chmod 755 script.sh

Understanding these basics forms the backbone of effective bash shell scripting. Proper shebang usage, execution methods, and permissions ensure scripts run smoothly and securely on Linux systems managed by network administrators. For more insights into Linux shell scripting, visit the Networkers Home Blog.

Variables, Quoting & Command Substitution

Variables are the core of dynamic scripting, enabling scripts to store and manipulate data during execution. In bash shell scripting, variables are created by assignment without spaces: varname=value. For example:

name="Networkers Home"
echo "Welcome to $name"

Variables are referenced with a dollar sign ($). To prevent issues with spaces or special characters, quoting variables is crucial. Double quotes (") allow variable expansion while preserving spaces, whereas single quotes (') treat content literally:

greeting="Hello, $name"   # expands $name
literal='Hello, $name'    # treats as plain text

Command substitution enables capturing command output into variables. There are two syntax options:

  • Backticks (legacy): var=`command`
  • Modern syntax: var=$(command)

Example:

current_date=$(date)
echo "Today's date is $current_date"

This feature is instrumental in scripting for automating tasks like fetching system info, processing logs, or dynamic configuration. When combined with conditional logic, variables and command substitution form the backbone of robust scripts. For example, to check disk usage:

disk_usage=$(df / | tail -1 | awk '{print $5}')
if [ "${disk_usage%?}" -gt 80 ]; then   # %? strips the trailing % sign
  echo "Disk usage is above 80%"
fi

Mastering quoting and command substitution is critical for writing reliable, flexible scripts, especially in scenarios that demand precise data handling. For comprehensive tutorials, explore the Networkers Home Blog on Linux scripting best practices.

Conditionals — if/elif/else, test & Comparison Operators

Conditional statements in bash shell scripting allow scripts to make decisions based on runtime data. They form the basis of automation workflows, enabling scripts to execute different code paths depending on specific conditions. The fundamental construct is the if statement, often combined with test expressions or [ ] brackets.

Basic syntax:

if [ condition ]; then
    # commands
elif [ another_condition ]; then
    # commands
else
    # commands
fi

Comparison operators enable evaluating integers, strings, and file attributes. Common integer operators include:

  • -eq: equal to
  • -ne: not equal to
  • -lt: less than
  • -le: less than or equal to
  • -gt: greater than
  • -ge: greater than or equal to

String comparisons use = and != inside [ ]; within bash's [[ ]] construct, == additionally performs glob pattern matching. For example:

if [ "$status" = "active" ]; then
  echo "System is active"
fi

Numerical comparison example:

cpu_load=75
if [ "$cpu_load" -gt 70 ]; then
  echo "High CPU load detected"
fi

Testing files involves checks like:

  • -e: exists
  • -f: is a regular file
  • -d: is a directory
  • -r: readable
  • -w: writable

Example:

if [ -f "/var/log/syslog" ]; then
  echo "Syslog exists"
fi

These conditional constructs are essential for scripting tasks such as automated health checks, configuration validation, and decision-based automation. For a detailed comparison of test operators and scripting techniques, visit the Networkers Home Blog.

Loops — for, while, until & Loop Control

Loops enable repeated execution of code blocks, vital for automating repetitive bash shell scripting tasks. Understanding for, while, and until loops allows administrators to process multiple files, monitor system states, or perform batch operations efficiently.

For Loop

The for loop iterates over a list or range. Syntax:

for item in list; do
  # commands
done

Example: iterating over log files:

for logfile in /var/log/*.log; do
  echo "Processing $logfile"
  gzip "$logfile"
done

While Loop

The while loop continues as long as the condition evaluates to true. Example:

count=0
while [ "$count" -lt 5 ]; do
  echo "Count is $count"
  ((count++))
done
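A common administrative variant is a while loop that reads a file line by line. The sketch below creates a throwaway users.txt (a hypothetical input) to keep the example self-contained; IFS= and read -r preserve leading whitespace and backslashes in each line:

```shell
# Sketch: process a file line by line with while read (users.txt is hypothetical).
printf 'alice\nbob\ncarol\n' > users.txt

while IFS= read -r user; do
  echo "Creating report for $user"
done < users.txt

rm -f users.txt
```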

Until Loop

The until loop executes until the condition becomes true:

counter=10
until [ "$counter" -le 0 ]; do
  echo "Countdown: $counter"
  ((counter--))
done

Loop Control Commands

  • break: exits the loop immediately
  • continue: skips to the next iteration

Example with control commands:

for i in {1..10}; do
  if [ "$i" -eq 5 ]; then
    break
  fi
  echo "$i"
done
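continue works the same way; in this sketch the even numbers are skipped and only the odd ones print:

```shell
# Skip even numbers; prints 1, 3, 5.
for i in {1..5}; do
  if [ $((i % 2)) -eq 0 ]; then
    continue
  fi
  echo "$i"
done
```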

Using loops efficiently reduces scripting complexity and enhances automation capabilities. Combining loops with conditionals enables complex workflows, like monitoring server health or managing batch configurations. For more advanced loop techniques, check the Networkers Home Blog.

Functions — Definition, Arguments, Return Values & Scope

Functions in bash shell scripting promote code reuse, modularity, and clarity. They allow defining blocks of code that can be invoked multiple times with different arguments, simplifying complex scripts and reducing errors.

Defining Functions

function_name() {
  # commands
}

Or:

function function_name {
  # commands
}

Calling Functions

function_name

Passing arguments is straightforward: arguments are accessed via special variables $1, $2, etc. For example:

greet() {
  echo "Hello, $1!"
}
greet "Alice"
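Beyond individual positional parameters, $# holds the argument count and "$@" expands to all arguments, one word each. A minimal sketch (the function name is illustrative):

```shell
# Sketch: report the argument count, then echo each argument.
list_args() {
  echo "Received $# argument(s)"
  for arg in "$@"; do
    echo "arg: $arg"
  done
}
list_args "eth0" "eth1"   # prints a count of 2, then each interface name
```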

Return Values & Exit Status

Functions return status codes via the return statement, typically 0 for success or non-zero for errors. To return data, use echo and capture output:

calculate_sum() {
  local sum=$(( $1 + $2 ))
  echo "$sum"
}
result=$(calculate_sum 5 10)
echo "Sum is $result"
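The return statement itself carries only an exit status, which the caller can test directly in an if. A sketch with an illustrative is_even function:

```shell
# Sketch: a function that signals success/failure via its exit status.
is_even() {
  if [ $(($1 % 2)) -eq 0 ]; then
    return 0   # success
  else
    return 1   # failure
  fi
}

if is_even 4; then
  echo "4 is even"
else
  echo "4 is odd"
fi
```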

Scope & Local Variables

Variables declared with the local keyword are confined to the function scope, preventing conflicts with global variables. Example:

my_function() {
  local temp_var="temporary"
  # do something
}

Functions streamline scripting by encapsulating logic, making scripts easier to maintain and extend. For instance, a backup script can define functions for compressing files, logging, and error handling. To explore more on this topic, visit the Networkers Home Blog.

Text Processing in Scripts — grep, awk, sed & Regular Expressions

Processing text is a core aspect of system automation, log analysis, and data extraction. Linux provides powerful tools like grep, awk, and sed for manipulating text streams efficiently. Mastery of regular expressions enhances their effectiveness.

Grep

grep searches for patterns within files or input streams. Example: find all lines containing 'error' in a log:

grep "error" /var/log/syslog
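A few commonly combined grep flags, sketched against a small throwaway file rather than a real log:

```shell
# Sketch: -i ignores case, -c counts matches, -n prefixes line numbers.
printf 'Error: disk full\nok\nERROR: timeout\n' > demo.log

grep -i "error" demo.log    # matches both Error and ERROR
grep -ic "error" demo.log   # prints the match count: 2
grep -in "error" demo.log   # shows each match with its line number

rm -f demo.log
```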

Awk

awk processes structured text, especially columns. Example: print the second column of a CSV:

awk -F',' '{print $2}' data.csv
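awk also does arithmetic across rows; summing a column is a frequent admin task. A sketch on inline data (the host names are illustrative):

```shell
# Sketch: sum the second column of whitespace-separated input; prints 400.
printf 'web1 120\nweb2 80\nweb3 200\n' | awk '{ total += $2 } END { print total }'
```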

Sed

sed performs in-place editing, substitution, and text transformations. Example: replace 'old' with 'new' in a file:

sed -i 's/old/new/g' filename.txt
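Because -i rewrites the file in place, a safer habit is to preview the substitution on stdout first. A sketch using a throwaway file:

```shell
# Sketch: preview the substitution, then commit it with -i.
printf 'old value\nold config\n' > settings.txt

sed 's/old/new/g' settings.txt     # prints the edited text; file untouched
sed -i 's/old/new/g' settings.txt  # now rewrite the file in place
grep -c "new" settings.txt         # confirms both lines changed: 2

rm -f settings.txt
```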

Regular Expressions (Regex)

Regex patterns enable sophisticated text matching. For example, matching an IP address pattern:

grep -E '([0-9]{1,3}\.){3}[0-9]{1,3}' filename

Combining these tools allows creating pipelines for complex data parsing, such as extracting specific log entries or transforming configuration files. For instance:

grep "CRON" /var/log/syslog | awk '{print $1, $2, $3, $5}'

Proficiency in shell scripting for text processing significantly enhances automation tasks. To delve deeper, refer to the detailed guides on the Networkers Home Blog.

Error Handling — Exit Codes, trap & set -euo pipefail

Robust scripts anticipate and handle errors gracefully to prevent unexpected failures. In bash shell scripting, this involves checking exit codes, using traps, and setting strict modes.

Exit Codes

Commands return an exit status: 0 indicates success; non-zero indicates failure. Checking exit codes helps determine script flow. Example:

some_command
if [ $? -ne 0 ]; then
  echo "Command failed"
  exit 1
fi
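A more compact idiom tests the command directly in the if condition, avoiding the intermediate $? check. A sketch with mkdir on an illustrative path:

```shell
# Sketch: branch on the command's exit status directly.
if ! mkdir -p /tmp/demo_dir; then
  echo "Could not create directory" >&2
  exit 1
fi
echo "Directory ready"
rmdir /tmp/demo_dir
```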

trap Command

trap catches signals or errors, allowing cleanup or notifications. For example, cleanup on script termination:

trap "echo 'Script interrupted'; exit" INT TERM
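Another common pairing, sketched below, is an EXIT trap that removes a temporary file however the script ends, including on Ctrl-C or an error under set -e:

```shell
# Sketch: guarantee temp-file cleanup on any exit path.
tmpfile=$(mktemp)
trap 'rm -f "$tmpfile"' EXIT

echo "working data" > "$tmpfile"
echo "Using temp file: $tmpfile"
# ... script body ...
# the EXIT trap removes $tmpfile automatically when the script finishes
```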

set -euo pipefail

These options enforce strict error handling:

  • -e: Exit immediately if a command exits with a non-zero status
  • -u: Treat unset variables as errors
  • -o pipefail: Make a pipeline's exit status that of the rightmost command to fail, instead of the final command's status

Implementing these options at the beginning of scripts ensures that errors are caught early, reducing debugging time. For example:

#!/bin/bash
set -euo pipefail
# Rest of script

Effective error handling is crucial for reliable automation, especially in environments managed by network administrators. For advanced error management techniques, see the Networkers Home Blog.

Real-World Scripts — Backup, Log Rotation, Health Checks & Reports

Practical bash shell scripting applications include automating backups, managing log files, performing health checks, and generating reports. These scripts help maintain system stability and streamline administrative tasks.

Automated Backup Script

#!/bin/bash
set -euo pipefail
backup_dir="/backup"
src_dir="/etc /var/www"   # two paths; expanded unquoted below so tar sees both
timestamp=$(date +"%Y%m%d_%H%M%S")
backup_file="${backup_dir}/backup_${timestamp}.tar.gz"

mkdir -p "$backup_dir"
tar -czf "$backup_file" $src_dir
echo "Backup completed: $backup_file"

Log Rotation Script

Maintains log file sizes and archives old logs:

#!/bin/bash
log_dir="/var/log/myapp"
archive_dir="/var/log/myapp/archive"
max_size=10485760  # 10MB

mkdir -p "$archive_dir"
for logfile in "$log_dir"/*.log; do
  [ -e "$logfile" ] || continue   # skip if the glob matched nothing
  size=$(stat -c%s "$logfile")
  if [ "$size" -gt "$max_size" ]; then
    mv "$logfile" "$archive_dir/$(basename "$logfile").$(date +"%Y%m%d%H%M%S")"
    touch "$logfile"
  fi
done
echo "Log rotation completed"

System Health Check & Reporting

Scripts that monitor CPU, memory, disk usage, and generate reports:

#!/bin/bash
cpu_usage=$(top -bn1 | grep "Cpu(s)" | awk '{print $2 + $4}')
mem_usage=$(free -m | awk 'NR==2 {print $3/$2 * 100}')
disk_usage=$(df / | tail -1 | awk '{print $5}' | sed 's/%//')

if (( $(echo "$cpu_usage > 80" | bc -l) )); then
  echo "High CPU usage: $cpu_usage%" | mail -s "Server Alert" admin@example.com
fi
if (( $(echo "$mem_usage > 80" | bc -l) )); then
  echo "High Memory usage: $mem_usage%" | mail -s "Server Alert" admin@example.com
fi
if [ "$disk_usage" -gt 80 ]; then
  echo "High Disk usage: $disk_usage%" | mail -s "Server Alert" admin@example.com
fi
echo "Health check completed"

These scripts exemplify automation that reduces manual intervention, improves reliability, and ensures proactive system management. To learn how to craft such scripts and more, visit the Networkers Home Blog.

Key Takeaways

  • Mastering shebang, execution permissions, and script setup is fundamental for reliable bash shell scripting.
  • Variables, quoting, and command substitution enable dynamic and flexible scripts.
  • Conditional statements and comparison operators facilitate decision-making in scripts.
  • Loops and control commands automate repetitive tasks efficiently.
  • Functions promote modular, reusable code, simplifying complex automation workflows.
  • Text processing tools like grep, awk, and sed are vital for log analysis and data manipulation.
  • Implementing error handling techniques ensures script robustness and predictable execution.
  • Practical scripts for backups, log rotation, and health checks are essential for system administration.
  • Continuous learning from resources like Networkers Home enhances scripting skills and automation capabilities.

Frequently Asked Questions

What is the importance of the shebang line in bash scripts?

The shebang line, typically #!/bin/bash, specifies which interpreter should execute the script. It ensures the script runs in the correct shell environment, making scripts portable and predictable across different Linux systems. Without it, scripts may run with the default shell, leading to compatibility issues. Proper shebang usage is fundamental for reliable bash shell scripting and is often a best practice recommended by system administrators at institutes like Networkers Home.
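Where bash is not guaranteed to live at /bin/bash (some BSDs and minimal container images), a common portable variant resolves the interpreter through env; a sketch:

```shell
#!/usr/bin/env bash
# Resolves bash through the caller's PATH rather than a hard-coded /bin/bash.
echo "Running under bash ${BASH_VERSION:-unknown}"
```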

How can I automate repetitive tasks using shell scripting?

Automation of repetitive tasks is achieved through writing scripts that incorporate loops, conditionals, functions, and text processing tools. For instance, automating log rotation, backups, or system health checks can be scripted to run at scheduled intervals with cron jobs. Proper error handling and modular scripting improve reliability. Learning bash shell scripting fundamentals and practicing real-world scenarios, such as those shared by Networkers Home, empowers system administrators to automate complex workflows efficiently.
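As a concrete scheduling sketch (the script path is hypothetical), the crontab entry below would run a backup nightly at 02:00 and append all output to a log; add it with crontab -e:

```shell
# Hypothetical crontab entry, added via: crontab -e
# Fields: minute hour day-of-month month day-of-week  command
# 0 2 * * * /usr/local/bin/backup.sh >> /var/log/backup.log 2>&1
```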

What are some common bash shell scripting mistakes to avoid?

Common errors include forgetting to set execute permissions, neglecting proper quoting of variables, assuming default shell behavior, and not handling errors correctly. These can lead to security vulnerabilities or script failures. Always use strict modes like set -euo pipefail, quote variables properly, and validate input. Additionally, testing scripts in controlled environments before deployment prevents unintended consequences. Continuous learning from resources like the Networkers Home Blog helps avoid pitfalls and develop best practices for shell scripting.

Ready to Master Linux Administration?

Join 45,000+ students at Networkers Home. CCIE-certified trainers, 24x7 real lab access, and 100% placement support.

Explore Course