
Bash Scripting Tips I Wish I Knew Earlier

Practical bash scripting patterns and best practices that make your scripts more robust, readable, and maintainable.

Tags: bash, linux, scripting, devops


After years of writing bash scripts for system administration, build pipelines, and automation, I've accumulated a collection of patterns that I wish someone had told me about from the start.

Always Start With Strict Mode

#!/usr/bin/env bash
set -euo pipefail

What this does:

  • set -e: Exit immediately if a command fails
  • set -u: Treat unset variables as errors
  • set -o pipefail: A pipeline fails if any command in it fails (not just the last one)

Without this, scripts silently continue after errors, leading to subtle bugs.
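To see what pipefail changes in practice, here is a small demo (run without set -e so the script survives to print both statuses):

```shell
#!/usr/bin/env bash
# Without pipefail: a pipeline's exit status is that of the LAST command,
# so the failure of `false` is masked by the success of `cat`.
false | cat
echo "plain exit status: $?"      # 0 — the failure was silently swallowed

# With pipefail: the pipeline fails if ANY stage fails.
set -o pipefail
false | cat
echo "pipefail exit status: $?"   # 1 — the failure is now visible
```

Combined with set -e, that second pipeline would stop the script instead of letting it continue on bad data.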

Use Functions

Structure your scripts with functions. It improves readability, testability, and reusability:

#!/usr/bin/env bash
set -euo pipefail

log() {
    echo "[$(date '+%Y-%m-%d %H:%M:%S')] $*" >&2
}

die() {
    log "ERROR: $*"
    exit 1
}

check_dependencies() {
    local deps=("docker" "git" "curl")
    for dep in "${deps[@]}"; do
        command -v "$dep" &>/dev/null || die "$dep is required but not installed"
    done
}

main() {
    check_dependencies
    log "Starting deployment..."
    # ... your logic here
    log "Deployment complete"
}

main "$@"

The main "$@" pattern at the bottom ensures all functions are defined before execution begins.
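A common variant of this pattern (sketched below with a hypothetical greet function) guards the main call so the file can also be sourced by a test harness without executing anything:

```shell
#!/usr/bin/env bash
set -euo pipefail

greet() {
    echo "hello, ${1:-world}"
}

main() {
    greet "$@"
}

# Run main only when the script is executed directly;
# when sourced (e.g. from a test file), only the
# function definitions are loaded.
if [[ "${BASH_SOURCE[0]}" == "$0" ]]; then
    main "$@"
fi
```

Sourcing the script then lets you call greet in isolation, which is handy for unit-testing individual functions.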

Variable Handling

Always Quote Variables

# Bad — breaks with spaces in filenames
rm $file

# Good
rm "$file"

# Even in conditionals
if [[ -f "$config_file" ]]; then
    source "$config_file"
fi

Default Values

# Use default if variable is unset or empty
name="${1:-default_name}"

# Use default only if variable is unset
name="${1-default_name}"

# Assign default to variable
: "${DEPLOY_ENV:=production}"

String Operations

file="archive.tar.gz"

echo "${file%.gz}"      # archive.tar (remove shortest suffix match)
echo "${file%%.*}"      # archive (remove longest suffix match)
echo "${file#*.}"       # tar.gz (remove shortest prefix match)
echo "${file##*.}"      # gz (remove longest prefix match)
echo "${file^^}"        # ARCHIVE.TAR.GZ (uppercase, bash 4+)
echo "${file,,}"        # archive.tar.gz (lowercase, bash 4+)
echo "${#file}"         # 14 (length)

Arrays Done Right

# Declare
files=("file1.txt" "file with spaces.txt" "file3.txt")

# Iterate (preserves elements with spaces)
for file in "${files[@]}"; do
    echo "Processing: $file"
done

# Length
echo "Count: ${#files[@]}"

# Append
files+=("file4.txt")

# Slice
subset=("${files[@]:1:2}")  # elements 1 and 2

Temp Files and Cleanup

# Create temp file/directory
tmpfile=$(mktemp)
tmpdir=$(mktemp -d)

# Ensure cleanup on exit (even on error)
cleanup() {
    rm -rf "$tmpfile" "$tmpdir"
}
trap cleanup EXIT

# Now use $tmpfile and $tmpdir freely

The trap ... EXIT is crucial — it runs the cleanup function regardless of how the script exits.
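Putting the pieces together, a minimal self-contained sketch:

```shell
#!/usr/bin/env bash
set -euo pipefail

tmpdir=$(mktemp -d)

cleanup() {
    rm -rf "$tmpdir"
}
trap cleanup EXIT

# Work in the scratch directory; even if a later command
# fails and `set -e` aborts the script, the EXIT trap
# still removes $tmpdir.
echo "scratch data" > "$tmpdir/work.txt"
```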

Process Substitution

Avoid temporary files when piping between commands:

# Compare two command outputs
diff <(sort file1.txt) <(sort file2.txt)

# Read from a command as if it were a file
while IFS= read -r line; do
    echo "Container: $line"
done < <(docker ps --format '{{.Names}}')

Safer File Operations

# Check before acting
[[ -d "$dir" ]] || mkdir -p "$dir"
[[ -f "$file" ]] || die "File not found: $file"
[[ -r "$file" ]] || die "File not readable: $file"

# Atomic writes (write to temp, then move)
# Create the temp file next to the target so the mv is an
# atomic rename on the same filesystem — mktemp's default
# location (/tmp) may be a different filesystem.
tmpfile=$(mktemp "${config_file}.XXXXXX")
generate_config > "$tmpfile"
mv "$tmpfile" "$config_file"

Parallel Execution

# Run tasks in parallel and wait for all
pids=()
for host in "${hosts[@]}"; do
    deploy_to "$host" &
    pids+=($!)
done

# Wait for all and check results
failed=0
for pid in "${pids[@]}"; do
    if ! wait "$pid"; then
        failed=$((failed + 1))  # not ((failed++)): that returns status 1 when failed is 0, tripping set -e
    fi
done

[[ $failed -eq 0 ]] || die "$failed deployments failed"

Useful Patterns

Retry Logic

retry() {
    local max_attempts=$1
    shift
    local attempt=1

    until "$@"; do
        if ((attempt >= max_attempts)); then
            log "Failed after $max_attempts attempts: $*"
            return 1
        fi
        log "Attempt $attempt failed, retrying in $((attempt * 2))s..."
        sleep $((attempt * 2))
        ((attempt++))
    done
}

retry 3 curl -sSf "https://api.example.com/health"

Confirmation Prompt

confirm() {
    local prompt="${1:-Are you sure?}"
    read -r -p "$prompt [y/N] " response
    [[ "$response" =~ ^[Yy]$ ]]
}

if confirm "Deploy to production?"; then
    deploy
fi

ShellCheck

Always run ShellCheck on your scripts. It catches common bugs that are easy to miss:

# Install
sudo pacman -S shellcheck  # Arch
sudo apt install shellcheck  # Debian/Ubuntu

# Run
shellcheck my_script.sh
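For instance, ShellCheck's SC2086 warns about unquoted expansions — the word-splitting bug from the quoting section above. A quick illustration of what that bug actually does:

```shell
#!/usr/bin/env bash
file="two words.txt"

# Unquoted — exactly what SC2086 flags: the value word-splits
# into "two" and "words.txt".
# shellcheck disable=SC2086
set -- $file
echo "unquoted: $# arguments"   # unquoted: 2 arguments

# Quoted: the value stays a single argument.
set -- "$file"
echo "quoted: $# arguments"     # quoted: 1 arguments
```

Running ShellCheck on code like the unquoted line (without the disable directive) reports SC2086 and suggests double-quoting.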

Conclusion

Bash scripting doesn't have to be fragile. With strict mode, proper quoting, structured functions, and cleanup traps, you can write scripts that are reliable and maintainable. The investment in learning these patterns pays off every time a script runs without surprises.