Last Updated: January 31, 2023
phs

Exponential Backoff in Bash

Ever need to download ten thousand files in a script? Writing loops in shell is easy enough, but dealing with intermittent failures can be less so.

A standard technique is exponential backoff: sleep between attempts, with each sleep duration growing by some scheme (usually geometrically) as more attempts fail. This pattern can be extracted into a reusable snippet:

# Retries a command with backoff.
#
# The retry count is given by ATTEMPTS (default 5), the
# initial backoff timeout is given by TIMEOUT in seconds
# (default 1).
#
# Successive backoffs double the timeout.
#
# Beware of set -e killing your whole script!
function with_backoff {
  local max_attempts=${ATTEMPTS-5}
  local timeout=${TIMEOUT-1}
  local attempt=0
  local exitCode=0

  # Use arithmetic evaluation: inside [[ ]], < compares strings
  # lexicographically, which breaks for ATTEMPTS of 10 or more.
  while (( attempt < max_attempts ))
  do
    "$@"
    exitCode=$?

    if (( exitCode == 0 ))
    then
      break
    fi

    echo "Failure! Retrying in $timeout seconds..." 1>&2
    sleep "$timeout"
    attempt=$(( attempt + 1 ))
    timeout=$(( timeout * 2 ))
  done

  if (( exitCode != 0 ))
  then
    echo "You've failed me for the last time! ($*)" 1>&2
  fi

  return $exitCode
}
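
The set -e warning in the comment is worth taking seriously: under set -e, a with_backoff call that exhausts its retries returns nonzero and aborts the whole script. A minimal sketch of the usual guard pattern, with `false` standing in for a failing with_backoff invocation:

```shell
# Under `set -e`, an unguarded failing command kills the script.
# Capturing the exit status in a compound command sidesteps that,
# because commands in a || list do not trigger set -e.
set -e
status=0
false || status=$?
echo "script continues, exit status was $status"
```

The same shape works for the real thing: `with_backoff curl "$url" || status=$?`, after which the script can inspect `$status` and decide what to do.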

Note that bash arithmetic is integer-only, hence the sleep time is simply doubled each pass rather than scaled by some fractional factor. Here's an example use:

with_backoff curl 'http://monkeyfeathers.example.com/'
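
With the defaults (ATTEMPTS=5, TIMEOUT=1), the function sleeps after every failed attempt, so a command that never succeeds spends 1 + 2 + 4 + 8 + 16 = 31 seconds sleeping before giving up. A quick check of that arithmetic:

```shell
# Sum the five doubling sleeps the defaults produce: 1+2+4+8+16.
total=0
t=1
for _ in 1 2 3 4 5; do
  total=$(( total + t ))
  t=$(( t * 2 ))
done
echo "worst-case sleep: ${total}s"
```

The defaults can also be tuned per call by setting the variables inline, e.g. `ATTEMPTS=3 TIMEOUT=2 with_backoff curl 'http://monkeyfeathers.example.com/'`.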

Reposted from SO: http://stackoverflow.com/a/8351489/580412