exec redirects in bash

I write a lot of non-interactive scripts where I would like all output to go to a log file and have nothing appear on-screen.

To solve this, I’ve been using:


exec &> logfile
echo "Step One:"
echo "Step Two:"

I want to make sure this is a sane method.

Are there any significant drawbacks or issues I’m going to encounter if I move forward with this methodology? If so, what are they, and how can they best be mitigated (including by changing my methodology)?

Asked By: Rob


Redirection of command output to a log file

Redirecting all command output (including error messages) to a log file
is standard practice for non-interactive shell scripts.  
It’s particularly useful to have a record of command output for scripts
that are run by cron or triggered by some other external event,
and there are no downsides in such use-cases.

Many of my shell scripts include the following lines near the start:

exec 1>>"$logfile"
exec 2>&1

The order of these redirection commands is important. The first exec command redirects all writes to the stdout (1) stream so that they append (>>) to the log file. The second command redirects all writes to the stderr (2) stream to the same file descriptor that stdout (1) currently points to. Using only one file descriptor to access the file ensures that the writes happen in the desired order.
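The effect of the ordering can be seen in a small sketch (the log file path here is just a throwaway created with mktemp, and each experiment runs in a subshell so the redirections don’t leak out):

```shell
logfile=$(mktemp)

# Correct order: stdout is pointed at the log first, then stderr follows stdout,
# so both streams end up in the file.
(
    exec 1>>"$logfile"
    exec 2>&1
    echo "to stdout"
    echo "to stderr" >&2
)

# Reversed order: stderr is duplicated from the *original* stdout before stdout
# is redirected, so error messages still go to the terminal, not the log.
(
    exec 2>&1
    exec 1>>"$logfile"
    echo "only stdout lands in the log"
    echo "errors still go to the original stdout" >&2
)
```

After the first subshell, both lines are in the log; after the second, only the stdout line is.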

If using Bash, you can combine these commands into a single construct that does the same thing:

exec &>>"$logfile"

If you want the log file to be cleared of previous entries each time the script is run, use a single > redirection operator instead, which overwrites the previous contents:

exec &>"$logfile"

Use of the exec builtin for input/output redirection is specified by the POSIX definition for the Shell Command Language,
and the exec builtin is available in any POSIX compatible shell.
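Since &>> is a Bash extension, a portable sketch of the same redirection for any POSIX shell looks like this (the log file path is just an example, and the demo runs in a subshell so the redirection doesn’t persist):

```shell
logfile=$(mktemp)

(
    # Portable POSIX equivalent of Bash's `exec &>>logfile`:
    # append stdout to the log, then point stderr at stdout.
    exec >>"$logfile" 2>&1

    echo "a stdout line"
    echo "a stderr line" >&2
)
```

Both lines end up in the log file, in the order they were written.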

Redirection while running an interactive shell

You can experiment with redirecting standard output to a file while in a temporary/disposable interactive shell session. After running exec 1>outfile, all future commands print their output to outfile instead of to the terminal.

You can also experiment with redirecting standard error in an interactive shell session, but it can make the interactive shell session very hard to work with.

After running exec 2>errorfile, the standard error produced by any further commands is written to the redirected error file – as expected. The problem is that, from then on, the shell prints its prompt to this file as well, since interactive shells write their prompts to standard error. Some shells (such as Bash) also echo the characters they read from standard input to standard error, so any text typed as a command is redirected to the file too. In others, such as dash, nothing at all is sent to the terminal for the rest of the session, so you’re essentially working blind. Either way, it becomes very difficult to continue interacting with the shell.

As Orion points out and Scott says,
you can store references to the default stdout and stderr file descriptors
before trying any such experiments by running exec 3>&1 and exec 4>&2.
When you’ve finished your experiments,
you can restore printing to standard error by running exec 2>&4
and restore printing to standard output with exec 1>&3.
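Put together, the save-and-restore dance looks like this sketch (again using a throwaway log file):

```shell
logfile=$(mktemp)

exec 3>&1 4>&2           # fd 3 and 4 now remember the original streams
exec 1>"$logfile" 2>&1   # from here on, everything goes to the log

echo "this line is logged"

exec 1>&3 2>&4           # restore the original stdout and stderr
exec 3>&- 4>&-           # close the spare descriptors; no longer needed

echo "this line is back on the original stdout"
```

Closing the spare descriptors at the end is good hygiene, as otherwise any child processes would inherit them.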

For interactive use,
I’d advise redirecting standard out and standard error streams
on a command-by-command basis:
command with arguments >> outfile 2>&1
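For example, to capture both streams of a single command (or command group) while leaving the rest of the session on the terminal – the log path here is again just a throwaway:

```shell
logfile=$(mktemp)

# Only this command group's output goes to the log; the session is unaffected.
{ echo "normal output"; echo "error output" >&2; } >>"$logfile" 2>&1

echo "still on the terminal"
```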

Answered By: Anthony Geoghegan