How can I send stdout to multiple commands?

Some commands filter or act on input and then pass it along as output, usually to stdout – but other commands just take stdin, do whatever they do with it, and output nothing.

I’m most familiar with OS X, so the two that come to mind immediately are pbcopy and pbpaste – the means of accessing the system clipboard.

Anyhow, I know that if I want to send stdout to both the console and a file, I can use the tee command. And I know a little about xargs, but I don’t think that’s what I’m looking for.

I want to know how I can split stdout to go between two (or more) commands. For example:

cat file.txt | stdout-split -c1 pbcopy -c2 grep -i errors

There is probably a better example than that one, but I really am interested in knowing how I can send stdout to a command that does not relay it, while keeping stdout from being “muted” – I’m not asking about how to cat a file and grep part of it and copy it to the clipboard – the specific commands are not that important.

Also – I’m not asking how to send this to a file and stdout. This may be a “duplicate” question (sorry), but the similar questions I could find were asking about how to split output between stdout and a file – and the answers to those seemed to be tee, which I don’t think will work for me.

Finally, you may ask “why not just make pbcopy the last thing in the pipe chain?” and my response is 1) what if I want to use it and still see the output in the console? 2) what if I want to use two commands which produce no output of their own after they process the input?

Oh, and one more thing – I realize I could use tee and a named pipe (mkfifo) but I was hoping for a way this could be done inline, concisely, without a prior setup 🙂

Asked By: cwd


Capture the command’s stdout in a variable and re-use it as many times as you like:

commandoutput="$(command-to-run)"
echo "$commandoutput" | grep -i errors
echo "$commandoutput" | pbcopy

If you need to capture STDERR too, then use 2>&1 at the end of the command, like so:

commandoutput="$(command-to-run 2>&1)"
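One caveat worth knowing with this approach: command substitution strips all trailing newlines, so the variable is not always a byte-for-byte copy of the output. A quick sketch:

```shell
# $(...) removes every trailing newline from the captured output:
out="$(printf 'data\n\n\n')"
[ "$out" = "data" ] && echo "trailing newlines were stripped"
```

For line-oriented reuse (grep, pbcopy) this rarely matters, but it does if you need the exact bytes.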
Answered By: laebshade

You can use tee and process substitution for this:

cat file.txt | tee >(pbcopy) | grep errors

This will send all the output of cat file.txt to pbcopy, and you’ll only get the result of grep on your console.

You can put multiple processes in the tee part:

cat file.txt | tee >(pbcopy) >(do_stuff) >(do_more_stuff) | grep errors
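If you want to try this pattern without pbcopy, here is a self-contained sketch with placeholder commands (wc standing in for a “silent” consumer); it needs a shell with process substitution, such as bash, zsh, or ksh93:

```shell
# Fan one stream out to a side consumer while grep filters on stdout.
# The line count goes to stderr; only the matching line reaches stdout.
printf 'error: disk full\nall good\n' | tee >(wc -l >&2) | grep -i error
```

Only “error: disk full” appears on stdout; the count from wc -l arrives on stderr.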
Answered By: Mat

Just play with process substitution.

mycommand_exec | tee >(grep ook > ook.txt) >(grep eek > eek.txt)

Here, the two grep processes each receive the output of mycommand_exec as their own input.

Answered By: Nikhil Mulley

You can specify multiple file names to tee, and in addition the standard output can be piped into one command. To dispatch the output to multiple commands, you need to create multiple pipes and specify each of them as one output of tee. There are several ways to do this.

Process substitution

If your shell is ksh93, bash or zsh, you can use process substitution. This is a way to pass a pipe to a command that expects a file name. The shell creates the pipe and passes a file name like /dev/fd/3 to the command. The number is the file descriptor that the pipe is connected to. Some unix variants do not support /dev/fd; on these, a named pipe is used instead (see below).

tee >(command1) >(command2) | command3

File descriptors

In any POSIX shell, you can use multiple file descriptors explicitly. This requires a unix variant that supports /dev/fd, since all but one of the outputs of tee must be specified by name.

{ { { tee /dev/fd/3 /dev/fd/4 | command1 >&9;
    } 3>&1 | command2 >&9;
  } 4>&1 | command3 >&9;
} 9>&1
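To make the plumbing concrete, here is the same construct with real commands substituted in for command1/command2/command3 (grep -c is used for counting so the output has no platform-dependent padding); all three results land on the original stdout, in an unpredictable order since the commands run concurrently:

```shell
# Three consumers of one stream: a total line count and two pattern counts.
# fd 9 is the saved original stdout; fds 3 and 4 feed the extra pipes.
{ { { printf 'a\nb\nc\n' | tee /dev/fd/3 /dev/fd/4 | grep -c . >&9;
    } 3>&1 | grep -c a >&9;
  } 4>&1 | grep -c b >&9;
} 9>&1
```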

Named pipes

The most basic and portable method is to use named pipes. The downside is that you need to find a writable directory, create the pipes, and clean up afterwards.

tmp_dir=$(mktemp -d)
mkfifo "$tmp_dir/f1" "$tmp_dir/f2"
command1 <"$tmp_dir/f1" & pid1=$!
command2 <"$tmp_dir/f2" & pid2=$!
tee "$tmp_dir/f1" "$tmp_dir/f2" | command3
rm -rf "$tmp_dir"
wait $pid1 $pid2
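A concrete, runnable instance of the same recipe (grep -c and grep are stand-ins for command1/command2, and their results are collected in files so they can be printed in a fixed order):

```shell
# Named-pipe fan-out, POSIX sh compatible: one writer, two readers.
tmp_dir=$(mktemp -d)
mkfifo "$tmp_dir/f1" "$tmp_dir/f2"
grep -c . <"$tmp_dir/f1" >"$tmp_dir/count" & pid1=$!
grep b    <"$tmp_dir/f2" >"$tmp_dir/match" & pid2=$!
printf 'a\nb\nc\n' | tee "$tmp_dir/f1" "$tmp_dir/f2" >/dev/null
wait $pid1 $pid2
cat "$tmp_dir/count" "$tmp_dir/match"   # 3, then b
rm -rf "$tmp_dir"
```

Note that each reader blocks on opening its fifo until tee opens it for writing, so starting the readers in the background first avoids a deadlock.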

If you are using zsh then you can take advantage of the MULTIOS feature, i.e. get rid of the tee command completely:

uname >file1 >file2

will just write the output of uname to two different files, file1 and file2, which is equivalent to uname | tee file1 >file2

Similarly, redirection of standard input:

wc -l <file1 <file2

is equivalent to cat file1 file2 | wc -l (note that this is not the same as wc -l file1 file2; the latter counts the number of lines in each file separately).

Of course you can also use MULTIOS to redirect output not to files but to other processes, using process substitution, e.g.:

echo abc > >(grep -o a) > >(tr b x) > >(sed 's/c/y/')
Answered By: jimmij

This may be of use: http://www.spinellis.gr/sw/dgsh/ (directed graph shell)
It seems to be a bash replacement that supports an easier syntax for “multipipe” commands.

Answered By: sivann

For reasonably small output produced by a command, we can redirect the output to a temporary file, and feed that temporary file to each command in a loop. This can be useful when the order in which the commands run matters.

The following script, for example, does that:

#!/bin/sh

temp=$( mktemp )
cat /dev/stdin > "$temp"

for arg
do
    eval "$arg" < "$temp"
done
rm "$temp"

Test run on Ubuntu 16.04, where /bin/sh is dash:

$ cat /etc/passwd | ./multiple_pipes.sh  'wc -l'  'grep "root"'                                                          
48
root:x:0:0:root:/root:/bin/bash
Answered By: Sergiy Kolodyazhnyy

Here’s a quick-and-dirty partial solution, compatible with any shell including busybox.

The narrower problem it solves is: print the complete stdout to one console, and filter it on another one, without temporary files or named pipes.

  • Start another session to the same host. To find out its TTY name, type tty. Let’s assume it’s /dev/pts/2.
  • In the first session, run the_program | tee /dev/pts/2 | grep ImportantLog:

You get one complete log, and a filtered one.

Answered By: Victor Sergienko

There is also pee from the moreutils package. It is designed for exactly this:

pee 'command1' 'command2' 'cat -'
Answered By: Xorax

Another take on this:

$ cat file.txt | tee >(head -1 1>&2) | grep foo

This works by passing tee a process substitution as its file argument; that process is head, which prints only the first line (the header) and redirects its own output to stderr (so that it remains visible alongside grep’s output).
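A self-contained variant you can run anywhere with bash (the header and data lines are made up for illustration):

```shell
# The header goes to stderr via the process substitution;
# stdout carries only the grep matches.
printf 'NAME VALUE\nfoo 1\nbar 2\n' | tee >(head -n 1 >&2) | grep foo
```

On stdout you get just “foo 1”, while “NAME VALUE” is printed to stderr.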

Answered By: Anthony