Passing an argument to multiple commands in a single line

I’d like to be able to run multiple commands on the same file in a single line. The way I currently do this is:

commandA file && commandB file && perl -ne '...' file

My gut instinct tells me that there should be a way to provide the filename argument only once and pipe it to all the commands simultaneously through xargs or something similar:

find file | xargs commandA && xargs commandB && xargs perl -ne '...'

When I try this, only the first command runs. How can I achieve what I want to do?

Asked By: Zaid


You can define a local variable for this:

f=file; commandA $f && commandB $f && ...

You can also execute all unconditionally (replacing && with ;) or in parallel (replacing && with &).
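
For example, with wc -l and md5sum standing in for commandA and commandB (names chosen purely for illustration):

f=/etc/passwd; wc -l "$f"; md5sum "$f"          # both run regardless of exit status
f=/etc/passwd; wc -l "$f" & md5sum "$f" & wait  # both run in parallel; wait for them to finish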

Alternatively, you can use shell history expansion to reference an argument given earlier on the same line:

commandA file && commandB !#:1 && ...
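
A concrete interactive example, with md5sum and wc -l as placeholder commands (history expansion numbers words from 0, so :1 is the first argument):

md5sum /etc/passwd && wc -l !#:1

Here !#:1 expands to /etc/passwd before the line is executed.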
Answered By: maxschlepzig

For shells such as Bash, ksh, and zsh that support process substitution, you can do this:

find file | tee >(xargs commandA) >(xargs commandB) >(xargs perl -ne '...')
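
A concrete sketch with ordinary commands substituted in (wc -l and md5sum are placeholders; tee's own copy of the output is discarded):

echo /etc/passwd | tee >(xargs wc -l) >(xargs md5sum) > /dev/null

The branches run concurrently, so their output may interleave in any order.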
Answered By: Dennis Williamson

I wouldn’t vote for this myself. It’s silly and dangerous, but just in the interest of listing the ways to do this, there’s:

for cmd in "commandA" "commandB" "perl -ne '...'" ; do eval $cmd file ; done
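
The same loop with real commands substituted in (wc -l and md5sum are only illustrative):

for cmd in "wc -l" "md5sum"; do eval "$cmd" /etc/passwd; done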

Answered By: frabjous

You can use xargs to construct a command line, e.g.:

echo file | xargs -i -- echo 'ls -l {}; wc -l {}'

Just pipe the above into bash to run it:

echo file | xargs -i -- echo 'ls -l {}; wc -l {}' | bash

Extending the example to all the *.c files in the current directory (escaping the ls here to prevent any shell alias substitution):

\ls -1 *.c | xargs -i -- echo 'ls -l {}; wc -l {}' | bash
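
With the quoting above, the intermediate echo stage emits one compound command per file; for two hypothetical files a.c and b.c it produces

ls -l a.c; wc -l a.c
ls -l b.c; wc -l b.c

which bash then executes line by line.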
Answered By: frielp