Execute a command once per line of piped input?

I want to run a java command once for every match of ls | grep pattern. In this case, I think I could do find . -name 'pattern' -exec java MyProg '{}' \; but I’m curious about the general case – is there an easy way to say “run a command once for every line of standard input”? (In fish or bash.)

Asked By: Xodarap


That’s what xargs does.

... | xargs command
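For instance, a minimal sketch of the default behavior (note that xargs splits input on any whitespace and, by default, batches as many arguments as fit into one invocation):

```shell
# Each whitespace-separated token from stdin becomes an argument to echo;
# all three fit into a single echo invocation here.
printf '%s\n' one two three | xargs echo
# -> one two three
```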
Answered By: Keith

In Bash or any other Bourne-style shell (ash, ksh, zsh, …):

while read -r line; do command "$line"; done

read -r reads a single line from standard input (read without -r interprets backslashes; you don’t want that). Thus you can do either of the following:

$ command | while read -r line; do command "$line"; done  

$ while read -r line; do command "$line"; done <file
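A quick sketch of why -r matters: without it, read treats backslashes as escape characters and strips them from the input.

```shell
# printf 'a\\b\n' emits the literal three characters: a \ b
printf 'a\\b\n' | { read line;    echo "without -r: $line"; }  # -> without -r: ab
printf 'a\\b\n' | { read -r line; echo "with -r:    $line"; }  # -> with -r:    a\b
```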
Answered By: Steven D

I agree with Keith, xargs is the most general tool for the job.

I usually use a 3 step approach.

  • do the basic stuff until you have something you would like to work with
  • prepare the line with awk so it gets the correct syntax
  • then let xargs execute it, maybe with the help of bash.

There are smaller and faster ways, but this way almost always works.

A simple example:

ls | 
grep xls | 
awk '{print "MyJavaProg --arg1 42 --arg2 " $1}' | 
xargs -0 bash -c

The first two lines select some files to work with,
then awk prepares a string with a command to execute and some arguments; $1 is the first column of input from the pipe.
And finally I make sure that xargs sends this string to bash, which just executes it.

It is a little bit overkill, but this recipe has helped me in a lot of places since it is very flexible.

Also note that you can do xargs -0 -n1 bash -c (just adding the -n1 flag suggested by Michael Goldshteyn) to execute the command once for each line of output.
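One caveat with the recipe above: -0 tells xargs to expect NUL-delimited input, so feeding it plain newline-terminated strings relies on the whole stream arriving as one argument. A sketch of a variant (with hypothetical file names, and echo keeping it a dry run) that genuinely runs one shell per generated command converts the newlines to NULs first:

```shell
# Build one command string per line, convert newlines to NULs,
# then have xargs hand each string to its own bash -c invocation.
printf '%s\n' file1.xls file2.xls |
awk '{print "echo MyJavaProg --arg1 42 --arg2 " $1}' |
tr '\n' '\0' |
xargs -0 -n1 bash -c
# -> MyJavaProg --arg1 42 --arg2 file1.xls
#    MyJavaProg --arg1 42 --arg2 file2.xls
```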

Answered By: Johan

When dealing with potentially unsanitized inputs, I like to see the entire job ‘spelled out’ line by line for visual inspection before I run it (especially when it’s something destructive like cleaning people’s mailboxes).

So what I do is generate a list of parameters (ie. usernames), feed it to a file in one-record-per-line fashion, like this:

johndoe
jamessmith
Then I open the list in vim, and mangle it with search and replace expressions until I get a list of full commands that need to get executed, like this:

/bin/rm -fr /home/johndoe  
/bin/rm -fr /home/jamessmith 

This way, if your regex is incomplete, you will see which commands will have potential problems (ie. /bin/rm -fr johnnyo connor). You can then undo your regex and try again with a more reliable version of it. Name mangling is notorious for this, because it’s hard to take care of all the edge cases, like Van Gogh, O’Connor, St. Clair, Smith-Wesson.

Having set hlsearch is useful for doing this in vim, as it will highlight all the matches, so you can easily spot if it doesn’t match, or matches in an unintended way.

Once your regex is perfect and it catches all the cases you can test for/think of, then I usually convert it to a sed expression so it can be fully automated for another run.

For cases where the number of lines of input prevents you from doing a visual inspection, I highly recommend echoing the command to the screen (or better yet, a log) before it executes, so if it errors out, you know exactly which command caused it to fail. Then you can go back to your original regex and adjust once more.
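As a runnable sketch of that workflow (user names and file names here are hypothetical, and a leading echo keeps the job a harmless dry run), the generated commands can be collected in a file, inspected, and then executed with tracing so each command is printed as it runs:

```shell
# Generate one command per line into a job file; the leading "echo"
# makes this a dry run that only prints what would be executed.
printf '%s\n' johndoe jamessmith |
  sed 's|^|echo /bin/rm -fr /home/|' > cleanup.sh

# Review cleanup.sh in your editor, then run it; -x traces each line.
sh -x cleanup.sh
```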

Answered By: Marcin

GNU Parallel is made for this kind of task. The simplest usage is:

cat stuff | grep pattern | parallel java MyProg

Watch the intro video to learn more: http://www.youtube.com/watch?v=OpaiGYxkSuQ

Answered By: Ole Tange

Also, a while read loop works in the fish shell (which I assume you want, considering the tag you used).

command | while read line
    command $line
end

Few points to note.

  • read doesn’t take an -r argument, and it doesn’t interpret your backslashes, in order to make the most common use case easy.
  • You don’t need to quote $line, as unlike bash, fish doesn’t separate variables by spaces.
  • command by itself is a syntax error (to catch such use of placeholder arguments). Replace it with the real command.
Answered By: Konrad Borowski

The accepted answer has the right idea, but the key is to pass xargs the -n1 switch, which means "Use at most 1 argument per command line":

cat file... | xargs -n1 command

Or, for a single input file you can avoid the pipe from cat entirely and just go with:

<file xargs -n1 command
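A sketch of the difference -n1 makes: by default xargs batches arguments into as few invocations as possible, while -n1 forces one invocation per argument.

```shell
printf '%s\n' a b c | xargs echo      # one invocation:  prints "a b c"
printf '%s\n' a b c | xargs -n1 echo  # three invocations: "a", "b", "c"
```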

Updated 2020-08-05:

I would also like to respond to the advice found in user Jander’s comment, which was heavily upvoted despite containing some amount of misinformation, as I will now explain.

Do not be so hasty to recommend the -L option of xargs without mentioning the trouble that its (so-called) trailing-blanks feature can lead to. In my opinion this switch causes more harm than good, and it is certainly a stretch to use it, in the case of -L 1, to mean “act on one non-empty line at a time.” To be fair, the man page for xargs does spell out the features (i.e., issues) that come for the ride with the -L switch.

Since Jander made no mention of these issues when suggesting -L to a perhaps hasty, unsuspecting StackOverflow audience seeking quick tips, without the time for such tedious things as reading man pages and instead accepting comments and answers as gospel, I will now present my case for why -L is a very bad suggestion without a careful understanding of all of the baggage that it brings along for the ride.

To illustrate my disdain for -L, let’s consider a simple input file that consists of the following carelessly entered text (perhaps the work of a high-school summer intern who created this data file as part of his/her training, as evidenced by its Windows-ish filename; as luck (karma?) would have it, you have been selected by management to be its new custodian):

1
2␠
3

Because the line that contains the digit 2 has a trailing space character (shown as a Unicode SYMBOL FOR SPACE glyph after the digit 2 in the preceding code, in case your browser’s font does not have a visual representation for this character), a command that uses xargs -L1, such as:

<testdata.txt xargs -L1 echo

…, would produce the following (perhaps surprising) output:

1
2 3

This is caused by the fact that the -L switch instructs xargs to append subsequent lines to those that end with blanks, a behavior that may only affect the resulting output in those oddball moments where lines are not properly trimmed of trailing blanks – a time-bomb bug waiting for the right input file to present itself.

On the other hand, the same command using the -n 1 switch of xargs instead of -L 1 would produce the far more acceptable output of:

1
2
3
And that’s not even the worst of it! The -L switch, unlike -n, forces the "dreaded" -x option of xargs into effect. This causes termination of the xargs process if it encounters a command line that it deems too long for the environment on which it is run.

An input file consisting of many successive lines with trailing blanks could, as instructed by the -L switch with the chemical agent known as Agent -x thrown into the mix, potentially cause xargs to terminate midstream, if the concatenation of all of these lines into one superline exceeds xargs‘ definition of “too long for a command line.” If things are beginning to look murky, consider that “too long” is a size determined by xargs based on the maximum length specified for the platform on which it is run, further offset by a seemingly arbitrary constant, as explained in more detail in the man page. Remember those pesky indefinite integrals from Calculus with their arbitrary constants, and losing a point on a quiz or test because you forgot to write + C after your solution? Well, that phrase is back with a vengeance, to bite you on the rear once again, if you add -L to your handy xargs toolkit.

A -n value of 1, on the other hand, would just chop up those long lines into (hopefully) small bite sized one-line chunks and execute the command supplied to xargs for each of them, one at a time, without any consideration given to whether they end with blanks or not. No more long lines and no more of xargs stabbing you in the back by terminating abruptly – Et tu, Brute -x?
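The trailing-blank behavior described above can be reproduced with a tiny three-line file (a sketch; the -L continuation-on-trailing-blank behavior shown is that of GNU xargs):

```shell
# The line holding "2" ends with a space, so -L1 glues the next line onto it.
printf '1\n2 \n3\n' > testdata.txt
xargs -L1 echo < testdata.txt   # -> "1" then "2 3"
xargs -n1 echo < testdata.txt   # -> "1", "2", "3"
```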

An optional segue regarding phrasing in the xargs man page

I don’t know why the ambiguous and non-standard word blanks was used throughout the xargs man page, instead of far better defined and less ambiguous options such as:

  • space(s), if blanks means one or more ASCII space characters
  • whitespace(s) other than new-lines (if that’s what blanks implies)
  • one or more non-printable characters from the set: {space, horizontal tab} (if blanks was used as a synonym for this gruesome twosome)

Updated 2021-06-15:

User @BjornW asked how xargs could be used to run a command once per line of input, and not just once per word of input. (See, I do read the comments, and I’ll just blame the seven months it took to respond on Covid 😛 ).

In the spirit of the original question, as asked, and to make my answer applicable to a greater number of use cases, I would like to address this particular scenario in detail.

Consider the following input file. It is chock-full of various edge cases one may actually encounter in the Real World™ (e.g., leading/trailing spaces, lines consisting only of spaces, empty lines, lines beginning with a hyphen [which should not get interpreted as the introduction of a switch], etc.):


a1 a22 a333 a4444
b4444 b333 b22 b1
␠␠␠c d e f g
␣
hhh
ii jj kk␠
␣
␠␠␠
-L and -x are the gruesome twosome
␣
␣
␣

In the preceding input file, the Unicode character OPEN BOX U+2423 was used to mark empty lines and the Unicode SYMBOL FOR SPACE was used for leading and trailing spaces, in order to make them more prominent.

Let’s say we want to run a command on each line of input, taken as a whole and passed to our command as a single argument, regardless of content (including no content). We would go about this using xargs, as follows. (Note: printf will be our sample command, and the %q format specifier will be used in order to enclose the supplied argument in apostrophes for clarity, when spaces are present or the argument is an empty string. All in all, only our hhh input line was left "unscathed" by %q, as you will see in the output, presented shortly. Had any non-printable characters been present, they would have also gotten escaped by %q using the POSIX $'' quoting syntax.)

<lines.txt xargs -n1 -d'\n' printf -- 'Input line: %q\n'

The output is as follows:

Input line: 'a1 a22 a333 a4444'
Input line: 'b4444 b333 b22 b1'
Input line: '   c d e f g'
Input line: ''
Input line: hhh
Input line: 'ii jj kk '
Input line: ''
Input line: '   '
Input line: '-L and -x are the gruesome twosome'
Input line: ''
Input line: ''
Input line: ''

So, there you have it. Using the -d switch, we can specify the delimiter that xargs should look for in our input file to indicate where an argument ends and the next one begins. By setting it to '\n', which xargs itself is smart enough to interpret as a C-style character escape, as stated in the description of the -d switch on its man page, we can use xargs to forward entire lines of input to our command of choice as arguments, with minimal effort on our part.

I would like to also mention that xargs can be used to concatenate multiple lines of input (with a caveat I will detail at the end of this paragraph), for the rare cases when such behavior is desired, and forward them as a single argument to our command. This can be accomplished by setting the number passed to the -n switch of xargs in the above invocation to the number of lines of input that should get merged into a single argument, with their \n line endings removed as part of the process. Unfortunately, this newline-stripping behavior makes the aforementioned xargs approach unsuitable for many use cases, because the information indicating where one line ends and the next begins gets lost in the process.

Answered By: Michael Goldshteyn

If a program ignores the pipe but accepts files as arguments, then you can just point it to the special file /dev/stdin.

I am not familiar with java, but here is an example of how you would do it for bash:

$ echo $'pwd \n cd / \n pwd' | bash /dev/stdin

The $ is necessary for bash to translate \n into newlines: $'…' is Bash’s ANSI-C quoting syntax, which expands backslash escapes before the string is passed on.
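A quick sketch of the difference this quoting makes in Bash:

```shell
echo $'one\ntwo'   # ANSI-C quoting: \n becomes a real newline (two lines)
echo 'one\ntwo'    # ordinary single quotes: Bash prints backslash-n literally
```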

Answered By: Rolf

If you need to control where exactly the input argument is inserted into your command line or if you need to repeat it several times then you can use xargs -I{}.


Create an empty folder structure in another_folder that mirrors the subfolders in the current directory:

    ls -1d ./*/ | xargs -I{} mkdir another_folder/{}


Apply an operation on a file list coming from stdin, in this case make a copy of each .html file by appending a .bak extension:

    find . -iname "*.html" | xargs -I{} cp {} {}.bak

From the xargs man page for MacOS/BSD:

 -I replstr
         Execute utility for each input line, replacing one or more occurrences of
         replstr in up to replacements (or 5 if no -R flag is specified) arguments
         to utility with the entire line of input.  The resulting arguments, after
         replacement is done, will not be allowed to grow beyond 255 bytes; this is
         implemented by concatenating as much of the argument containing replstr as
         possible, to the constructed arguments to utility, up to 255 bytes.  The
         255 byte limit does not apply to arguments to utility which do not contain
         replstr, and furthermore, no replacement will be done on utility itself.
         Implies -x.

Linux xargs man page:

   -I replace-str
          Replace  occurrences of replace-str in the initial-
          arguments with names read from standard input.  Al‐
          so,  unquoted  blanks do not terminate input items;
          instead the separator  is  the  newline  character.
          Implies -x and -L 1.
Answered By: ccpizza

I prefer this – allowing multi-line commands and clear code:

find . -type f -name 'filename-pattern*' | while read -r F
do
  echo "$F"
  grep 'some text' "$F"
done

ref https://stackoverflow.com/a/3891678/248616

Answered By: Nam G VU

Here, a copypaste you can immediately use:

cat list.txt | xargs -I{} command parameter {} parameter

The item from the list will be put where the {} is and the rest of the command and parameters will be used as-is.
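A runnable sketch of that pattern (since -I implies one invocation per input line):

```shell
# {} is replaced by each input line; the surrounding words pass through as-is.
printf 'alpha\nbeta\n' | xargs -I{} echo before {} after
# -> before alpha after
#    before beta after
```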

Answered By: DustWolf

This is possible with xargs, as other answers indicated. We need to distinguish two specifics in the “once per line” part of the question:

  1. Once: Use -n 1; this ensures that the command is invoked exactly once for each argument. However, by default, xargs assumes that arguments are space-delimited — this will break once the files contain spaces.
  2. Per line: Use -d '\n', or preprocess the input with tr '\n' '\0' and use -0. This makes the command robust against spaces in the input.

The final command line then becomes:

.... | xargs -n 1 -d '\n' <command>

or (with tr)

.... | tr '\n' '\0' | xargs -n 1 -0 <command>

If your command can process multiple arguments at once (like grep or sed), you can omit -n 1 to speed up the operation in many cases.
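A sketch showing why the delimiter matters when arguments contain spaces (note that -d is a GNU xargs extension):

```shell
# Default whitespace splitting breaks "my file.txt" into two arguments...
printf 'my file.txt\nother.txt\n' | xargs -n1 echo
# ...while -d '\n' keeps each whole line intact as a single argument.
printf 'my file.txt\nother.txt\n' | xargs -n1 -d '\n' echo
```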

Answered By: krlmlr

This should work for everything,

  • including self-defined functions,
  • without spawning any additional processes, and
  • without removing spaces (which read otherwise does!).

Note the IFS= and -r, not included in any other answer:

mapp() { while IFS= read -r line; do "$1" "$line"; done; }

Here’s an example usage:

$ bla() { echo "  bla: $1"; }
$ echo -e "1\n2\n3\n4" | mapp bla
  bla: 1
  bla: 2
  bla: 3
  bla: 4

Here’s an alternative implementation that allows passing multiple arguments instead:

mapa() { f="$1"; shift; for x in "$@"; do "$f" "$x"; done; }


$ bla() { echo "  bla: $1"; }
$ mapa bla 1 2 3 4
  bla: 1
  bla: 2
  bla: 3
  bla: 4

Finally, some universal ones:

# Doesn’t care where the input comes from. (First parameters, then pipe lines.)
map() { f="$1"; shift
  if [[ $# -gt 0 ]]; then for x in "$@"; do "$f" "$x"; done; fi
  if [ ! -t 0 ] ; then while IFS= read -r line; do "$f" "$line"; done; fi
}

# Cross product
# (The parameters are the things to call,
#  the pipe takes values to call those things with.)
cross() {
  while IFS= read -r x; do
    for f in "$@"; do
      "$f" "$x"
    done
  done
}

$ bla() { echo "bla: $1"; }
$ blubb() { echo "blubb: $1"; }
$ gnah() { echo "gnah: $1"; }

$ echo -e "x\ny\nz" | map bla 1 2 3
bla: 1
bla: 2
bla: 3
bla: x
bla: y
bla: z

$ echo -e "1\n2\n3\n4" | cross bla blubb gnah
bla: 1
blubb: 1
gnah: 1
bla: 2
blubb: 2
gnah: 2
bla: 3
blubb: 3
gnah: 3
bla: 4
blubb: 4
gnah: 4
Answered By: Evi1M4chine