gunzip a folder with many files
I have a folder containing 36,348 gz files. I want to unzip all of them.
Running:
gunzip ./*
results in
-bash: /usr/bin/gunzip: Argument list too long
What’s the easiest way to get around this?
Try:
find . -type f -exec gunzip {} +
This assumes that the current directory only contains files that you want to unzip.
A slightly more efficient solution would be:
find "$PWD" -type f -name "*.gz" -print0 | xargs -0 gunzip
If your find does not have the -print0 option and your file and directory names contain no spaces, you can drop the -print0 and -0 arguments.
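For example, a minimal sketch of that fallback (only safe when no name contains whitespace or glob characters):
find "$PWD" -type f -name '*.gz' | xargs gunzip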
The limit is in the kernel and is on the cumulative size of the arguments and environment passed to the execve()
system call used to execute a command. You can split it into several invocations.
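You can see the limit on your system with getconf (the value covers arguments plus environment, in bytes):
getconf ARG_MAX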
To gunzip all the .gz files in the current directory:
With zsh:
autoload zargs # best in ~/.zshrc
zargs ./*.gz -- gunzip
With ksh93:
command -x gunzip ./*.gz
GNUly:
printf '%s\0' ./*.gz | xargs -r0 gunzip
POSIXly (and with gunzip):
find . ! -name . -prune -name '*.gz' -exec gunzip {} +
(beware that one will also uncompress the hidden .gz files)
Or you can raise that limit on some systems. On recent versions of Linux:
ulimit -s unlimited
gunzip ./*.gz
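If you'd rather not change the limit for the rest of your session, you can confine it to a subshell (a small variation on the above, not from the original answer):
(ulimit -s unlimited; gunzip ./*.gz)   # the raised limit ends with the subshell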
A less efficient (but in the long run very flexible) approach is to make a batch file:
ls | grep .gz | sed -e 's/^/gunzip /' | less
shows you what will happen so a simple typo doesn’t annihilate your system
ls | grep .gz | sed -e 's/^/gunzip /' | bash
does it right now
ls | grep .gz | sed -e 's/^/gunzip /' > unpack
at 02:00
bash unpack
<ctrl-d>
does it overnight
(note this example does not account for spaces in filenames.)
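If you do have spaces in file names, here is a sketch that single-quotes each name when generating the batch file (it still breaks on names containing single quotes, so treat it as an illustration only):
find . ! -name . -prune -name '*.gz' -exec printf "gunzip '%s'\n" {} \; > unpack
bash unpack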
The simplest way I have found is:
gunzip -r .
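Note that -r also descends into subdirectories. If you want to watch progress, adding -v prints each file name as it is decompressed:
gunzip -rv .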
If you want to gunzip the files into a different directory, you can try:
find <path to gzip files> -type f -name "*.gz" -print |
xargs -I % sh -c 'echo "gunzip -c % > `basename % .gz`";
gunzip -c % > `basename % .gz`'
In the example above, the files are gunzipped to the directory where the command is run.
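As a variation that also copes with spaces in file names (keeping the <path to gzip files> placeholder; a sketch rather than the original answer's command):
find <path to gzip files> -type f -name '*.gz' -exec sh -c '
  for f do gunzip -c "$f" > "$(basename "$f" .gz)"; done' sh {} +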