Compress a folder with tar?
I’m trying to compress a folder (/var/www) into a $time.tar file, where $time is the current date.
This is what I have:
cd /var/www && sudo tar -czf ~/www_backups $time
I am completely lost and I’ve been at this for hours now. Not sure if -czf is correct. I simply want to copy all of the content in /var/www into a $time.tar file, and I want to maintain the file permissions for all of the files. Can anyone help me out?
To tar and gzip a folder, the syntax is:

tar czf name_of_archive_file.tar.gz name_of_directory_to_tar

The - before the options (czf) is optional with tar. The effect of czf is as follows:
c: create an archive file (as opposed to extract, which is x)
f: filename of the archive file
z: filter archive through gzip (remove this option to create a plain, uncompressed .tar file)
If you want to tar the current directory, use . to designate that.

To construct filenames dynamically, use the date utility (look at its man page for the available format options). For example:
cd /var/www && tar czf ~/www_backups/$(date +%Y%m%d-%H%M%S).tar.gz .
This will create a file named something like 20240902-144910.tar.gz.
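As a quick sanity check, you can build the timestamped name on its own before using it in the tar command (the www- prefix here is just an illustration, not something from the question):

```shell
# Build a timestamped filename with the same date format string;
# the "www-" prefix is illustrative.
fname="www-$(date +%Y%m%d-%H%M%S).tar.gz"
echo "$fname"
```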
On Linux, chances are your tar also supports BZip2 compression with the j rather than the z option. And possibly others; check the man page on your local system.
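The original question also asked about keeping file permissions. tar records them in the archive by default; pass -p when extracting to restore them (and extract as root if ownership matters too). A minimal sketch, using throwaway paths under /tmp rather than the question's real paths:

```shell
# Set up a sample directory with a known permission mode
mkdir -p /tmp/perm_src /tmp/perm_dst
echo "hello" > /tmp/perm_src/file.txt
chmod 640 /tmp/perm_src/file.txt

# Archive the directory contents (permissions are recorded by default)
tar czf /tmp/perm_demo.tar.gz -C /tmp/perm_src .

# Extract with -p to restore the recorded permissions
tar xzpf /tmp/perm_demo.tar.gz -C /tmp/perm_dst
```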
Examples for Most Common Compression Algorithms
The question title for this is simply "Compress a folder with tar?" Since this title is very general, but the question and answer are much more specific, and due to the very large number of views this question has attracted, I felt it would be beneficial to add an up-to-date list of examples of both archiving/compressing and extracting/uncompressing, with various commonly used compression algorithms.
These have been tested with Ubuntu 18.04.4. They are very simple for general use, but could easily be integrated into the OP’s more specific question contents using the techniques in the accepted answer and helpful comments above.
One thing to note for the more general audience is that tar will not add the necessary extensions (like .tar.gz) automatically; the user has to explicitly add those, as seen in the commands below:
# 1: tar (create uncompressed archive) all files and directories in the current working directory recursively into an uncompressed tarball
tar cvf filename.tar *

# 2: Untar (extract uncompressed archive) all files and directories in an uncompressed tarball recursively into the current working directory
tar xvf filename.tar

# 3: tar (create gzipped archive) all files and directories in the current working directory recursively into a tarball compressed with gzip
tar cvzf filename.tar.gz *

# 4: Untar (extract gzipped archive) all files and directories in a tarball compressed with gzip recursively into the current working directory
tar xvf filename.tar.gz   # Note: same options as 2 above

# 5: tar (create bzip2'ed archive) all files and directories in the current working directory recursively into a tarball compressed with bzip2
tar cvjf filename.tar.bz2 *   # Note: little 'j' in options

# 6: Untar (extract bzip2'ed archive) all files and directories in a tarball compressed with bzip2 recursively into the current working directory
tar xvf filename.tar.bz2   # Note: same options as 2 and 4 above

# 7: tar (create xz'ed archive) all files and directories in the current working directory recursively into a tarball compressed with xz
tar cvJf filename.tar.xz *   # Note: capital 'J' in options

# 8: Untar (extract xz'ed archive) all files and directories in a tarball compressed with xz recursively into the current working directory
tar xvf filename.tar.xz   # Note: same options as 2, 4, and 6 above
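A related point worth knowing: the reason the same xvf options work for every compressed format above is that modern GNU tar auto-detects the compression when reading an archive. You can also inspect an archive without extracting it by using t (list) instead of x; the paths below are illustrative:

```shell
# Create a small gzipped archive, then list its contents without extracting
mkdir -p /tmp/list_demo
echo "data" > /tmp/list_demo/a.txt
tar czf /tmp/list_demo/demo.tar.gz -C /tmp/list_demo a.txt

# t = list; tar detects the gzip compression automatically here
tar tvf /tmp/list_demo/demo.tar.gz
```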
See the tar man page (best to use man tar on your specific machine) for further details. Below I summarize the options used above, directly from the man page:
-c, --create
    create a new archive
-x, --extract, --get
    extract files from an archive
-v, --verbose
    verbosely list files processed
-z, --gzip
    filter the archive through gzip
-j, --bzip2
    filter the archive through bzip2
-J, --xz
    filter the archive through xz
-f, --file=ARCHIVE
    use archive file or device ARCHIVE
No need to add the - in front of the combined options, or the = sign between the f option and the filename.
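To make that concrete, these three invocations should be equivalent with GNU tar (the filenames here are illustrative):

```shell
# Three equivalent spellings of the same command
mkdir -p /tmp/syntax_demo && cd /tmp/syntax_demo
echo "x" > file.txt
tar czf a1.tar.gz file.txt                      # bundled options, no dash
tar -czf a2.tar.gz file.txt                     # bundled options, with dash
tar --create --gzip --file=a3.tar.gz file.txt   # long options
```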
I got all this from my recent article, which I will expand into a much more comprehensive piece as I have time to work on it.