How can I get a count of files in a directory using the command line?

I have a directory with a large number of files. I don’t see an ls switch that provides the count. Is there some command-line magic to get a count of files?

Asked By: Blake


Using a broad definition of “file”

ls | wc -l

(note that it doesn’t count hidden files and assumes that file names don’t contain newline characters).

To include hidden files (except . and ..) and avoid problems with newline characters, the canonical way is:

find . ! -name . -prune -print | grep -c /

Or recursively:

find .//. ! -name . -print | grep -c //
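A quick demonstration (in a throwaway temp directory; file names are illustrative) of why the newline caveat matters: `ls | wc -l` over-counts a name containing a newline, while counting the slashes in find’s output stays correct.

```shell
# Sketch: a newline embedded in a file name makes `ls | wc -l`
# over-count, while the find/grep approach does not.
dir=$(mktemp -d) && cd "$dir" || exit 1
touch 'plain.txt' 'bad
name.txt'                                   # 2 files, one name contains a newline
ls | wc -l                                  # 3 -- one line too many
find . ! -name . -prune -print | grep -c /  # 2
```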
Answered By: James

For narrow definition of file:

 find . -maxdepth 1 -type f | wc -l
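A quick sketch (temp directory, illustrative names) showing what the narrow definition buys you: -type f skips directories, which a plain ls count includes.

```shell
# With one regular file and one subdirectory, only the file is counted.
dir=$(mktemp -d) && cd "$dir" || exit 1
touch file.txt
mkdir subdir
find . -maxdepth 1 -type f | wc -l   # 1
ls | wc -l                           # 2 -- counts the directory too
```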
Answered By: Maciej Piechotka

If you know the current directory contains at least one non-hidden file:

set -- *; echo "$#"

This is obviously generalizable to any glob.

In a script, this has the sometimes unfortunate side effect of overwriting the positional parameters. You can work around that by using a subshell or with a function (Bourne/POSIX version) like:

count_files () {
  eval 'shift; '"$1"'=$#'
}
count_files number_of_files *
echo "There are $number_of_files non-dot files in the current directory"
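Alternatively, the subshell workaround can look like this (a minimal sketch; num is an illustrative variable name):

```shell
# The glob is expanded onto the positional parameters inside a
# command-substitution subshell, so the caller's "$@" is untouched.
cd "$(mktemp -d)" || exit 1
touch a.txt b.txt c.txt          # three sample files
num=$( set -- *; echo "$#" )
echo "$num"                      # 3
```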

An alternative solution is $(ls -d -- * | wc -l). If the glob is *, the command can be shortened to $(ls | wc -l). Parsing the output of ls always makes me uneasy, but here it should work as long as your file names don’t contain newlines, or your ls escapes them. And $(ls -d -- * 2>/dev/null | wc -l) has the advantage of handling the case of a non-matching glob gracefully (i.e., it returns 0 in that case, whereas the set -- * method requires fiddly testing if the glob might be empty).

If file names may contain newline characters, an alternative is to use $(ls -d ./* | grep -c /).

Any of those solutions that rely on passing the expansion of a glob to ls may fail with an argument list too long error if there are a lot of matching files.
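One way around that limit: printf is a shell builtin, so it never goes through execve and cannot hit the kernel’s argv size limit (a sketch; it shares the newline caveat of the ls approaches):

```shell
# A huge glob expansion passed to the printf builtin never triggers
# "argument list too long", unlike an external `ls -d -- *`.
# Caveat: a newline inside a file name still inflates the count.
cd "$(mktemp -d)" || exit 1
touch a b c d
printf '%s\n' * | wc -l   # 4
```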

ls -1 | wc -l

$ ls --help | grep -- '  -1'
    -1                         list one file per line

$ wc --help | grep -- '  -l'
    -l, --lines            print the newline counts

PS: Note that’s ls -<number one> | wc -<letter l>.

Answered By: nicomen

Here’s another technique along the lines of the one Gilles posted:

word_count () { local c=("$@"); echo "${#c[@]}"; }
file_count=$(word_count *)

which creates an array with 13,923 elements (if that’s how many files there are).

Answered By: Dennis Williamson

After installing the tree command, just type:

tree


If you want hidden files too:

tree -a

If you are using Debian / Mint / Ubuntu Linux, type the following command to install the tree command:

sudo apt-get install tree

The option -L is used for specifying the maximum display level of the directory tree. The tree command does not only count the number of files, but also the number of directories, considering as many levels of the directory tree as you like.

Answered By: lev

No pipe, no string copy, no fork; just a plain bash one-liner:

$ fcount() { local f i=0; for f in *; do let i++; done; echo $i; }; fcount
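A small refinement, assuming bash: running the loop body in a subshell with nullglob set avoids counting a literal * in an empty directory (a sketch, not a drop-in for every shell):

```shell
# Using ( ) instead of { } as the function body keeps the shopt
# change local to the function.
fcount() (
  shopt -s nullglob          # an unmatched * expands to nothing, not itself
  i=0
  for f in *; do i=$((i + 1)); done
  echo "$i"
)
cd "$(mktemp -d)" || exit 1
fcount          # 0, not 1
touch a b c
fcount          # 3
```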
Answered By: DaboD

Probably the most complete answer using ls/wc pair is

ls -Aq | wc -l

if you want to count dot files, and

ls -q | wc -l

if you don’t. Here:

  • -A is to count dot files, but omit . and ..
  • -q makes ls replace nongraphic characters, specifically the newline character, with ?, producing one output line per file
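For instance (temp directory; names are illustrative):

```shell
cd "$(mktemp -d)" || exit 1
touch visible.txt .hidden
ls -Aq | wc -l   # 2 -- the dot file is counted
ls -q  | wc -l   # 1
```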

To get one-file-per-line output from ls in a terminal (i.e. without piping it into wc), the -1 option has to be added.

(behaviour of ls tested with coreutils 8.23)

Answered By: Frax

Use the tree command, just:

tree

Answered By: user115186

When using the ls/wc pair, adding -U makes it much faster (it does not sort):

ls -AqU | wc -l
Answered By: Jbnair

Try this; I hope it helps (the - 1 accounts for the “total” line that ls -l prints first):

echo $(( $(ls -l | wc -l) - 1 ))
Answered By: Tiger

You can check with (note that this excludes symlinks, but the “total” line that ls -l prints is still counted):

ls -l | grep -v ^l | wc -l
Answered By: AMIC MING
find . -type f -maxdepth 1 |  wc -l 

This counts only the regular files in the current directory.

Answered By: srpatch

With the GNU implementation of find:

find -maxdepth 1 -type f -printf . | wc -c
  • -maxdepth 1 will make it non-recursive, find is recursive by default
  • -type f will include regular files only
  • -printf . is a cute touch: it prints a dot (a single-byte character in every locale) for each file instead of the file name, so it can handle any file name and also saves data; we just have to count the dots :). Note however that -printf is a GNU-only extension.
  • | wc -c counts bytes and reports the total as a decimal integer (possibly preceded and/or followed by whitespace depending on the wc implementation; not with GNU wc)
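A quick sketch of the dot-counting in action (temp directory, GNU find assumed):

```shell
cd "$(mktemp -d)" || exit 1
touch a 'weird
name'               # two regular files, one with a newline in its name
mkdir sub
find -maxdepth 1 -type f -printf . | wc -c   # 2 -- neither the directory nor the newline causes trouble
```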
Answered By: aude

Improving on some answers given before, but this time doing it explicitly:

$ tree -L 1 | tail -n 1 | cut -d " " -f 3

It’s worth noting the use of some well-loved commands like tail and cut.
Also, note that tree is not available by default. The command above first captures information about the directory at level 1, then gets the last line with tail -n 1, where our goal is, and finishes with cut to take the third word.

For instance, locating in /:

/ $ tree -L 1
├── 1
├── bin -> usr/bin
├── boot
├── dev
├── etc
├── home
├── lib -> usr/lib
├── lib64 -> usr/lib64
├── lost+found
├── media
├── mnt
├── opt
├── proc
├── root
├── run
├── sbin -> usr/sbin
├── srv
├── sys
├── tmp
├── usr
└── var

20 directories, 1 file
/ $ tree -L 1 | tail -n 1
20 directories, 1 file
/ $ tree -L 1 | tail -n 1 | cut -d " " -f 3
1

Then, what about asking the number of directories?

I have found du --inodes useful, but I’m not sure which version of du it requires. It should be substantially faster than alternative approaches using find and wc.

On Ubuntu 17.10, the following works:

du --inodes      # all files and subdirectories
du --inodes -s   # summary
du --inodes -d 2 # depth 2 at most

Combine with | sort -nr to sort descending by number of containing inodes.
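For example, assuming GNU du, to see which subtrees hold the most inodes:

```shell
# Each output line is an inode count followed by a path;
# sort -nr puts the most inode-hungry subtrees first.
du --inodes | sort -nr | head -n 10
```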

Answered By: krlmlr

On Linux, to make the command very robust and handle files that might have newlines in their name, use this:

find -maxdepth 1 -type f -print0 | tr -cd '\0' | wc -c

This saves us from the ordeal of parsing ls output.


Answered By: codeforester

If you have rights to install packages, there is a very simple tool to do this (and more). It is called ncdu and it can be installed using apt or yum. A basic usage of ncdu would be:

ncdu /path/to/dir

This will display an ncurses-based screen which you can navigate using cursor keys. At the bottom, initially you will see the total number of files in that directory and subdirectories. Using the up/down arrow keys and ENTER, you can quickly navigate to any directory and get stats on usage.

A slightly more advanced use is ncdu -x /path/to/dir, which will count only those files and directories that are on the same filesystem as the directory being scanned.

A bonus of ncdu is that it gives a progress bar while scanning. You can also redirect the output to a file for later use.

In the man page, there is an interesting section on how hard links are handled across various versions of ncdu.


Answered By: Hopping Bunny

I use this one; a few examples:

ls -Ap    directory01/directory02  | grep -v /$  | wc -l
ls -Ap    directory01/directory02/exampl*  | grep -v /$  | wc -l
ls -Ap    /home/directory01/directory02  | grep -v /$  | wc -l

It works like this:

  • -p with ls adds / at the end of the directory names.
  • -A with ls lists all the files and directories, including hidden files but excluding . and .. directories.
  • grep -v /$ only shows the lines that do not match (-v option) lines that end with /. (directories)
  • wc -l counts the number of lines.

Or I use, for example, a mix of these:

ls -Ap    directory01/directory02  | grep -v /$  | wc -l ; \
ls -Ap    directory01/directory02/exampl*  | grep -v /$  | wc -l ; \
ls -Ap    /home/directory01/directory02  | grep -v /$  | wc -l

So, for example from the:

$ tree
├── directory01
│   ├── directory02
│   └── directory03
│       ├── Screenshot from 2022-04-19 15-12-55.png
│       ├── Screenshot from 2022-04-19 16-05-05.png
│       └── directory04

I will get a count of the plain files:

$ ls -Ap    directory01/directory03  | grep -v /$  | wc -l
2

Both of these will be counted:

│       ├── Screenshot from 2022-04-19 15-12-55.png
│       └── Screenshot from 2022-04-19 16-05-05.png

but not directory04/ (it is not a plain file).

Answered By: user14927127

With some shells, you can do that without relying on external utilities:


With fish, using its count builtin:

count *

Or to include hidden files:

count * .*


With zsh, define a function, here called count, to mimic fish’s builtin:

count() print $#

Then call the function with:

count *(N)

Or use an anonymous function:

(){print $#} *(N)

To include hidden (Dot) files:

(){print $#} *(ND)

You could also add an oN glob qualifier to disable sorting, or add ., /, or @ to count only regular files, directories, or symlinks respectively.

Replace * with **/* to also count files in sub-directories recursively.


With ksh93:

count() print "$#"
count ~(N)*

To include hidden files:

count ~(N){.,.[!.],..?}*


Or:

(FIGNORE=.:..; count ~(N)*)


With bash:

count() { echo "$#"; }
(shopt -s nullglob; shopt -u failglob; count *)

To include hidden files:

(shopt -s nullglob dotglob; shopt -u failglob; count *)
Answered By: Stéphane Chazelas

To build on James’s answer, you can write something like this to get a per-directory breakdown of all the direct subdirectories.

tree -aFid -L 1 . | while read f; do if [[ -d "${f}" ]]; then echo -en "${f}: "; find "${f}"//. ! -name . -print | grep -c //; fi; done


Example output:

.: 884
documents: 46
photos: 300
videos: 138

I’ve just managed to quickly find the one directory that contained missing files after a file transfer that way.

Answered By: WoodrowShigeru