[linux] Recursively counting files in a Linux directory

How can I recursively count files in a Linux directory?

I found this:

find DIR_NAME -type f ¦ wc -l

But when I run this it returns the following error.

find: paths must precede expression: ¦

This question is related to linux

The answer is


The error comes from using ¦ (a broken bar) instead of | (the pipe character); with a real pipe, find DIR_NAME -type f | wc -l works as intended.

Combining several of the answers here together, the most useful solution seems to be:

find . -maxdepth 1 -type d -print0 |
xargs -0 -I {} sh -c 'echo -e $(find "{}" -printf "\n" | wc -l) "{}"' |
sort -n

It can handle odd things like file names that include spaces, parentheses, and even newlines. It also sorts the output by the number of files.

You can increase the number after -maxdepth to get subdirectories counted too. Keep in mind that this can potentially take a long time, particularly with a highly nested directory structure combined with a high -maxdepth number.
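For example, a sketch of the same pipeline counting two levels deep (identical apart from the -maxdepth value):

find . -maxdepth 2 -type d -print0 |
xargs -0 -I {} sh -c 'echo -e $(find "{}" -printf "\n" | wc -l) "{}"' |
sort -n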


On my computer, rsync is a little bit faster than find | wc -l in the accepted answer:

$ rsync --stats --dry-run -ax /path/to/dir /tmp

Number of files: 173076
Number of files transferred: 150481
Total file size: 8414946241 bytes
Total transferred file size: 8414932602 bytes

The second line has the number of files, 150,481 in the above example. As a bonus you get the total size as well (in bytes).

Remarks:

  • the first line counts files, directories, symlinks, etc. all together, which is why it is larger than the second line.
  • the --dry-run (or -n for short) option is important so that the files are not actually transferred!
  • I used the -x option to "don't cross filesystem boundaries", which means that if you run it on / with external hard disks attached, only the files on the root partition are counted.
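If you only want the number itself, a sketch that filters the stats output with awk should work; the label wording and number formatting vary across rsync versions, hence the loose pattern:

rsync --stats --dry-run -ax /path/to/dir /tmp |
awk -F': ' '/Number of .*files transferred/ {print $2; exit}'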

You can use

$ tree

after installing the tree package with

$ sudo apt-get install tree

(on a Debian / Mint / Ubuntu Linux machine).

The command shows not only the count of the files, but also the count of the directories, separately. The option -L can be used to specify the maximum display level (which, by default, is the maximum depth of the directory tree).

Hidden files can be included too by supplying the -a option.
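For example, to include hidden files and stop the display at two levels, keeping only the summary line (note that with -L the counts reflect only the displayed levels):

tree -a -L 2 . | tail -1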


If you want to avoid miscounting, don't let wc -l see filenames that contain newlines (it will count each of those as 2+ files).

e.g. Consider a case where we have a single file whose name contains a single EOL character

> mkdir emptydir && cd emptydir
> touch $'file with EOL(\n) character in it'
> find -type f
./file with EOL(?) character in it
> find -type f | wc -l
2

Since GNU wc, at least, does not appear to have an option to read/count a null-terminated list (except from a file), the easiest solution is simply not to pass it filenames, but a static output each time a file is found, e.g. in the same directory as above

> find -type f -exec printf '\n' \; | wc -l
1

Or if your find supports it

> find -type f -printf '\n' | wc -l
1 
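If your find lacks -printf but does support -exec ... +, a sketch that batches many filenames into each printf call instead of spawning one process per file; the %.0s conversion consumes one argument while printing nothing, so each file contributes exactly one newline:

find . -type f -exec printf '\n%.0s' {} + | wc -l

One caveat: what happens with zero matching files depends on whether your find runs the command at all for an empty batch.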

tree "$DIR_PATH" | tail -1

Sample Output:

5309 directories, 2122 files
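If you want just the file count from that summary line, a sketch that picks out the third field (assuming tree's default "N directories, M files" format):

tree "$DIR_PATH" | tail -1 | awk '{print $3}'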


find -type f | wc -l

OR (If directory is current directory)

find . -type f | wc -l


If you want a breakdown of how many files are in each dir under your current dir:

for i in */ .*/ ; do
    echo -n "$i: "
    (find "$i" -type f | wc -l)
done

That can go all on one line, of course. The parentheses clarify whose output wc -l is supposed to be watching (find "$i" -type f in this case).
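One caveat: the .*/ pattern also matches ./ and ../, so the current and parent directories get counted too. A bash sketch that avoids this with dotglob (which matches hidden directories but never . or ..) and nullglob (so a pattern with no matches expands to nothing):

shopt -s nullglob dotglob
for i in */ ; do
    echo -n "$i: "
    (find "$i" -type f | wc -l)
done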


For the current directory:

find -type f | wc -l

To determine how many files there are in the current directory, use ls -1 | wc -l. This uses wc to do a count of the number of lines (-l) in the output of ls -1. It doesn't count dotfiles. Please note that ls -l (that's an "L" rather than a "1" as in the previous examples), which I used in previous versions of this HOWTO, will actually give you a file count one greater than the actual count, because of the "total" line it prints. Thanks to Kam Nejad for this point.

If you want to count only files and NOT include symbolic links (just an example of what else you could do), you could use ls -l | grep -v ^l | wc -l (that's an "L" not a "1" this time, we want a "long" listing here). grep checks for any line beginning with "l" (indicating a link), and discards that line (-v).

Relative speed: "ls -1 /usr/bin/ | wc -l" takes about 1.03 seconds on an unloaded 486SX25 (/usr/bin/ on this machine has 355 files). "ls -l /usr/bin/ | grep -v ^l | wc -l" takes about 1.19 seconds.

Source: http://www.tldp.org/HOWTO/Bash-Prompt-HOWTO/x700.html
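If you do want dotfiles included (but not . and ..), the -A option should do it; note this still miscounts names that contain embedded newlines:

ls -A1 | wc -l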


There are many correct answers here. Here's another!

find . -type f | sort | uniq -w 10 -c

where . is the folder to search in and 10 is the number of leading characters by which to group the paths.
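A related sketch that groups by the first path component instead of a fixed character width (files directly in . each form their own group):

find . -type f | cut -d/ -f2 | sort | uniq -c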


With bash:

Create an array of entries with ( ) and get the element count with ${#...[@]}.

FILES=(./*); echo ${#FILES[@]}

OK, that doesn't count files recursively, but I wanted to show the simple option first. A common use case might be creating rollover backups of a file. This will create logfile.1, logfile.2, logfile.3 etc.

CNT=(./logfile*); mv logfile logfile.${#CNT[@]}

Recursive count with bash 4+ globstar enabled (as mentioned by @tripleee); note that **/* matches directories as well as plain files:

shopt -s globstar
FILES=(**/*); echo ${#FILES[@]}

To get the count of files recursively, we can still use find in the same way.

FILES=(`find . -type f`); echo ${#FILES[@]}
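Beware that this word-splits on whitespace, so names containing spaces or newlines inflate the count. A NUL-safe sketch, assuming bash 4.4+ for mapfile -d:

mapfile -d '' FILES < <(find . -type f -print0); echo ${#FILES[@]}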

This alternate approach, filtering on a filename pattern, counts all available GRUB kernel modules:

ls -l /boot/grub/*.mod | wc -l

If you want to count the number of files present in a folder, this will work completely fine. Simple and short.

ls | wc -l

If you want to know how many files and sub-directories exist from the present working directory you can use this one-liner

find . -maxdepth 1 -type d -print0 | xargs -0 -I {} sh -c 'echo -e $(find {} | wc -l) {}' | sort -n

This works in the GNU flavour; for BSD userlands (e.g. macOS), just omit the -e from the echo command.
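A sketch that sidesteps the echo portability issue entirely by using printf, which behaves the same in both flavours:

find . -maxdepth 1 -type d -print0 | xargs -0 -I {} sh -c 'printf "%s %s\n" "$(find "{}" | wc -l)" "{}"' | sort -n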


Since filenames in UNIX may contain newlines (yes, newlines), wc -l might count too many files. I would print a dot for every file and then count the dots:

find DIR_NAME -type f -printf "." | wc -c

For directories with spaces in the name ... (based on various answers above) -- recursively print directory name with number of files within:

find . -mindepth 1 -type d -print0 | while IFS= read -r -d '' i ; do echo -n "$i: " ; ls -p "$i" | grep -v / | wc -l ; done

Example (formatted for readability):

pwd
  /mnt/Vancouver/Programming/scripts/claws/corpus

ls -l
  total 8
  drwxr-xr-x 2 victoria victoria 4096 Mar 28 15:02 'Catabolism - Autophagy; Phagosomes; Mitophagy'
  drwxr-xr-x 3 victoria victoria 4096 Mar 29 16:04 'Catabolism - Lysosomes'

ls 'Catabolism - Autophagy; Phagosomes; Mitophagy'/ | wc -l
  138

## 2 dir (one with 28 files; other with 1 file):
ls 'Catabolism - Lysosomes'/ | wc -l
  29

The directory structure is better visualized using tree:

tree -L 3 -F .
  .
  ├── Catabolism - Autophagy; Phagosomes; Mitophagy/
  │   ├── 1
  │   ├── 10
  │   ├── [ ... SNIP! (138 files, total) ... ]
  │   ├── 98
  │   └── 99
  └── Catabolism - Lysosomes/
      ├── 1
      ├── 10
      ├── [ ... SNIP! (28 files, total) ... ]
      ├── 8
      ├── 9
      └── aaa/
          └── bbb

  3 directories, 167 files

man find | grep mindep
  -mindepth levels
    Do not apply any tests or actions at levels less than levels
    (a non-negative integer).  -mindepth 1 means process all files
    except the starting-points.

ls -p | grep -v / (used below) is from answer 2 at https://unix.stackexchange.com/questions/48492/list-only-regular-files-but-not-directories-in-current-directory

find . -mindepth 1 -type d -print0 | while IFS= read -r -d '' i ; do echo -n "$i: " ; ls -p "$i" | grep -v / | wc -l ; done
./Catabolism - Autophagy; Phagosomes; Mitophagy: 138
./Catabolism - Lysosomes: 28
./Catabolism - Lysosomes/aaa: 1

Application: I want to find the max number of files among several hundred directories (all depth = 1) [output below again formatted for readability]:

date; pwd
    Fri Mar 29 20:08:08 PDT 2019
    /home/victoria/Mail/2_RESEARCH - NEWS

time find . -mindepth 1 -type d -print0 | while IFS= read -r -d '' i ; do echo -n "$i: " ; ls -p "$i" | grep -v / | wc -l ; done > ../../aaa
    0:00.03

[victoria@victoria 2_RESEARCH - NEWS]$ head -n5 ../../aaa
    ./RNA - Exosomes: 26
    ./Cellular Signaling - Receptors: 213
    ./Catabolism - Autophagy; Phagosomes; Mitophagy: 138
    ./Stress - Physiological, Cellular - General: 261
    ./Ancient DNA; Ancient Protein: 34

[victoria@victoria 2_RESEARCH - NEWS]$ sed -r 's/(^.*): ([0-9]{1,8}$)/\2: \1/g' ../../aaa | sort -V | (head; echo ''; tail)

    0: ./Genomics - Gene Drive
    1: ./Causality; Causal Relationships
    1: ./Cloning
    1: ./GenMAPP 2
    1: ./Pathway Interaction Database
    1: ./Wasps
    2: ./Cellular Signaling - Ras-MAPK Pathway
    2: ./Cell Death - Ferroptosis
    2: ./Diet - Apples
    2: ./Environment - Waste Management

    988: ./Genomics - PPM (Personalized & Precision Medicine)
    1113: ./Microbes - Pathogens, Parasites
    1418: ./Health - Female
    1420: ./Immunity, Inflammation - General
    1522: ./Science, Research - Miscellaneous
    1797: ./Genomics
    1910: ./Neuroscience, Neurobiology
    2740: ./Genomics - Functional
    3943: ./Cancer
    4375: ./Health - Disease 

sort -V is a natural sort. ... So, my max number of files in any of those (Claws Mail) directories is 4375 files. If I left-pad (https://stackoverflow.com/a/55409116/1904943) those filenames -- they are all named numerically, starting with 1, in each directory -- and pad to 5 total digits, I should be ok.
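A minimal bash sketch of that left-padding step, assuming (as above) that the file names in the current directory are bare integers; the regex guard skips anything else, the 10# prefix forces base-10 arithmetic so already-padded names are not read as octal, and mv -n refuses to overwrite:

for f in *; do
    [[ $f =~ ^[0-9]+$ ]] || continue
    mv -n -- "$f" "$(printf '%05d' "$((10#$f))")"
done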


Addendum

Find the total number of files and subdirectories in a directory.

$ date; pwd
Tue 14 May 2019 04:08:31 PM PDT
/home/victoria/Mail/2_RESEARCH - NEWS

$ ls | head; echo; ls | tail
Acoustics
Ageing
Ageing - Calorie (Dietary) Restriction
Ageing - Senescence
Agriculture, Aquaculture, Fisheries
Ancient DNA; Ancient Protein
Anthropology, Archaeology
Ants
Archaeology
ARO-Relevant Literature, News

Transcriptome - CAGE
Transcriptome - FISSEQ
Transcriptome - RNA-seq
Translational Science, Medicine
Transposons
USACEHR-Relevant Literature
Vaccines
Vision, Eyes, Sight
Wasps
Women in Science, Medicine

$ find . -type f | wc -l
70214    ## files

$ find . -type d | wc -l
417      ## subdirectories
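With GNU find, both numbers (plus symlinks etc.) can be had in a single pass by printing just each entry's type letter (f = regular file, d = directory):

$ find . -printf '%y\n' | sort | uniq -c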

You can use the command ncdu. It will recursively count how many files a Linux directory contains. It also has a progress bar, which is convenient if you have many files.

To install it on Ubuntu:

sudo apt-get install -y ncdu

Benchmark: I used https://archive.org/details/cv_corpus_v1.tar (380390 files, 11 GB) as the folder in which to count the files.

  • find . -type f | wc -l: around 1m20s to complete
  • ncdu: around 1m20s to complete

I have written ffcnt to speed up recursive file counting under specific circumstances: rotational disks and filesystems that support extent mapping.

It can be an order of magnitude faster than ls or find based approaches, but YMMV.


If what you need is to count a specific file type recursively, you can do:

find YOUR_PATH -name '*.html' -type f | wc -l 

The -l option makes wc count the number of lines in the output, one per matching file.

If you need to exclude certain folders, use -not -path:

find . -not -path './node_modules/*' -name '*.js' -type f | wc -l
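To match several extensions at once, group the -name tests with escaped parentheses; the *.jsx pattern here is just an illustrative second extension:

find . -type f \( -name '*.js' -o -name '*.jsx' \) -not -path './node_modules/*' | wc -l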

ls -l | grep -e '^-' -e '^d' | wc -l
  1. long list
  2. filter regular files and directories (lines starting with - or d)
  3. count the filtered lines