It's a kernel limitation on the size of the command-line argument list. Use a for loop instead.
This is a system limitation, related to execve and the ARG_MAX constant. There is plenty of documentation about it (see man execve and Debian's wiki). Basically, the glob expansion produces a command (with its parameters) whose total size exceeds the ARG_MAX limit.
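To see this in action, you can force the shell to exec an external command with an oversized argument list. A rough sketch (the exact number of arguments needed depends on your system's ARG_MAX and the size of your environment):
# The echo builtin never calls execve, so use /bin/echo to hit the kernel limit.
# seq 1 400000 expands to roughly 2.7 MB of arguments, which exceeds a 2 MB ARG_MAX
# and makes the shell report "Argument list too long".
/bin/echo $(seq 1 400000) > /dev/null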
On kernel 2.6.23, the limit was set at 128 kB. This constant has since been increased, and you can get its current value by executing:
getconf ARG_MAX
# 2097152 # on 3.5.0-40-generic
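If GNU xargs is available, it can also report the effective limits it computes for your current environment (note that --show-limits is a GNU extension and may be absent from other implementations):
xargs --show-limits < /dev/null
# Prints, among other things, the space taken by environment variables,
# the POSIX upper limit on argument length, and the maximum command
# length xargs would actually use.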
for loop

Use a for loop, as recommended in BashFAQ/095; there is no limit except RAM/memory space:
Do a dry run first to ascertain that it will delete what you expect:
for f in *.pdf; do echo rm "$f"; done
And execute it:
for f in *.pdf; do rm "$f"; done
This is also a portable approach, as globbing has strong and consistent behavior among shells (it is part of the POSIX spec).
Note: as noted in several comments, this is indeed slower but more maintainable, as it can adapt to more complex scenarios, e.g. where one wants to perform more than a single action per file.
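For instance, a minimal sketch of such a scenario (the log file name is just an example), recording each file name before removing it:
for f in *.pdf; do
    echo "removing $f" >> deleted.log   # keep a record of what was removed
    rm -- "$f"                          # -- protects against names starting with a dash
done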
find

If you insist, you can use find, but really don't use xargs, as it "is dangerous (broken, exploitable, etc.) when reading non-NUL-delimited input":
find . -maxdepth 1 -name '*.pdf' -delete
Using -maxdepth 1 ... -delete instead of -exec rm {} + allows find to simply execute the required system calls itself, without spawning an external process, and is therefore faster (thanks to @chepner's comment).
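If your find does not support -delete (it is an extension rather than part of POSIX), a close equivalent is the -exec ... + form, which batches file names into as few rm invocations as the argument-size limit allows:
find . -maxdepth 1 -name '*.pdf' -exec rm -- {} +
Note that -maxdepth 1 is itself a widespread extension; it is shown here only to match the non-recursive behavior of the glob above.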