`find: fts_read: Invalid argument` when working with around 8000 files
1 vote, 1 answer, 804 views
GNU bash, version 3.2.48 has this bug; version 3.2.57 no longer does.
Make a file with 8000 identical lines (say each line says `1`). Run `split -a3 -p "1"` on it (`-p` is a BSD `split` option which makes it split on the given pattern; for a file with just one `1` per line, you can do the same thing with a standard `split` by running `split -a3 -b 1`). Then execute

`find . -name xaaa -exec echo {} +`
After the expected output, you get `find: fts_read: Invalid argument` printed to standard error. The same error occurs when `xaaa` is replaced by any set of files, and `echo` by any other command I've tried. The length of the filename doesn't matter, and neither does the directory the files are in.
After creating some files elsewhere, the error is gone. However, when `xaaa` is replaced by `xaa*` (or any other wildcard that matches multiple files, at least one of which is near the beginning of the directory listing), the error occurs again. At that point, no single file causes the error to appear.
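For concreteness, the wildcard form that re-triggers the error looks like this (the pattern is quoted so that `find` itself does the matching rather than the shell):

```sh
find . -name 'xaa*' -exec echo {} +
```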
Replacing `+` with `;` does avoid the error, but is not acceptable for my script. This problem has been occurring intermittently in other situations in my script, but by reducing it I was able to come up with a simple way of replicating it.
I want the script to stop if an error occurs, but this just makes it stop very often. Any idea how to get around this? (e.g. retrieve an error code and ignore just this specific error).
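One way to read the question's own suggestion of "retrieve an error code and ignore just this specific error" is a small wrapper around `find`. The following is only a sketch of that idea; the function name, the temp-file handling, and the exact message matched are assumptions for illustration, not a verified fix for the underlying fts problem:

```sh
#!/bin/bash
set -e   # the surrounding script stops on any real error

# Hypothetical wrapper: run find, let stdout through, and fail only if
# stderr contains something other than the known fts_read message.
find_ignoring_fts_bug() {
    local errfile status=0
    errfile=$(mktemp) || return 1
    find "$@" 2>"$errfile" || status=$?
    cat "$errfile" >&2              # keep any messages visible either way
    if [ "$status" -ne 0 ] && [ -s "$errfile" ] \
        && ! grep -qv 'fts_read: Invalid argument' "$errfile"; then
        status=0                    # only the known fts_read noise: ignore it
    fi
    rm -f "$errfile"
    return "$status"                # any other failure still stops the script
}

find_ignoring_fts_bug . -name 'xaa*' -exec echo {} +
```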
Mac OS version 10.8.5. Darwin Kernel Version 12.5.0.
Asked by Alex
(1220 rep)
Oct 25, 2023, 03:28 AM
Last activity: Oct 25, 2023, 07:29 PM