
Unix & Linux Stack Exchange

Q&A for users of Linux, FreeBSD and other Unix-like operating systems

Latest Questions

10 votes
4 answers
1073 views
Using `find` to find a file in PATH
I would like to be able to search all my $PATH for files matching a given pattern. For example, if my PATH is /usr/local/bin:/usr/bin:/bin and there's a /usr/local/bin/gcc-4 and a /usr/bin/gcc-12, I would like to be able to search for gcc-* to find them both. The trivial approach of course does not work:
find ${PATH} -name "gcc-*"
this naive approach does work:
find $(echo "${PATH}" | sed -e 's|:| |g') -name "gcc-*"
but of course this breaks if PATH holds any weird characters like spaces and the like. So how can I achieve this in a safe way? My shell is sh.
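One safe approach in plain sh is to split PATH on ':' by changing IFS, so that spaces in directory names survive, and then hand the components to find as separate arguments. A minimal sketch (the searchpath function name is only illustrative; empty PATH components would need extra handling):

# Split $PATH into positional parameters without globbing, then search them.
searchpath() {
    pattern=$1
    set -f                       # disable globbing while $PATH is split
    old_ifs=$IFS; IFS=:
    set -- $PATH                 # one argument per PATH component
    IFS=$old_ifs
    set +f
    find "$@" -name "$pattern"
}

searchpath 'gcc-*'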
umläute (6704 rep)
Jul 30, 2025, 12:35 PM • Last activity: Aug 3, 2025, 05:36 PM
2 votes
3 answers
2809 views
Parallelize recursive deletion with find
I want to recursively delete all files that end with .in. This is taking a long time, and I have many cores available, so I would like to parallelize this process. From this thread , it looks like it's possible to use xargs or make to parallelize find. Is this application of find possible to parallelize? Here is my current serial command: find . -name "*.in" -type f -delete
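One way to fan the deletions out over several processes is to have find emit NUL-delimited names and let xargs run rm in parallel. A sketch assuming GNU find and GNU xargs (the process count of 8 and the batch size of 500 are arbitrary placeholders):

find . -name '*.in' -type f -print0 | xargs -0 -P 8 -n 500 rm -f

Whether this actually helps depends on the filesystem; deletion is often limited by metadata I/O rather than CPU, so more processes do not always mean more speed.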
kilojoules (169 rep)
Mar 6, 2017, 04:57 PM • Last activity: Jul 26, 2025, 03:16 PM
6 votes
1 answer
361 views
About security concerns using in find -exec vs interactively performing the same task
My question is not aimed specifically at understanding find's and its -exec option's security implications, but more generally at understanding why (if at all!) such programs are particularly exploitable with respect to an interactive approach to accomplish the same task. For instance, [here's an example of a command that is described as risky](https://www.gnu.org/software/findutils/manual/html_node/find_html/Race-Conditions-with-_002dexec.html):
> find /tmp -path /tmp/umsp/passwd -exec /bin/rm
>
> In this simple example, we are identifying just one file to be deleted and invoking /bin/rm to delete it. A problem exists because there is a time gap between the point where find decides that it needs to process the ‘-exec’ action and the point where the /bin/rm command actually issues the unlink() system call to delete the file from the filesystem. Within this time period, an attacker can rename the /tmp/umsp directory, replacing it with a symbolic link to /etc. There is no way for /bin/rm to determine that it is working on the same file that find had in mind. Once the symbolic link is in place, the attacker has persuaded find to cause the deletion of the /etc/passwd file, which is not the effect intended by the command which was actually invoked.

Now, first and foremost, I don't understand why the "time gap" is even necessary for the security issue to exist; I mean, _an attacker can rename the /tmp/umsp directory, replacing it with a symbolic link to /etc_ even before find runs at all, wouldn't the result be the same? Secondly, since the intent of the command is to delete /tmp/umsp/passwd, I could have done /bin/rm /tmp/umsp/passwd. Wouldn't I incur the same risk of an attacker having performed the aforementioned substitution before I run this command, possibly right after I've run cat /tmp/umsp/passwd to make sure it's the one I want to delete?
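For comparison, one commonly suggested mitigation is GNU find's -execdir, which runs the command from the directory containing the matched file and refers to it as ./passwd, so renaming a parent directory during the time gap can no longer redirect the deletion. A sketch of the same example with that change:

# -execdir chdirs into the directory that holds the match before running rm.
find /tmp -path /tmp/umsp/passwd -execdir /bin/rm {} \;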
Enlico (2258 rep)
Jul 25, 2025, 08:32 AM • Last activity: Jul 25, 2025, 09:58 PM
-1 votes
3 answers
120 views
zgrep - Find an IP address in 1200 *.gz files
I want to find out in which of my 1200 *.gz files a certain IP address occurs on a certain date (17.07.2025). I could not find the IP address in my current logs:

grep 'IP address' *.logs

My attempt to see if the IP even exists in the older, zipped log files returns 'The argument list is too long':

zgrep 'IP address' *

A combination of find and grep was also unsuccessful. I am grateful for any assistance.
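One way around the argument-length limit is to let find enumerate the archives and run zgrep on them in manageable batches; a sketch, where the path and IP address are placeholders:

# zgrep -q only reports whether the decompressed file contains the address;
# the loop prints the name of each archive that does.
find /var/log -name '*.gz' -exec sh -c '
    for f do
        zgrep -q "192\.0\.2\.1" "$f" && printf "%s\n" "$f"
    done' find-sh {} +

Narrowing the matches to the 17.07.2025 entries would then be a second pass over just those files.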
fl1tz (1 rep)
Jul 21, 2025, 04:18 PM • Last activity: Jul 25, 2025, 06:22 AM
2 votes
2 answers
2351 views
How to sort the results of `find` alphabetically and topologically?
Let's say I have the following files laid out like this:
$ tree
.
├── 01/
│   ├── example.txt
│   └── foobar.txt
├── 02/
│   └── example.txt
└── 03/
    ├── example.txt
    └── test.txt

3 directories, 5 files
I would like find . to print out the directories and files in order, topologically (that is directory names before their contents), and within the same directory, alphabetically. Instead, these are the results I get:
$ find .
.
./02
./02/example.txt
./01
./01/foobar.txt
./01/example.txt
./03
./03/test.txt
./03/example.txt
I would like the output to look like:
.
./01
./01/foobar.txt
./01/example.txt
./02
./02/example.txt
./03
./03/example.txt
./03/test.txt
I don't want to pipe to sort or to any other command, because I would like to process the lines one by one in order before the find command completes.
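Without find and without a sort pipe, one way to get that ordering is to walk the tree in the shell, so each directory is printed before its contents and glob expansion supplies the (locale-dependent) alphabetical order within a directory. A sketch assuming bash; walk is just an illustrative name, and hidden files are not covered:

walk() {
    printf '%s\n' "$1"                 # the directory itself comes first
    local entry
    for entry in "$1"/*; do
        [ -e "$entry" ] || [ -L "$entry" ] || continue   # nothing matched: skip
        if [ -d "$entry" ]; then
            walk "$entry"              # descend immediately: parents before children
        else
            printf '%s\n' "$entry"
        fi
    done
}

walk .

Each line is printed as soon as it is reached, so the entries can be processed one by one while the walk is still running.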
Flimm (4473 rep)
Aug 23, 2021, 03:04 PM • Last activity: Jul 19, 2025, 08:07 AM
1 votes
1 answer
1450 views
Creating an alias for the find command in Linux
I have created the following alias in my .bashrc file:

alias find='find . -type f -name'

This obviates the need to type . -type f -name every time I do a file search. However, I still have to enclose search strings in '*...*'. How could I include these in the alias, so that instead of having to type:

find '*string*'

I could just type:

find string
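An alias cannot splice its argument into the middle of the quoted pattern, but a shell function can. A sketch (the function name f is arbitrary):

# In ~/.bashrc, instead of the alias:
f() {
    find . -type f -name "*$1*"
}

# usage:  f string        # same as: find . -type f -name '*string*'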
edman (588 rep)
Jul 27, 2021, 04:57 PM • Last activity: Jun 29, 2025, 06:27 AM
65 votes
1 answer
148079 views
find & sed (search and replace)
I'm using the following command on my mac:

$find . -name “*.java” -exec sed -i ’s/foo/bar/g’ {} \;

and it seems to have no effect. I have two files in the directory that end in .java, which both have the foo text in them. Am I missing something?

**EDIT: Results from request of comments**

[aafghani-03:~/test amirafghani]$ find . -name "*.java" -exec sed -i 's/foo/bar/g' {} \;
sed: 1: "./bar.java": invalid command code .
sed: 1: "./foo.java": invalid command code .
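Two separate things trip this up on macOS: the curly "smart" quotes pasted into the first command, and the BSD sed shipped with the system, whose -i option requires a backup-suffix argument (the "invalid command code" errors appear because that sed takes the script as the suffix and then tries to run the file name as an editing command). A sketch with both fixed, assuming the stock BSD sed:

# '' after -i means "edit in place, keep no backup file".
find . -name '*.java' -exec sed -i '' 's/foo/bar/g' {} \;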
Amir Afghani (7373 rep)
Apr 18, 2012, 06:59 PM • Last activity: Jun 28, 2025, 10:35 AM
0 votes
1 answer
2149 views
how to find all pics in a directory and its subdirectories and run a command on them
How to find all images in a directory and its sub-directories, with types gif, jpg, jpeg, png, ico, and run mogrify -strip your_filename.jpg on them? Can the command mogrify -strip corrupt an image? Also, can the command mogrify -strip be run on a gif file?
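A sketch of the find side, assuming ImageMagick's mogrify is installed; -iname makes the extension match case-insensitive and -exec ... {} + hands the files to mogrify in batches:

find . -type f \( -iname '*.gif' -o -iname '*.jpg' -o -iname '*.jpeg' \
    -o -iname '*.png' -o -iname '*.ico' \) -exec mogrify -strip {} +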
stacking and exchanging woohoo (63 rep)
Dec 13, 2022, 10:47 PM • Last activity: Jun 24, 2025, 02:04 PM
2 votes
3 answers
354 views
Create Subfolders for files and move them into, each for themselves
In Linux, in a folder, there are these files:

LQ0gfKQej7GKG44Cn0sSAC.part01.rar
LQ0gfKQej7GKG44Cn0sSAC.part02.rar
LQ0gfKQej7GKG44Cn0sSAC.part03.rar
LQ0gfKQej7GKG44Cn0sSAC.part04.rar
LQ0gfKQej7GKG44Cn0sSAC.part05.rar
LQ0gfKQej7GKG44Cn0sSAC.part06.rar
LQ0gfKQej7GKG44Cn0sSAC.part07.rar
LQ0gfKQej7GKG44Cn0sSAC.part08.rar
LQ0gfKQej7GKG44Cn0sSAC.part09.rar
LQ0gfKQej7GKG44Cn0sSAC.part10.rar
no9e1hvqlVEbYHs9YU3.part1.rar
no9e1hvqlVEbYHs9YU3.part2.rar
NwgdNaort1EqT0ch.part1.rar
NwgdNaort1EqT0ch.part2.rar
NwgdNaort1EqT0ch.part3.rar
NwgdNaort1EqT0ch.part4.rar
VEwMzBPH91J.part1.rar
VEwMzBPH91J.part2.rar

I want to create a folder named LQ0gfKQej7GKG44Cn0sSAC and move all the LQ0gfKQej7GKG44Cn0sSAC.part... files into it. The same goes for no9e1hvqlVEbYHs9YU3, NwgdNaort1EqT0ch and VEwMzBPH91J. To get a list of the names I do this:
ls -1 * | cut -f1 -d "." | uniq > ../names
For creating the Subfolders I use:
cat ../names | xargs mkdir
I then tried something like:
cat ../names | xargs -I '{}' mv '{}*' {}
But I think I am not on the right track... it does not work.
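A sketch of a loop that skips the intermediate names file entirely, assuming the folder name is everything before the first dot: derive the directory from each file name, create it if needed, and move the file in. The xargs attempt fails because mv receives the literal string with the trailing *; globbing happens in the shell, not in mv.

for f in *.rar; do
    dir=${f%%.*}              # strip everything from the first '.' onward
    mkdir -p -- "$dir"
    mv -- "$f" "$dir"/
done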
Banana (189 rep)
Jun 22, 2025, 12:12 AM • Last activity: Jun 23, 2025, 09:53 AM
91 votes
6 answers
209171 views
Bash: How to read one line at a time from output of a command?
I am trying to read the output of a command in bash using a while loop.

while read -r line
do
echo "$line"
done <<< $(find . -type f)

The output I got:

ranveer@ranveer:~/tmp$ bash test.sh
./test.py
./test1.py
./out1
./test.sh
./out
./out2
./hello
ranveer@ranveer:~/tmp$

After this I tried:

$(find . -type f) | while read -r line
do
echo "$line"
done

but it generated an error:

test.sh: line 5: ./test.py: Permission denied

So, how do I read it line by line? I think currently it is slurping the entire line at once. Required output:

./test.py
./test1.py
./out1
./test.sh
./out
./out2
./hello
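The second attempt fails because the unquoted $(find . -type f) is itself executed as a command line, so the shell tries to run ./test.py. The usual fix is to keep the while loop in the current shell and feed it through process substitution (bash), so lines are read one by one as find produces them; a sketch:

while IFS= read -r line; do
    echo "$line"
done < <(find . -type f)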
RanRag (6035 rep)
Oct 16, 2012, 08:35 PM • Last activity: Jun 20, 2025, 06:36 AM
6 votes
1 answer
703 views
How to find files by size that are not divisible by 4096 and round them up
In Linux, in a folder, there are many files, all created with fallocate with random sizes. How can I find the files whose size is not divisible by 4096 and round their size up to a multiple of 4096? They can be found with:

find . -type f -printf '%s\t%p\n' | awk -F'\t' '{size=$1; file=$2; if (size % 4096 != 0) print file}'

But how can I grow them so that the file size is divisible by 4096?
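A sketch that rounds each such file up in place, assuming bash, GNU find and GNU coreutils; truncate -s grows a file by padding it with zero bytes (a sparse hole), so if the tail must actually be allocated, fallocate -l with the same computed size is the alternative:

# Read NUL-terminated "size<TAB>name" records so unusual names survive.
find . -type f -printf '%s\t%p\0' |
while IFS=$'\t' read -r -d '' size file; do
    if (( size % 4096 != 0 )); then
        truncate -s $(( (size / 4096 + 1) * 4096 )) -- "$file"
    fi
done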
Banana (189 rep)
Jun 16, 2025, 01:34 AM • Last activity: Jun 16, 2025, 11:29 PM
0 votes
1 answer
55 views
Why does `find` have the tests `-anewer` and `-cnewer`, but not `-mnewer`?
(This is more a question out of curiosity and less a present problem that needs to be solved.) I recently learned more about time-related tests in GNU find, and something caught my eye: With files, you have three types of timestamps: access for when a file was last accessed, modify for when the file's content was last modified, and change for when the file's inode was last modified (by renaming the file, moving it to another directory, etc.). So accordingly, there are three different types of time-related tests in find. For example, you have -atime, -mtime and -ctime. You also have -anewer and -cnewer, but as far as I can tell there's no such thing as -mnewer. Is there a specific reason why there is no -mnewer test? Has it just not been implemented yet? Did it exist once and was it removed? I'm basically just curious. I've looked into this with a fairly recent version of find, version 4.9.0.
Henning Kockerbeck (111 rep)
Jun 16, 2025, 09:09 AM • Last activity: Jun 16, 2025, 09:13 AM
-4 votes
1 answer
85 views
How combination of find with sh solves problem with filenames
I was directed to this StackExchange post: Why is looping over find's output bad practice?

The core issue is that Unix filenames can contain any character except the null byte (\0). This means there is no printable character you can reliably use as a delimiter when processing filenames - newlines, spaces, and tabs can all appear in valid filenames. A common solution is to use a null byte as the delimiter, which is supported by GNU find with the -print0 option. This allows you to safely process filenames using tools like xargs -0 or while read -d ''. However, not all versions of find (especially non-GNU variants) support -print0. This raises the question: what should you do for portability when -print0 isn't available? Frankly, it seems that find implementations lacking -print0 are fundamentally flawed for robust scripting and should not be relied upon in scripts that need to handle arbitrary filenames safely.

There was the suggestion to use find in combination with sh:

find dirname ... -exec sh -c 'for f do somecommandwith "$f"; done' find-sh {} +

How does this fix the problem? One is still using a defective find. The link did not provide a clear explanation of why the combination of find with sh should work. How does it solve the problem?
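The point is that -exec hands each file name to sh as a separate argv entry; the names are never flattened into a delimited text stream, so no byte they contain (newline, space, tab) can split or merge entries, and "$f" expands each one verbatim. A sketch that makes this visible (the extra find-sh word only fills $0 of the inner shell):

find . -type f -exec sh -c '
    for f do
        printf "one argument: %s\n" "$f"
    done' find-sh {} +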
Filangieri (179 rep)
Jun 9, 2025, 05:15 PM • Last activity: Jun 9, 2025, 11:33 PM
2 votes
2 answers
98 views
find -not -name not working
I ran the command find /volume1/Ingest -not -name '*.txt' -mtime +1 -print and got the following output:
/volume1/Ingest/test/Cube Emergency Lighting Locations Aug 2018.pdf
/volume1/Ingest/Cube Microplex Live Recording info.pdf
/volume1/Ingest/stuff over 7 days will be automatically deleted.txt
But the last file matches *.txt, so what's going on?
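One thing worth checking: if the reported name does not literally end in .txt (a trailing space or carriage return, for example), then -not -name '*.txt' matches it and it is printed. A diagnostic sketch, assuming GNU find, that makes the true end of each name visible:

# The brackets expose trailing whitespace or control characters in the names.
find /volume1/Ingest -mtime +1 -printf '[%p]\n'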
Ben Edwards (121 rep)
Jun 3, 2025, 02:45 PM • Last activity: Jun 7, 2025, 03:48 PM
8 votes
4 answers
12755 views
Equivalent maxdepth for find in AIX
I'm trying to get all files matching a mask in some directory without recursively searching its subdirectories. There is no option -maxdepth 0 in AIX for that. I've heard about -prune, but still can't work out how it works. I guess the command should look something like:

find dir \( ! -name dir -prune -type f \) -a -name filemask

but it doesn't work. Could you please write a correct command for me and explain how it works? **UPD** It seems the command find dir ! -path dir -prune prints all files and directories in dir, but not those in dir/*, so I can use it for my case.
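A sketch of the classic portable idiom, which needs neither -maxdepth nor -path: start find at dir/. so the top directory is the only entry named '.', prune everything else before it can be descended into, and apply the tests to what was pruned.

# dir/. is named "."; every entry below it is not, so -prune stops the
# descent at the first level while -type and -name still filter the output.
find dir/. ! -name . -prune -type f -name 'filemask' -print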
Vikora (125 rep)
Jun 23, 2017, 03:04 PM • Last activity: Jun 5, 2025, 02:59 PM
32 votes
4 answers
29976 views
Find only the first few matched files using find
Say there are hundreds of *.txt files in a directory. I only want to find the first three *.txt files and then exit the searching process. How can I achieve this using the find utility? I had a quick look through its man pages but saw no option for this.
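find itself has no "stop after N matches" test, but the usual workaround is to let head cut the stream: once head has printed three lines and exited, find terminates the next time it tries to write into the broken pipe. A sketch:

find . -name '*.txt' -print | head -n 3

# GNU find can stop itself, but only after the first match:
# find . -name '*.txt' -print -quit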
mitnk (581 rep)
Mar 19, 2013, 08:19 AM • Last activity: Jun 5, 2025, 12:25 PM
37 votes
1 answer
42153 views
find and rsync?
I want to be able to search for files over 14 days old and over 10k, and then rsync those found files to a destination. Is there a way to combine these two commands?

find ./ -mtime +14 -size +10k
rsync --remove-sent-files -avz /src /dest
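A sketch of one way to combine them, assuming an rsync with --files-from support (--remove-source-files is the current spelling of --remove-sent-files): have find produce the list of matching files relative to the source and feed it to rsync on stdin.

cd /src &&
find . -type f -mtime +14 -size +10k -print0 |
rsync -avz --from0 --files-from=- --remove-source-files . /dest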
mkrouse (959 rep)
Aug 16, 2013, 05:58 PM • Last activity: Jun 2, 2025, 12:31 PM
0 votes
1 answer
48 views
list dependencies of all the shared libraries
I want to list the dependencies of all the shared libraries residing in a directory. I'm using the ldd -r command to list the dependencies of a given shared library and redirecting that output to a text file using >>. The problem I'm facing is that in the redirected output I'm not able to identify which shared library's dependencies are being listed.

Command

find /path/to/lib/directory/ -type f -name "*.so.*.*" -exec ldd -r {} \; >> ~/output.txt

Current Output

linux-vdso.so.1 (0x00007ffde59b0000)
libstdc++.so.6 => /lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007fbf1a168000)
libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007fbf1a13a000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007fbf19f28000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007fbf19e3f000)
/lib64/ld-linux-x86-64.so.2 (0x00007fbf1a595000)
linux-vdso.so.1 (0x00007ffd047fa000)
libstdc++.so.6 => /lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007f97509f9000)
libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007f97509cb000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f97507b9000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f97506d0000)
/lib64/ld-linux-x86-64.so.2 (0x00007f9750d8b000)
...

Expected Output

file1.so
linux-vdso.so.1 (0x00007ffde59b0000)
libstdc++.so.6 => /lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007fbf1a168000)
libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007fbf1a13a000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007fbf19f28000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007fbf19e3f000)
/lib64/ld-linux-x86-64.so.2 (0x00007fbf1a595000)

file2.so
linux-vdso.so.1 (0x00007ffd047fa000)
libstdc++.so.6 => /lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007f97509f9000)
libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007f97509cb000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f97507b9000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f97506d0000)
/lib64/ld-linux-x86-64.so.2 (0x00007f9750d8b000)
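A sketch that prints each library's own name before its ldd output (adjust the path as needed):

find /path/to/lib/directory/ -type f -name '*.so.*.*' -exec sh -c '
    for f do
        printf "%s\n" "$f"
        ldd -r "$f"
        printf "\n"
    done' find-sh {} + >> ~/output.txt

ldd also prints a "file:" heading on its own when it is given more than one argument, so find ... -exec ldd -r {} + may already be close to the expected output.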
Harry (165 rep)
May 28, 2025, 05:26 AM • Last activity: May 28, 2025, 06:10 AM
0 votes
1 answer
39 views
find or ls -l recursive output in YYYY-MM-DD format on FreeBSD
I'm trying to list files recursively with the date format YYYY-MM-DD on FreeBSD. I tried the Linux commands I know but none of them worked:
$ ls -lR --time-style=+'%Y-%m-%d %T'
ls: unrecognized option '--time-style=+%Y-%m-%d %T'
usage: ls [-ABCFGHIJLOPRSTUWYZabcdefghiklmnopqrstuwxyz1,] [-D format] [file ...]
$ find . -printf "%TY-%Tm-%Td %p\n"
find: -printf: unknown primary or operator
$
How can I do that? EDIT0: I added the word "recursive" to the title because it was only in the content of my question. BTW: I forgot that ls -lR is a pain to grep, so I'm more interested in a find-like solution :)
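A sketch using only tools from the FreeBSD base system: stat(1) can format the modification time with strftime specifiers, so find just has to hand it the names. (The usage message above also shows ls's native route: ls -lR -D '%Y-%m-%d %T'.)

# %Sm is the modification time rendered with the -t format; %N is the name.
find . -exec stat -f '%Sm %N' -t '%Y-%m-%d %H:%M:%S' {} +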
SebMa (2433 rep)
May 15, 2025, 04:04 PM • Last activity: May 15, 2025, 04:50 PM
7 votes
8 answers
18898 views
Remove numbers from the start of filenames
I have a problem modifying the file names in my Music/ directory. I have a list of names like these:

$ ls
01 American Idiot.mp3
01 Articolo 31 - Domani Smetto.mp3
01 Bohemian rapsody.mp3
01 Eye of the Tiger.mp3
04 Halo.mp3
04 Indietro.mp3
04 You Can't Hurry Love.mp3
05 Beautiful girls.mp3
16 Apologize.mp3
16 Christmas Is All Around.mp3
Adam's song.mp3
A far l'amore comincia tu.mp3
All By My Self.MP3
Always.mp3
Angel.mp3

and similar, and I would like to cut all the numbers in front of the filenames (not the 3 in the extension). I first tried to grep only the files starting with a number, using find -exec or xargs, but even at this first step I had no success. After being able to grep I'd like to do the actual name change. This is what I have tried so far:

ls > try-expression
grep -E '^[0-9]+' try-expression

and with the above I got the right result. Then I tried the next step:

ls | xargs -0 grep -E '^[0-9]+'
ls | xargs -d '\n' grep -E '^[0-9]+'
find . -name '[0-9]+' -exec grep -E '^[0-9]+' {} \;
ls | parallel bash -c "grep -E '^[0-9]+'" - {}

and similar, but I got errors like 'File name too long' or no output at all. I guess the problem is the way I'm using xargs or find, as the expressions in separate commands work well. Thank you for your help.
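The grep attempts go wrong largely because grep ends up searching file contents rather than filtering the names. A sketch of a plain loop that does the rename directly, assuming the tracks sit directly in Music/: the glob only picks up names that start with a digit, and sed strips the leading digits plus the following space.

cd ~/Music || exit            # path assumed
for f in [0-9]*.mp3 [0-9]*.MP3; do
    [ -e "$f" ] || continue                          # pattern matched nothing
    new=$(printf '%s\n' "$f" | sed 's/^[0-9]\{1,\} //')
    [ "$new" = "$f" ] || mv -i -- "$f" "$new"
done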
Luigi Tiburzi (887 rep)
May 29, 2012, 08:07 AM • Last activity: May 14, 2025, 11:19 AM