Commands tagged find (410)

  • This is a slightly modified version of http://www.commandlinefu.com/commands/view/4283/recursive-search-and-replace-old-with-new-string-inside-files (which did not work due to incorrect syntax), with the added restriction that sed only runs on files named filename.ext


    1
    find . -type f -name filename.ext -exec sed -i "s/oldstring/newstring/g" {} +
    eddieb · 2013-05-20 19:17:10 11
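    A cautious variant (a sketch, not part of the original entry): with GNU sed, -i.bak keeps a backup of every file it edits, so the in-place replacement can be undone.
    find . -type f -name filename.ext -exec sed -i.bak "s/oldstring/newstring/g" {} +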
  • Count the lines of your source and header files. This ignores blank lines, C++ style comments, and single-line C style comments. It will not ignore blank lines containing tabs or multi-line C style comments.


    1
    find /usr/include/ \( -name '*.[ch]pp' -o -name '*.[ch]' \) -print0 | xargs -0 cat | grep -v "^ *$" | grep -v "^ *//" | grep -v "^ */\*.*\*/" | wc -l
    unixmonkey44446 · 2013-06-17 08:37:37 11
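    A possible variant (a sketch, not from the original entry) that also skips blank lines containing tabs, using a POSIX whitespace class:
    find /usr/include/ \( -name '*.[ch]pp' -o -name '*.[ch]' \) -print0 | xargs -0 cat | grep -vE '^[[:space:]]*$' | grep -v "^ *//" | grep -v "^ */\*.*\*/" | wc -l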
  • Sorts files (here, those whose names contain "pdf") by latest modification time, searching the current directory and all subdirectories.


    1
    find . -name '*pdf*' -print0 | xargs -0 ls -lt | head -20
    fuats · 2013-10-03 21:58:51 9
  • Removes Cassandra snapshots older than 30 days. Better than: nodetool clearsnapshot


    1
    find /var/lib/cassandra/data -depth -type d -iwholename "*/snapshots/*" -mtime +30 -print0 | xargs -0 rm -rf
    mrwulf · 2013-11-08 14:47:29 6
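    Since this deletes data irrevocably, a dry run first is cheap (a sketch, not part of the original entry): print the matching snapshot paths before piping anything to rm.
    find /var/lib/cassandra/data -depth -type d -iwholename "*/snapshots/*" -mtime +30 -print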
  • Useful when you want to cron a daily deletion task that keeps only files less than one year old. The command excludes the .snapshot directory to prevent backup deletion. Append -delete to actually delete the files (see the cron sketch below): find /path/to/directory -not \( -name .snapshot -prune \) -type f -mtime +365 -delete


    1
    find /path/to/directory -not \( -name .snapshot -prune \) -type f -mtime +365
    cuberri · 2013-12-11 14:51:53 21
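    A hedged example of the cron job mentioned above (the schedule and path are placeholders): run the deletion every night at 03:00.
    0 3 * * * find /path/to/directory -not \( -name .snapshot -prune \) -type f -mtime +365 -delete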
  • Find files matching a pattern and sum their sizes with stat in the shell.


    1
    find . -name "pattern" -exec stat -c%s {} \; | awk '{total += $1} END {print total}'
    Koobiac · 2014-01-15 11:07:09 8
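    With GNU find, a variant that avoids spawning stat once per file (a sketch, not from the original entry):
    find . -name "pattern" -printf '%s\n' | awk '{total += $1} END {print total}'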

  • 1
    find * -regextype posix-extended -regex '.*\.(ext_1|ext_2)' -exec cp {} copy_target_directory \;
    shawn_abdushakur · 2014-02-06 15:58:32 8
  • Btrfs reports the inode numbers of files with failed checksums. Use `find` to look up the file names of those inodes. The files may need to be deleted and replaced with backups.


    1
    dmesg | grep -Po 'csum failed ino\S* \d+' | awk '{print $4}' | sort -u | xargs -n 1 find / -inum 2> /dev/null
    Sepero · 2014-03-22 12:22:46 9
  • Finds all symbolic links in the specified directory which match the specified string pattern. I used this when upgrading from an Apple-supported version of Java 6 (1.6.0_65) to an Oracle-supported version (1.7.0_55) on Mac OS X 10.8.5 to find out which executables were pointing to /System/Library/Frameworks/JavaVM.framework/Versions/Current/Commands (Apple version) vs. /Library/Java/JavaVirtualMachines/jdk1.7.0_55.jdk/Contents/Home/bin (Oracle version). However, it appears the current JDK installation script already takes care of modifying the links.


    1
    find directory -type l -lname string
    gumption · 2014-05-02 14:44:24 8
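    A hedged example of the usage described above (the search directory is an assumption): list links under /usr/bin that still point into the Apple JavaVM framework.
    find /usr/bin -type l -lname '*JavaVM.framework*'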
  • Find the biggest files in a directory.


    1
    find . -printf '%.5m %10M %#9u %-9g %TY-%Tm-%Td+%Tr [%Y] %s %p\n'|sort -nrk8|head
    AskApache · 2014-12-10 23:48:20 9
  • Defines a function that looks for files whose names contain the word given as a parameter. Case-insensitive; matches the word anywhere in the file name.


    1
    finame(){ find . -iname "*$1*"; }
    ivanalejandro0 · 2014-12-31 22:33:08 10
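    A hedged usage example (the search term is arbitrary):
    finame config   # lists every path under the current directory whose name contains "config", in any case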
  • With this version, you can list all symlinks in your home directory (no subdirectories), showing both the link and its target.


    1
    ls -l `find ~ -maxdepth 1 -type l -print`
    skittleys · 2015-01-04 02:36:47 7
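    The backtick form breaks on link names containing spaces; a space-safe sketch (an alternative, not the original author's command) uses -exec instead:
    find ~ -maxdepth 1 -type l -exec ls -l {} +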
  • Highlights non-ASCII characters in a file. Such characters can cause problems for applications that parse ASCII files.


    1
    grep --color='auto' -P -n '[^\x00-\x7F]' my_file.txt
    mrvkino · 2015-03-03 15:03:42 8
  • List all open files of all processes. find /proc/*/fd looks through the /proc file descriptors; -xtype f keeps only symlinks that point to regular files; -printf "%l\n" prints each symlink's target; grep -P '^/(?!dev|proc|sys)' ignores files under /dev, /proc and /sys; sort | uniq -c | sort -n counts the results. Many processes create and immediately delete temporary files; these can be filtered out by adding: ... | grep -v " (deleted)$" | ...


    1
    find /proc/*/fd -xtype f -printf "%l\n" | grep -P '^/(?!dev|proc|sys)' | sort | uniq -c | sort -n
    flatcap · 2015-08-18 17:58:21 11
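    With the filter for deleted temporary files mentioned in the description, the full pipeline might look like this (a sketch assembled from the entry itself):
    find /proc/*/fd -xtype f -printf "%l\n" | grep -v ' (deleted)$' | grep -P '^/(?!dev|proc|sys)' | sort | uniq -c | sort -n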
  • This lists all files modified after running some command, using a temporary file as a temporal anchor.


    1
    touch .tardis; the command ; find . -newer .tardis; rm .tardis;
    BeniBela · 2015-10-15 19:18:54 14
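    A hedged concrete example, with make install standing in for "the command" placeholder:
    touch .tardis; make install; find . -newer .tardis; rm .tardis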

  • 1
    find -L /path/to/check -type l | xargs rm
    sn0w · 2015-11-10 12:42:00 11
  • This is my favorite way to play music on my beloved Linux systems, server or desktop. Enjoy :-)


    1
    find /home/user/M?sica/ -type f -name "*.mp3" | shuf --head-count=20 --output=/home/user/playlist.m3u ; sort -R /home/user/playlist.m3u | mplayer -playlist -
    abaddon · 2016-06-10 03:04:40 20
  • This is handy in scripts when you have many generated files and want to know which one was created or changed last.


    1
    find . -type f -print0 | xargs -0 stat -c '%y %n' | sort -n -k 1,1 | awk 'END{print $NF}'
    emphazer · 2018-05-14 08:47:41 133

  • 1
    find . -regextype posix-egrep -regex '.+\.(c|cpp|h)$' -not -path '*/generated/*' -not -path '*/deploy/*' -print0 | xargs -0 ls -L1d
    berceanu · 2018-06-06 07:25:06 189
  • First the find command finds all files in your current directory (.). This is piped to xargs to be able to run the next shell pipeline in parallel. The xargs -P argument specifies how many processes you want to run in parallel; you can set this higher than your core count, as the duration reading is mainly IO-bound. The -print0 and -0 arguments of find and xargs respectively are used to safely handle files with spaces or other special characters. A subshell is executed by xargs to have a shell pipeline for each file that is found by find. This pipeline extracts the duration and converts it to a format easily parsed by awk. ffmpeg reads the file and prints a lot of information about it, grep extracts the duration line. cut and sed cut out the time information, and tr converts the last . to a : to make it easier to split by awk. awk is a specialized programming language for use in shell scripts. Here we use it to split the time elements into 4 variables and add them up.


    1
    find . -print0 | xargs -0 -P 40 -n 1 sh -c 'ffmpeg -i "$1" 2>&1 | grep "Duration:" | cut -d " " -f 4 | sed "s/.$//" | tr "." ":"' - | awk -F ':' '{ sum1+=$1; sum2+=$2; sum3+=$3; sum4+=$4 } END { printf "%.0f:%.0f:%.0f.%.0f\n", sum1, sum2, sum3, sum4 }'
    pingiun · 2019-03-01 20:21:48 40
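    The summed fields are not normalized (minutes or seconds may exceed 59). A hedged variant of the final awk stage converts everything to seconds first; the front of the pipeline is unchanged and assumed to emit HH:MM:SS:centiseconds:
    ... | awk -F ':' '{ t += $1*3600 + $2*60 + $3 + $4/100 } END { printf "%d:%02d:%02d\n", t/3600, (t%3600)/60, t%60 }'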
  • Replaces a string matching a pattern in one or several files found recursively in a particular folder.


    1
    find ./ -type f -name "somefile.txt" -exec sed -i -e 's/foo/bar/g' {} \;
    guillaume1306 · 2019-03-06 10:13:23 41
  • Although the need to type a password to make certain changes to the system may make perfect sense in a business or educational environment, it makes absolutely zero sense to the home user. So, if you’re at home and would rather get work done than be annoyed by what is essentially Linux’s UAC, then this command is for you.


    1
    sudo find /usr/share/polkit-1 -iname "*.policy" -exec sed -i "s/\(auth_admin\|auth_admin_keep\)/yes/g" {} \;
    realkstrawn93 · 2022-03-28 03:00:09 382
  • This example summarizes the size of all pdf files in the /tmp directory and its subdirectories (in bytes). Replace "/tmp" with a directory path of your choice and "\*pdf" (or even "-iname \*pdf") with your own pattern to match a specific type of file. You can also change the du parameter to count kilobytes or megabytes, but because of du's rounding the sum will not be exact (especially with lots of small files and megabyte counting). In some cases you could probably use something like this: du -cb `find /tmp -type f -iname \*pdf`|tail -n 1 But be aware that this second command CANNOT count files with spaces in their names, and it will cheat you if there are files matching the pattern that you don't have the rights to read. The first one-liner is resistant to such problems (it will not count the sizes of files you can't read, but will give you the correct sum of the rest).


    0
    SUM=0; for FILESIZE in `find /tmp -type f -iname \*pdf -exec du -b {} \; 2>/dev/null | cut -f1` ; do (( SUM += $FILESIZE )) ; done ; echo "sum=$SUM"
    alcik · 2009-03-05 17:16:52 8
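    On GNU find, a hedged alternative (not part of the original entry) that handles spaces in file names; the 2>/dev/null hides errors from directories find cannot enter:
    find /tmp -type f -iname '*.pdf' -printf '%s\n' 2>/dev/null | awk '{s += $1} END {print "sum=" s}'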

  • 0
    find /home/fizz -type f -printf '%TY-%Tm-%Td %TT %p\n' | sort
    fizz · 2009-05-20 10:45:39 6

  • 0
    find /var/logs -name '*' | xargs tar -jcpf logs_`date +%Y-%m-%d`.tar.bz2
    unixmonkey4063 · 2009-06-02 19:48:47 6
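    A null-safe sketch with GNU tar (an alternative, not the poster's command), avoiding problems with spaces in file names:
    find /var/logs -type f -print0 | tar --null -T - -jcpf logs_`date +%Y-%m-%d`.tar.bz2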