
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).

News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Terminal - Commands using find - 1,061 results
find <path> | xargs grep <pattern>
find /dev/disk/by-id -type l -printf "%l\t%f\n" | cut -b7- | sort
find . -name '._*' -type f -delete
2015-05-16 18:12:50
User: MarcLaf
Functions: find
Tags: mac os x
2

Searches from the present directory downward and removes all Mac-generated ._* (dot) files.
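
To be safe, you can preview the matches before adding -delete (a suggested dry run, not part of the original entry):

find . -name '._*' -type f -print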

find . -path "*/any_depth/*" -exec grep "needle" {} +
find . -type f -name '*' -exec md5sum '{}' + > hashes.txt
find -name pom.xml | while read f; do cd "$(dirname "$f")"; mvn clean; cd -; done
2015-04-15 21:24:49
User: glaudiston
Functions: cd dirname find read
-2

This command locates all pom.xml files, enters each containing directory, and runs mvn clean. I recommend disabling your network interfaces first so Maven doesn't download dependency packages, which makes it faster.
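
A variant sketch using GNU find's -execdir together with Maven's offline flag (-o), which avoids both the cd dance and the network access (assumes GNU find and that your dependencies are already cached locally):

find . -name pom.xml -execdir mvn -o clean \;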

find /PATHNAME -type l | while read nullsymlink ; do wrongpath=$(readlink "$nullsymlink") ; right=$(echo "$wrongpath" | sed 's|OLD_STRING|NEW_STRING|') ; ln -fs "$right" "$nullsymlink" ; done
2015-04-14 14:58:41
User: iDudo
Functions: echo find ln read readlink sed
0

After you run this script, you can check for any remaining broken symlinks with this command:

find -L . -type l

find . -type f -exec echo -n "touch -t \`echo " \; -exec echo -n {} \; -exec echo -n " | sed -E 's/.*([[:digit:]]{8})_([[:digit:]]{4})([[:digit:]]{2}).*/\1\2.\3/g'\` " \; -exec echo {} \; | sh
findfile() { find . -type f -iname "*${*}*" ; }
2015-01-01 03:15:51
User: Xk2c
Functions: find
Tags: find function
-4

Actually your function will find both files and directories whose names contain ${1}.

This one only finds files.

...and to look only for directories:

finddir() { find . -type d -iname "*${*}*" ; }
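
Usage sketch (the search term and the matched paths are hypothetical):

findfile vhost    # e.g. matches ./sites/vhost.conf and ./old/vhost.conf.bak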

finame(){ find . -iname "*$1*"; }
2014-12-31 22:33:08
Functions: find
Tags: find function
1

It looks for files whose names contain the word given as a parameter.

* case insensitive

* matches files containing the given word.

find . -name '*.php' | xargs wc -l
2014-12-24 11:15:18
User: erez83
Functions: find wc xargs
Tags: count code
0

Counts all the lines of code in a specific directory, recursively.

In this case only *.php files are counted.

The pattern can be changed to *.* to count everything.
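
If filenames may contain spaces, a null-separated variant is safer (a suggested tweak, not from the original entry):

find . -name '*.php' -print0 | xargs -0 wc -l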

find . -printf '%.5m %10M %#9u %-9g %TY-%Tm-%Td+%Tr [%Y] %s %p\n'|sort -nrk8|head
find /srv/code -maxdepth 4 -type f -regex ".*\(\(package\|composer\|npm\|bower\)\.json\|Gemfile\|requirements\.txt\|\.gitmodules\)"
2014-11-28 16:34:35
User: renoirb
Functions: find
Tags: bash git PHP ruby
0

List all dependencies manifests so you can install them.

In a scenario where you want to deploy a number of web applications and run their dependency managers, this helps you run all of them in a systematic order.

One of the complexities is ensuring you get only your own top-level dependencies. That way, you don't recursively pull in the development dependencies of your own dependencies.

Otherwise you might end up discovering dependency management manifests that have already been pulled in by your own projects.

# Using this command

This command helps me find them, and I can then run what's required to pull them from their respective sources (see the sketch after the list below).

This command assumes the following:

1. Your code checkouts are in a flat repository layout (i.e. not nested).

2. It finds manifests for:

- NPM (nodejs),

- Composer (php),

- bower,

- requirements.txt (Python), and

- git submodules
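
A follow-up sketch for dispatching each manifest to its package manager. The install commands below are common defaults and my own assumptions, not part of the original; adjust to your tooling:

find /srv/code -maxdepth 4 -type f -regex ".*\(\(package\|composer\|npm\|bower\)\.json\|Gemfile\|requirements\.txt\|\.gitmodules\)" | while read -r m; do
  d=$(dirname "$m")
  # assumed install commands for each manifest type
  case $(basename "$m") in
    package.json)     (cd "$d" && npm install) ;;
    composer.json)    (cd "$d" && composer install) ;;
    bower.json)       (cd "$d" && bower install) ;;
    Gemfile)          (cd "$d" && bundle install) ;;
    requirements.txt) (cd "$d" && pip install -r requirements.txt) ;;
    .gitmodules)      (cd "$d" && git submodule update --init) ;;
  esac
done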

find . -type f -name "*\?*" | while read f;do mv "$f" "${f//[^0-9A-Za-z.\/\(\)\ ]/_}";done
2014-11-28 14:55:27
User: miccaman
Functions: find mv read
Tags: bash find mv
2

For each filename containing a "?", replaces every character other than letters, digits, dots, slashes, parentheses and spaces (including the "?") with an underscore.
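
To see what would be renamed first, prefix mv with echo (a dry run, my suggestion):

find . -type f -name "*\?*" | while read f;do echo mv "$f" "${f//[^0-9A-Za-z.\/\(\)\ ]/_}";done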

find . -type d -name "*\?*" | while read f;do mv "$f" "${f//[^0-9A-Za-z.\/\(\)\ ]/_}";done
2014-11-28 14:52:46
User: miccaman
Functions: find mv read
Tags: bash find mv
0

Renames all directories with a "?" character in the name; spaces and () are left in place, other special characters become underscores.

touch -t 197001010000 ./tmp && find . -newer ./tmp && rm -f ./tmp
2014-11-18 00:29:26
User: sergeylukin
Functions: find rm touch
-1

Sometimes you just want to operate on files that were created after a specific date. This one-liner chains 3 commands:

- Create a dummy file with the custom date

- Find all files with a modification time later than our custom date by using find's `-newer` option. Add your crazy stuff here, like moving, deleting, printing, etc.

- Remove the dummy file
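
With GNU find the dummy file can be skipped entirely using the -newermt option (not available in every find implementation):

find . -newermt '1970-01-01 00:00'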

find . -name "*.pdf" -exec pdftk {} dump_data output \; | grep NumberOfPages | awk '{print $1,$2}'
2014-11-14 23:36:56
User: mtrgrrl
Functions: awk find grep
0

Using awk, changed the command given by sucotronic in command #11733 to print only the first and second columns.

find ./i18n -name "*.po" | while read f; do msgfmt "$f" -o "${f%.po}.mo"; done
2014-11-14 19:14:35
User: sergeylukin
Functions: find read
0

This command takes all `.po` files inside the `i18n` directory and compiles them to `.mo` files with the same basename.

find -type f -exec ffmpeg -i "{}" "{}".mp3 \;
find . -name '*.png' | xargs optipng -nc -nb -o7 -full
find . -name '*.jar' | xargs -l jar vtf | grep XXX.java
find -not -empty -type f -printf "%-30s'\t\"%h/%f\"\n" | sort -rn -t$'\t' | uniq -w30 -D | cut -f 2 -d $'\t' | xargs md5sum | sort | uniq -w32 --all-repeated=separate
2014-10-19 02:00:55
User: fobos3
Functions: cut find md5sum sort uniq xargs
1

Finds duplicates based on MD5 sum, comparing only files of the same size. A performance improvement on:

find -not -empty -type f -printf "%s\n" | sort -rn | uniq -d | xargs -I{} -n1 find -type f -size {}c -print0 | xargs -0 md5sum | sort | uniq -w32 --all-repeated=separate

The new version takes around 3 seconds where the old version took around 17 minutes. The bottleneck in the old command was the second find, which searches for the files with the specified file size. The new version keeps the file path and size together from the beginning.

find . -iname "*.mp4" -print0 | xargs -0 mv --verbose -t /media/backup/
cd tmp ; find . |cpio -o -H newc| gzip > ../initrd.gz
2014-09-24 14:07:54
User: akiuni
Functions: cd cpio find gzip
0

This command packs the "tmp" directory into a gzip-compressed newc-format cpio archive, i.e. an initrd file.
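
To sanity-check the result, you can list the archive contents (a quick verification, not from the original entry):

zcat ../initrd.gz | cpio -t | head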

for file in $(find /var/backup -name "backup*" -type f |sort -r | tail -n +10); do rm -f $file; done ; tar czf /var/backup/backup-system-$(date "+\%Y\%m\%d\%H\%M-\%N").tgz --exclude /home/dummy /etc /home /opt 2>&- && echo "system backup ok"
2014-09-24 14:04:11
User: akiuni
Functions: date echo file find rm sort tail tar
Tags: backup Linux cron
0

This command can be added to a crontab so as to execute a nightly backup of the given directories, storing only the 10 most recent backup files. The `\%` escapes in the date format are required because % is a special character in crontab entries.
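
A hypothetical crontab entry running the one-liner (saved as a script at an assumed path) nightly at 02:30:

30 2 * * * /usr/local/bin/nightly-backup.sh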