What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/




News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - Commands using dir - 18 results
csvcount() { for dir in $@; do echo -e "$(find $dir -name '*.csv' | wc -l)\t$dir"; done }
for dir in ~/git/*; do (cd "$dir" && git pull); done
find . -type d | while read dir ; do num=`ls -l $dir | grep '^-' | wc -l` ; echo "$num $dir" ; done | sort -rnk1 | head
dir /ad /s /b
dir -C -1 -N -RNCCI /dir/ > file.csv
dir="/bin"; man $(ls $dir |sed -n "$(echo $(( $RANDOM % $(ls $dir |wc -l | awk "{ print $1; }" ) + 1 )) )p")
2010-08-20 16:31:50
User: camocrazed
Functions: dir ls man sed
Tags: man sed awk echo wc
-2

Broaden your knowledge of the utilities available to you in no particular order whatsoever! Then use that knowledge to create more nifty one-liners that you can post here. =p

Takes a random number modulo the number of files in $dir, prints the filename corresponding to that number, and passes it as an argument to man.
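The same idea can be written more compactly with GNU shuf, which picks a random line directly (shuf is an assumption here, not part of the original command):

```shell
# Pick one random entry from $dir; shuf -n 1 outputs a single random line.
dir="/bin"
page=$(ls "$dir" | shuf -n 1)
echo "$page"      # the randomly chosen command
# man "$page"     # then read its manual page
```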

for dir in $(find -type d ! -name CVS); do for file in $(find $dir -maxdepth 1 -type f); do rm $file; cvs delete $file; done; done
2010-04-27 16:03:33
User: ubersoldat
Functions: cvs dir file find rm
Tags: bash cvs delete rm
1

This will search all directories and ignore the CVS ones. Then it will search all files in the resulting directories and act on them.
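The traversal itself can also be done in a single find invocation by pruning the CVS directories (a sketch of the skip-CVS part only, without the rm/cvs actions):

```shell
# List regular files while skipping the contents of any CVS directory.
# -prune stops find from descending into matched directories.
find . -name CVS -prune -o -type f -print
```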

dir='path to file'; tar cpf - "$dir" | pv -s $(du -sb "$dir" | awk '{print $1}') | tar xpf - -C /other/path
2010-01-19 19:05:45
User: starchox
Functions: awk dir du tar
Tags: copy tar cp
-2

This may seem like a long command, but it is great for making sure all file permissions are kept intact. It streams the files through tar in a subshell and untars them in the target directory. Note that the -z option should not be used for local copies: there is no performance gain, and the compression overhead (CPU) will actually slow the copy down.

You can also keep it simple with the following, but then you don't get the progress info:

cp -rpf /some/directory /other/path
for /f "delims==" %a in (' dir "%USERPROFILE%\*.sqlite" /s/b ') do echo vacuum;|"sqlite3.exe" "%a"
2010-01-18 20:56:00
User: vutcovici
Functions: dir echo
-3

This command defragments the SQLite databases found in the home folder of the current Windows user.

This is useful to speed up Firefox startup.

The executable sqlite3.exe must be located in PATH or in the current folder.

In a script use:

for /f "delims==" %%a in (' dir "%USERPROFILE%\*.sqlite" /s/b ') do echo vacuum;|"sqlite3.exe" "%%a"
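On a Linux system the equivalent vacuum loop might look like this (a sketch; the ~/.mozilla profile path is an assumption and varies between setups):

```shell
# Vacuum every SQLite database under a Firefox profile directory.
# The path is an assumed default; skip silently if it does not exist.
profile="${HOME}/.mozilla"
find "$profile" -name '*.sqlite' -exec sqlite3 {} 'VACUUM;' \; 2>/dev/null || true
```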
find dir -size -1024k -type f -print0 | du --files0-from - -bc
2009-12-29 01:33:55
User: bhepple
Functions: dir du find
Tags: size sum
2

The original didn't use -print0, so it failed on unusual file names, e.g. those containing spaces.

The original parsed the output of 'ls -l' which is always a bad idea.
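If only the grand total is wanted, appending tail to the fixed command keeps just du's summary line (a small variant, not from the original post; GNU du assumed):

```shell
# Total apparent size in bytes of files under 1024k; du -c appends a
# "total" line, and tail keeps only that line.
dir="."
find "$dir" -size -1024k -type f -print0 | du --files0-from=- -bc | tail -n 1
```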

find dir -size -1024k -type f | xargs -d $'\n' -n1 ls -l | cut -d ' ' -f 5 | sed -e '2,$s/$/+/' -e '$ap' | dc
2009-12-28 04:23:01
User: zhangweiwu
Functions: cut dir find ls sed xargs
Tags: size sum
1

The command gives the total size of all files smaller than 1024k. Together with the overall disk usage, this information can help determine file-system parameters (e.g. block size) or choose a storage device (e.g. SSD vs. HDD).

Note that if you naively use awk instead of "cut | dc", you can easily exceed the maximum number of records awk allows.
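A running total in awk sidesteps that limit, since it keeps one number instead of accumulating every size into a single record (a sketch; GNU stat -c and xargs -r are assumptions not in the original):

```shell
# Sum file sizes one record at a time; awk holds only a running total.
# xargs -r skips the stat call entirely when no files match.
dir="."
find "$dir" -size -1024k -type f -print0 | xargs -0 -r stat -c %s | awk '{s += $1} END {print s + 0}'
```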

dir=$(pwd); while [ ! -z "$dir" ]; do ls -ld "$dir"; dir=${dir%/*}; done; ls -ld /
2009-12-14 14:38:11
User: hfs
Functions: dir ls
2

Useful if a different user cannot access some directory and you want to know which directory along the path is missing the x bit.

for %f in (c) do dir %f:\*.jpg /s /p
2009-05-05 18:28:18
User: copremesis
Functions: dir
-5

There is no explicit find command in DOS, but you can create a batch file with this one-liner and find all JPEGs on the C: drive.

note: if creating a batch file "find.bat" the syntax changes to:

for %%f in (c) do dir %%f:\%1 /s /p

you can then use

find *.jpg
for dir in `ls -A | grep -v .sh`; do chown -R "$dir:$dir" "$dir"; done
date -d "@$(find dir -type f -printf '%C@\n' | sort -n | sed -n "$(($(find dir -type f | wc -l)/2))p")" +%F
2009-03-24 18:48:49
User: allengarvin
Functions: date dir find wc
-1

I needed to get a feel for how "old" different websites were, based on their directories.

for dir in $(ls); do du -sk ${dir}; done
2009-03-24 13:42:55
User: morlockhq_
Functions: dir du
-15

Sometimes you want the summary sizes of directories without seeing the details of their subdirectories, especially if those details would just scroll off the screen. This one-liner summarizes the disk usage of each directory in the current directory without detailing what's happening underneath.
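du accepts multiple arguments directly, so the loop is unnecessary; sorting the output is a small bonus (a sketch, assuming the current directory has subdirectories):

```shell
# One du summary line per subdirectory of the current directory,
# smallest first; the trailing slash in '*/' matches only directories.
du -sk -- */ | sort -n
```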

[[ -d dir ]] || mkdir dir ; cd dir
2009-03-12 17:19:13
User: voyeg3r
Functions: cd dir mkdir
1

This command is very useful in scripts.
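The same effect fits in one step with mkdir -p, which succeeds whether or not the directory already exists ("dir" is a placeholder name, as in the original):

```shell
# mkdir -p creates the directory only if missing and never fails when it
# already exists, so the test-then-make dance is unnecessary.
mkdir -p dir && cd dir
```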

export IFS=$'\n';for dir in $( ls -l | grep ^d | cut -c 52-);do du -sh $dir; done