What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that receive at least 3 votes and at least 10 votes, so only the best commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Terminal - Commands by unixmonkey8504 - 9 results
Server: nc -l 1234 |tar xvfpz - ;Client: tar zcfp - /path/to/dir | nc localhost 1234
2010-03-02 14:24:04
Functions: tar

Create a tarball on the client and stream it across the network with netcat on port 1234; on the server it is extracted into the current directory.
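As a sketch with the two halves shown in the order they are run (192.168.1.10 stands in for the server's address and is only an assumption here), the listener starts first and the client then connects to it instead of to localhost:

# on the server, started first: listen on port 1234 and unpack the incoming stream
nc -l 1234 | tar xvzpf -
# on the client: pack /path/to/dir and stream it to the server
tar czpf - /path/to/dir | nc 192.168.1.10 1234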

while read f;do echo "$f";done < <(find .)
2010-03-02 14:22:22
Functions: echo find read

List every file and directory under the current directory and print each path to stdout. Because the loop reads from a process substitution instead of a pipe, it runs in the current shell, so variables set inside the loop keep their values afterwards.
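A minimal sketch of why the process substitution matters, using a hypothetical counter (the IFS= and -r additions are mine, to keep unusual filenames intact):

count=0
while IFS= read -r f; do
    echo "$f"
    count=$((count+1))
done < <(find .)
echo "processed $count paths"    # count keeps its value because the loop ran in the current shell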

find . |while read f;do echo "$f";done
2010-03-02 14:21:15
Functions: echo find read

Same listing as above, but here the pipe runs the while loop in a subshell, so any variables modified inside the loop are lost when it finishes.
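A short sketch of that caveat, again with a hypothetical counter:

count=0
find . | while read -r f; do
    count=$((count+1))
done
echo "$count"    # prints 0: the increments happened in the subshell and are gone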

find . -name "*.php" -exec perl -pi -e 's/search/replace/g' {} \;
find . -type f -exec echo http://exg.com/{} \; > file
2010-03-02 14:18:01
Functions: echo find

Find all files under the current directory, prefix each path with a base URL, and write the resulting list to a file (note that > overwrites the file; use >> to append). Since find prints paths as ./name, the leading ./ ends up in the URLs unless it is stripped.
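One possible way to avoid the ./ in the output, assuming GNU find (the exg.com base URL is just the example from the command above):

find . -type f -printf 'http://exg.com/%P\n' > file    # %P prints each path without the leading ./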

ssh user@host 'tar cvzf - -C /path/to/src .' | tar xzf -
2010-03-02 14:15:17
Functions: ssh tar

Create a tarball of /path/to/src on the remote host and stream it over ssh to the local machine, where tar reads it from stdin and extracts it into the current directory (user@host is a placeholder for the remote machine).
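A sketch of the reverse direction, pushing a local directory to the remote machine (user@host and /path/to/dest are placeholders):

tar czf - -C /path/to/src . | ssh user@host 'tar xzf - -C /path/to/dest'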

url="$my_url";file=$(youtube-dl -s -e $url);wget -q -O - `youtube-dl -b -g $url`| ffmpeg -i - -f mp3 -vn -acodec libmp3lame - > "$file.mp3"
wget -erobots=off --user-agent="Mozilla/5.0 (X11; U; Linux i686; en-US; rv: Gecko/2008092416 Firefox/3.0.3" -H -r -l2 --max-redirect=1 -w 5 --random-wait -PmyBooksFolder -nd --no-parent -A.pdf http://URL
2010-03-01 20:22:44
Functions: wget

Masquerade the user agent as Firefox and recursively download two levels deep, spanning hosts, following at most one redirect, with a random wait between requests, dumping all PDF files into myBooksFolder without recreating the remote directory structure. The browser user agent and random waits make the requests look less like an automated grabber.

sed 's+href="\([^"]*\)"+\n\1\n+g' bookmarks.html | grep '^http' |clive
2010-03-01 20:17:22
Functions: grep sed

Parses your exported bookmarks file, extracts a clean list of URLs (one per line), and passes it to clive, which tries to download the video from each supported site.
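A quick way to sanity-check the extraction before handing anything to clive (same sed expression as above):

sed 's+href="\([^"]*\)"+\n\1\n+g' bookmarks.html | grep '^http' | head    # preview the first extracted URLs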