What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Commands tagged cut - 61 results
for each in `cut -d " " -f 1 inputfile.txt`; do echo "select * from table where id = \"$each\";"; done
2009-09-23 13:29:16
User: hfs
Functions: echo
Tags: echo cut for-each
Votes: 0

I can never remember the syntax of awk. You can give a different -d option to cut to separate by, for example, commas. This also lets you do more with the generated SQL, e.g. redirect it into different files.
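
For instance, with a comma-separated input and the output redirected into a SQL script, it could look like this (the file and table names here are made up for illustration):

for each in `cut -d "," -f 1 users.csv`; do echo "select * from users where id = \"$each\";"; done > queries.sql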

du -a --max-depth=1 | sort -n | cut -d/ -f2 | sed '$d' | while read i; do if [ -f "$i" ]; then du -h "$i"; else echo "$(du -h --max-depth=0 "$i")/"; fi; done
2009-09-03 20:43:43
User: nickwe
Functions: cut du echo read sed sort
Votes: 3

Based on MrMerry's one, just adding some visuals to differentiate files and directories.
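
Sample output might look roughly like this (names and sizes are invented), with directories marked by a trailing slash:

4.0K	notes.txt
16M	photos/
2.1G	backups/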

find . -maxdepth 1 -type d|xargs du -a --max-depth=0|sort -rn|cut -d/ -f2|sed '1d'|while read i;do echo "$(du -h --max-depth=0 "$i")/";done;find . -maxdepth 1 -type f|xargs du -a|sort -rn|cut -d/ -f2|sed '$d'|while read i;do du -h "$i";done
2009-09-03 20:33:21
User: nickwe
Functions: cut du echo find read sed sort xargs
Votes: 2

Based on MrMerry's one, just adding some visuals and sorting directories and files separately.

for u in `cut -f1 -d: /etc/passwd`; do echo -n $u:; groups $u; done | sort
2009-08-22 09:06:02
User: hemanth
Functions: echo groups
Tags: sort cut for groups
Votes: 3

"cut" the user names from /etc/passwd and then running a loop over them.

dd if=/dev/urandom count=200 bs=1 2>/dev/null | tr "\n" " " | sed 's/[^a-zA-Z0-9]//g' | cut -c-16
grep -Eho '<[a-zA-Z_][a-zA-Z0-9_:-]*' * | sort -u | cut -c2-
2009-08-05 21:54:29
User: inkel
Functions: cut grep sort
Tags: sort grep cut xml
Votes: 0

This one will work a little better; the regular expression is not 100% accurate for XML parsing, but it will suffice for any valid XML document.
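
Run over a handful of XHTML files, for example, the result is just the sorted list of distinct element names (the output below is illustrative):

a
body
div
head
html
p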

echo -e "HEAD / HTTP/1.1\nHost: slashdot.org\n\n" | nc slashdot.org 80 | head -n5 | tail -1 | cut -f2 -d-
grep -or string path/ | wc -l
grep -rc logged_in app/ | cut -d : -f 2 | awk '{sum+=$1} END {print sum}'
2009-07-15 14:16:44
User: terceiro
Functions: awk cut grep
Votes: -2

grep's -c outputs how many matches there are for a given file as "file:N"; cut takes the N's and awk does the sum.
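
As an illustration (paths and counts are made up), grep -rc emits one "file:count" line per file before cut and awk reduce it to a single number:

app/models/user.rb:3
app/controllers/sessions_controller.rb:5
app/views/login.html.erb:0

cut -d : -f 2 keeps just 3, 5 and 0, and awk prints their sum, 8.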

grep -h -o '<[^/!?][^ >]*' * | sort -u | cut -c2-
2009-06-17 00:22:18
User: thebodzio
Functions: cut grep sort
Tags: sort grep cut
Votes: 2

This set of commands was very convenient for me when I was preparing some XML files for typesetting a book. I wanted to check what styles I had to prepare but couldn't remember all the tags that I had used. This one saved me from error-prone browsing of all my files. It should also be useful if one tries to process XML files with XSL, when using one's own XML application.

netstat -ntu | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -n
2009-03-28 21:02:26
User: tiagofischer
Functions: awk cut netstat sort uniq
Votes: 14

Here is a command line to run on your server if you think your server is under attack. It prints out a list of open connections to your server and sorts them by count.
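
Since sort -n puts the largest counts last, the busiest remote addresses end up at the bottom; a run might look like this (addresses and counts are fabricated):

      1 10.0.0.5
      3 192.0.2.17
     42 203.0.113.80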

BSD Version:

netstat -na |awk '{print $5}' |cut -d "." -f1,2,3,4 |sort |uniq -c |sort -nr