
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get a minimum of 3 or 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands using cut - 452 results
cut -d ' ' -f 1 /var/log/apache2/access_logs | sort | uniq -c | sort -n
2013-09-17 20:05:03
User: BorneBjoern
Functions: cut sort uniq
Votes: 0

Avoiding UUOC (Useless Use Of Cat)!

cut can handle files as well. No need for a cat.
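
For illustration, both forms below read the same log and produce identical output; the second simply spares the extra cat process:

cat /var/log/apache2/access_logs | cut -d ' ' -f 1   # UUOC: cat feeds a file that cut could open itself
cut -d ' ' -f 1 /var/log/apache2/access_logs         # same result, one process fewer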

grep "10/Sep/2013" access.log| cut -d[ -f2 | cut -d] -f1 | awk -F: '{print $2":"$3}' | sort -nk1 -nk2 | uniq -c | awk '{ if ($1 > 10) print $0}'
netstat -ntu | awk ' $5 ~ /^(::ffff:|[0-9|])/ { gsub("::ffff:","",$5); print $5}' | cut -d: -f1 | sort | uniq -c | sort -nr
2013-09-10 19:28:06
User: mrwulf
Functions: awk cut netstat sort uniq
Votes: 1

Same as the rest, but also handles IPv4-mapped IPv6 addresses (the ::ffff: prefix). Also sorts in the order you're probably looking for.

cat /var/log/apache2/access_logs | cut -d' ' -f1 | sort | uniq -c | sort -n
2013-09-07 23:57:31
User: while0pass
Functions: cat cut sort uniq
Votes: 1

The first sort is necessary so that identical IPs end up adjacent and uniq -c actually counts them.
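
A quick illustration of why (toy addresses): uniq -c only collapses adjacent duplicates, so unsorted input is miscounted:

printf '1.1.1.1\n2.2.2.2\n1.1.1.1\n' | uniq -c          # two separate runs of 1.1.1.1
printf '1.1.1.1\n2.2.2.2\n1.1.1.1\n' | sort | uniq -c   # correct: 2 1.1.1.1, 1 2.2.2.2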

for I in $(awk -v LIMIT=500 -F: '($3>=LIMIT) && ($3!=65534)' /etc/passwd | cut -f 1-1 -d ':' | xargs); do usermod -g YOURGROUP $I ; done
for i in `ip addr show dev eth1 | grep inet | awk '{print $2}' | cut -d/ -f1`; do echo -n $i; echo -en '\t'; host $i | awk '{print $5}'; done
cat /var/log/apache2/access_logs | cut -d ' ' -f 1 | sort | uniq -c | sort -n
2013-09-02 13:04:47
User: basvdburg
Functions: cat cut sort uniq
Votes: 1

Shows, per IP, how many requests were made to the Apache web server.

diff <(ssh-keygen -y -f ~/.ssh/id_rsa) <(cut -d' ' -f1,2 ~/.ssh/id_rsa.pub)
mogrify -format gif -auto-orient -thumbnail 250x90 '*.JPG'&&(echo "<ul>";for i in *.gif;do basename=$(echo $i|rev|cut -d. -f2-|rev);echo "<li style='display:inline-block'><a href='$basename.JPG'><img src='$basename.gif'></a>";done;echo "</ul>")>list.html
2013-08-25 20:45:49
User: ysangkok
Functions: cut echo
Votes: 1

The input images are assumed to have the "JPG" extension. Mogrify will overwrite any GIF images with the same name! Will not work with filenames containing spaces.
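
A rough space-tolerant sketch of the same idea, replacing the rev|cut suffix-stripping with shell parameter expansion (same JPG assumption as above; untested):

mogrify -format gif -auto-orient -thumbnail 250x90 '*.JPG'
(echo "<ul>"
for i in *.gif; do
  base=${i%.gif}   # strip the .gif suffix; safe for names with spaces
  echo "<li style='display:inline-block'><a href='$base.JPG'><img src='$base.gif'></a>"
done
echo "</ul>") > list.html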

curl -s poncho.is/forecast/new_york/today/ | grep -E 'og:title|og:description' | cut -d\" -f4 | awk '{print $0,"<p>"}' | lynx -stdin -dump
diff <(sort <(md5deep -r /directory/1/) |cut -f1 -d' ') <(sort <(md5deep -r /directory/2/) |cut -f1 -d' ')
2013-08-18 22:13:07
Functions: cut diff sort
Tags: bash Linux diff
Votes: 0

Compute the md5 checksums for the contents of two mirrored directories, then sort and diff the results. If everything matches, nothing is returned. Otherwise, any checksums which do not match, or which exist in one tree but not the other, are returned. As you might imagine, the output is useful only if no errors are found, because only the checksums, not filenames, are returned. I hope to address this, or that someone else will!
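
One possible way to keep the filenames in the comparison (a sketch, assuming md5deep's -l flag for relative paths, so matching files in both trees produce identical lines):

diff <(cd /directory/1 && md5deep -l -r . | sort) <(cd /directory/2 && md5deep -l -r . | sort)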

ip addr show |grep -w inet | grep -v 127.0.0.1 | awk '{ print $2}'| cut -d "/" -f 2
2013-08-17 18:54:23
User: htmlgifted
Functions: awk cut grep
Votes: 0

This command gives you just the CIDR prefix length of each configured address.
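
For example, with an interface configured as 192.168.1.10/24 (made-up address), the awk stage emits the address/prefix pair and the final cut keeps only what follows the slash:

echo "192.168.1.10/24" | cut -d "/" -f 2   # prints: 24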

find . -type f -print0 | xargs -0 stat -c'%Y :%y %12s %n' | sort -nr | cut -d: -f2- | head
2013-08-03 09:53:46
User: HerbCSO
Functions: cut find sort stat xargs
Votes: 4

Goes through all files in the specified directory, uses `stat` to print out the last modification time, sorts numerically in reverse, then uses cut to remove the epoch timestamp and finally head to output only the 10 most recently modified files.

Note that on a Mac `stat` won't work like this; you'll need to use either:

find . -type f -print0 | xargs -0 stat -f '%m%t%Sm %12z %N' | sort -nr | cut -f2- | head

or alternatively do a `brew install coreutils` and then replace `stat` with `gstat` in the original command.
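
With coreutils installed, the gstat variant would presumably be the original command verbatim with only the program name swapped (untested sketch):

find . -type f -print0 | xargs -0 gstat -c'%Y :%y %12s %n' | sort -nr | cut -d: -f2- | head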

for m in `df -P | awk -F ' ' '{print $NF}' | sed -e "1d"`;do n=`df -P | grep "$m$" | awk -F ' ' '{print $5}' | cut -d% -f1`;i=0;if [[ $n =~ ^-?[0-9]+$ ]];then printf '%-25s' $m;while [ $i -lt $n ];do echo -n '=';let "i=$i+1";done;echo " $n";fi;done
2013-07-29 20:12:39
User: drockney
Functions: awk cut echo grep printf sed
Tags: bash
Votes: 5

Automatically drops mount points that have non-numeric sizes (e.g. /proc). Tested in bash on Linux and AIX.
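
Illustrative output (mount points and usage percentages made up):

/                        ============ 12
/home                    =========================================== 43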

du -hs `du -sk * | sort -rn | cut -f2-`
cut -d, -f1 /var/opt/example/dumpfile.130610_subscriber.csv | cut -c3-5 | sort | uniq -c | sed -e 's/^ *//;/^$/d' | awk -F" " '{print $2 "," $1}' > SubsxPrefix.csv
2013-07-17 07:58:56
User: neomefistox
Functions: awk cut sed sort uniq
Tags: Linux UNIX
Votes: 0

dumpfile is a CSV file whose 1st field is a phone number in the format CC+10 digits.

Empty lines are deleted before the output is written in the format "prefix,occurrences".
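
For example, with country code 34 and a made-up subscriber number, the first cut grabs field 1 and the second picks characters 3-5, i.e. the prefix:

echo "345551234567,rest-of-line" | cut -d, -f1 | cut -c3-5   # prints: 555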

find . -type d -print0 | xargs -0 du -s | sort -n | tail -10 | cut -f2 | xargs -I{} du -sh {} | sort -rn
salt -G 'os_family:Debian' cmd.run 'apt-get dist-upgrade --dry-run | grep ^Inst | cut -d" " -f2'
rcs_changes(){ rcsdiff -y --suppress-common-lines "$1" 2>/dev/null | cut -d'|' -f2; }
svn st | grep ! | cut -c 9- | while read line;do svn resolved $line;done
sudo apt-cache dumpavail | grep Package | cut -d ' ' -f 2 > available.packages
curl -k https://Username:Password@api.del.icio.us/v1/posts/all?red=api | xml2| \grep '@href' | cut -d\= -f 2- | sort | uniq | linkchecker -r0 --stdin --complete -v -t 50 -F blacklist
2013-05-04 17:43:21
User: bbelt16ag
Functions: cut sort uniq
Votes: -1

This command queries the delicious API, runs the XML through xml2, grabs the @href lines, cuts away the leading attribute name, passes the result through sort and uniq to remove any duplicates, and then hands it to linkchecker, which checks the links. The links go into the blacklist in ~/.linkchecker/blacklist. Please see the manual pages for further info. It took me a few days to figure this one out; I hope you enjoy it. Also, don't run this against the API more than once every few seconds or you can get banned by delicious; see their site for info. ~updated for no recursion
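
Broken across lines with comments, the same pipeline reads as follows (credentials left as placeholders, as in the original; bash accepts a newline after each pipe):

curl -k https://Username:Password@api.del.icio.us/v1/posts/all?red=api |
  xml2 |                # flatten the XML into line-oriented output
  \grep '@href' |       # keep only the href attribute lines
  cut -d\= -f 2- |      # drop the attribute name, keep the URL
  sort | uniq |         # remove any duplicates
  linkchecker -r0 --stdin --complete -v -t 50 -F blacklist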

groups $(cut -f1 -d":" /etc/passwd) | sort
ifconfig -a | awk '/Bcast/{print $2}' | cut -c 5-19