
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get a minimum of 3 or 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands require moderation before they will appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.


Terminal - Commands using cut - 468 results
dpkg -l linux-* | awk '/^ii/{ print $2}' | grep -v -e `uname -r | cut -f1,2 -d"-"` | grep -e [0-9] | xargs sudo apt-get -y purge
phpunit --log-json php://stdout | awk '$NF ~ '/,/' && $1 ~ /"(test|time)"/' | cut -d: -f2- | sed "N;s/\n/--/" | sed "s/,//"| awk 'BEGIN{FS="--"}; {print $2 $1}' | sort -r | head -n 5
gem list | cut -d" " -f1 | grep --invert-match "test-unit\|psych\|io-console\|rdoc\|json\|bigdecimal\|rake\|minitest" | xargs gem uninstall -aIx
2014-01-10 10:58:59
User: gdelfino
Functions: cut grep xargs
-1

This command removes all ruby gems except the default ones, which cannot be removed. It is based on http://geekystuff.net/2009/01/14/remove-all-ruby-gems/
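To preview what would be removed before committing, a cautious variant is the same pipeline with the xargs stage dropped, so it only prints the list:

gem list | cut -d" " -f1 | grep --invert-match "test-unit\|psych\|io-console\|rdoc\|json\|bigdecimal\|rake\|minitest"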

ps -eo pmem,comm | grep chrome | cut -d " " -f 2 | paste -sd+ | bc
2014-01-03 15:33:16
User: Darkstar
Functions: cut grep paste ps
0

This command sums the %MEM column (pmem is the percentage of physical memory, not an absolute figure), so it shows the total share of memory used by a program that spawns multiple instances of itself. Replace chrome with whatever program's memory usage you are investigating. This command is rather useless on software that only spawns a single instance of itself.
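A sketch of a variant that reports an absolute figure instead, assuming GNU ps on Linux (rss is the resident set size in KiB; note RSS double-counts memory shared between the instances):

ps -eo rss,comm | awk '/chrome/ {sum+=$1} END {print sum/1024 " MiB"}'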

find /mnt/storage/profiles/ -maxdepth 1 -mindepth 1 -type d | while read d; do tarfile=`echo "$d" | cut -d "/" -f5`; destdir="/local/backupdir/"; tar -g "$destdir"/"$tarfile".snar -czf "$destdir"/"$tarfile"_`date +%F`.tgz -P $d; done
find /mnt/storage/profiles/ -maxdepth 1 -mindepth 1 -type d | while read d; do tarfile=`echo "$d" | cut -d "/" -f5`; destdir="/local/backupdir"; tar -czvf "$destdir"/"$tarfile"_`date +%F`.tgz -P $d; done
2013-12-05 19:18:03
User: jaimerosario
Functions: cut find read tar
1

Problem: I wanted to back up user data individually, using an incremental method. In this example, all user data is located in "/mnt/storage/profiles", with about 25 folders inside, each named after a user ( /mnt/storage/profiles/mike; /mnt/storage/profiles/lucy ...)

I need each individual folder backed up, not the whole "/mnt/storage/profiles". So, using find to list each profile directory and two variables (tarfile=username & destdir=destination), tar creates a .tgz file for each folder, resulting in "mike_2013-12-05.tgz" and "lucy_2013-12-05.tgz".
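A minimal sketch of restoring one user from such a backup with GNU tar; the archive name is an assumption, -P matches the absolute paths stored at creation time, and /dev/null stands in for the snapshot file, which is only consulted when creating incremental archives:

tar -xzPf /local/backupdir/mike_2013-12-05.tgz -g /dev/null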

find /mnt/storage/profiles/ -maxdepth 1 -mindepth 1 -type d | while read d; do tarfile=`echo "$d" | cut -d "/" -f5`; destdir="/local/backupdir/"; tar -czf $destdir/"$tarfile"_full.tgz -P $d; done
2013-12-05 19:07:17
User: jaimerosario
Functions: cut find read tar
1

Problem: I wanted to back up user data individually. In this example, all user data is located in "/mnt/storage/profiles", with about 25 folders inside, each named after a user ( /mnt/storage/profiles/mike; /mnt/storage/profiles/lucy ...)

I need each individual folder backed up, not the whole "/mnt/storage/profiles". So, using find to list each profile directory and two variables (tarfile=username & destdir=destination), tar creates a .tgz file for each folder, resulting in "mike_full.tgz" and "lucy_full.tgz".

getent passwd `whoami` | cut -d ':' -f 5
2013-12-05 01:37:44
User: timhughes
Functions: cut getent passwd
0

Doesn't require finger and should work regardless of the underlying auth mechanism.
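For illustration, with a hypothetical account: field 5 of the passwd entry is the GECOS field, which holds the full name.

$ getent passwd `whoami`
jdoe:x:1000:1000:John Doe:/home/jdoe:/bin/bash
$ getent passwd `whoami` | cut -d ':' -f 5
John Doe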

wget "us.download.nvidia.com$(wget -qO- "$(wget -qO- "nvidia.com/Download/processFind.aspx?psid=95&pfid=695&osid=19&lid=1&lang=en-us"|awk '/driverResults.aspx/ {print $4}'|cut -d "'" -f2|head -n 1)"|awk '/url=/ {print $2}'|cut -d '=' -f3|cut -d '&' -f1)"
2013-11-21 03:04:59
User: lowjax
Functions: awk cut head wget
1

Download the latest NVIDIA GeForce x64 Windows 7-8 driver from Nvidia's website. Pulls the latest version (which includes betas). This is the "English" version; the following command includes a 'sed' stage to replace "english" with "international" if needed. You can also replace the starting subdomain with "eu.", "uk." and others. Enjoy this one-liner! 1 character under the max :)

wget "us.download.nvidia.com$(wget -qO- "$(wget -qO- "nvidia.com/Download/processFind.aspx?psid=95&pfid=695&osid=19&lid=1&lang=en-us" | awk '/driverResults.aspx/ {print $4}' | cut -d "'" -f2 | head -n 1)" | awk '/url=/ {print $2}' | sed -e "s/english/international/" | cut -d '=' -f3 | cut -d '&' -f1)"
wget --no-check-certificate https://www.kernel.org/$(wget -qO- --no-check-certificate https://www.kernel.org | grep tar | head -n1 | cut -d\" -f2)
vboxmanage list runningvms | cut -d \" -f 2 | xargs -I % vboxmanage controlvm % poweroff
find . -type f -not -empty -printf "%-25s%p\n"|sort -n|uniq -D -w25|cut -b26-|xargs -d"\n" -n1 md5sum|sed "s/ /\x0/"|uniq -D -w32|awk -F"\0" 'BEGIN{l="";}{if(l!=$1||l==""){printf "\n%s\0",$1}printf "\0%s",$2;l=$1}END{printf "\n"}'|sed "/^$/d"
2013-10-22 13:34:19
User: alafrosty
Functions: awk cut find sed sort uniq xargs
0

* Find all file sizes and file names from the current directory down (replace "." with a target directory as needed).

* sort the file sizes in numeric order

* List only the duplicated file sizes

* drop the file sizes so there are simply a list of files (retain order)

* calculate md5sums on all of the files

* replace the first instance of two spaces (md5sum output) with a \0

* drop the unique md5sums so only duplicate files remain listed

* Use AWK to aggregate identical files on one line.

* Remove the blank line from the beginning (This was done more efficiently by putting another "IF" into the AWK command, but then the whole line exceeded the 255 char limit).

Each output line contains the md5sum and then all of the files that have that identical md5sum. All fields are \0 delimited. All records are \n delimited.
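A sketch of making that output human-readable, reusing the same -F"\0" trick the command itself relies on (gawk assumed; empty fields from doubled delimiters are skipped):

... | awk -F"\0" '{print "md5: " $1; for (i=2; i<=NF; i++) if ($i != "") print "    " $i}'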

openssl rand -hex 1 | tr '[:lower:]' '[:upper:]' | xargs echo "obase=2;ibase=16;" | bc | cut -c1-6 | sed 's/$/00/' | xargs echo "obase=16;ibase=2;" | bc | sed "s/$/:$(openssl rand -hex 5 | sed 's/\(..\)/\1:/g; s/.$//' | tr '[:lower:]' '[:upper:]')/"
2013-10-22 08:40:46
User: 4fd
Functions: bc cut echo sed tr xargs
0

I did not come up with this one myself, but found it somewhere else several months ago. It prints a random MAC address: a random first octet forced to end in binary 00, followed by five random octets, all as colon-separated uppercase hex.
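A shorter sketch of the same idea, assuming bash and OpenSSL (a hypothetical rewrite, not from the site): the first octet's two low bits are cleared so it ends in binary 00, then five random octets are appended.

printf '%02X%s\n' $(( 0x$(openssl rand -hex 1) & 0xFC )) "$(openssl rand -hex 5 | sed 's/../:&/g' | tr 'a-f' 'A-F')"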

t2s() { wget -q -U Mozilla -O $(tr ' ' _ <<< "$1"| cut -b 1-15).mp3 "http://translate.google.com/translate_tts?ie=UTF-8&tl=en&q=$(tr ' ' + <<< "$1")"; }
2013-10-16 23:29:59
User: snipertyler
Functions: cut tr wget
13

Usage: t2s 'How are you?'

Nice because it automatically names the mp3 file, using up to 15 characters of the text.

Modified (uses bash parameter manipulation instead of tr):

t2s() { wget -q -U Mozilla -O "$(cut -b 1-15 <<< "${1// /_}").mp3" "http://translate.google.com/translate_tts?ie=UTF-8&tl=en&q=${1// /+}"; }

for i in `cat /proc/mounts | grep /home/virtfs | cut -d ' ' -f 2` ; do umount $i; done
cut -d " " -f1,4 access_log | sort | uniq -c | sort -rn | head
2013-09-20 21:29:32
User: zlemini
Functions: cut sort uniq
1

Analyze an Apache access log for the time period with most activity and display the hit count, requesting IP and timestamp. May help detect a brute-force DoS attack.
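Illustrative output (hypothetical addresses, counts and timestamps):

    312 10.0.0.15 [19/Sep/2013:22:14:01
    187 10.0.0.82 [19/Sep/2013:22:14:02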

cut -d ' ' -f 1 /var/log/apache2/access_logs | uniq -c | sort -n
2013-09-17 20:05:03
User: BorneBjoern
Functions: cut sort uniq
0

avoiding UUOC!

cut can handle files as well. No need for a cat.

grep "10/Sep/2013" access.log| cut -d[ -f2 | cut -d] -f1 | awk -F: '{print $2":"$3}' | sort -nk1 -nk2 | uniq -c | awk '{ if ($1 > 10) print $0}'
netstat -ntu | awk ' $5 ~ /^(::ffff:|[0-9|])/ { gsub("::ffff:","",$5); print $5}' | cut -d: -f1 | sort | uniq -c | sort -nr
2013-09-10 19:28:06
User: mrwulf
Functions: awk cut netstat sort uniq
1

Same as the rest, but handles IPv6 short IPs. Also sorts in the order you're probably looking for.

cat /var/log/apache2/access_logs | cut -d' ' -f1 | sort | uniq -c | sort -n
2013-09-07 23:57:31
User: while0pass
Functions: cat cut sort uniq
1

The first sort is necessary because uniq -c only collapses adjacent duplicate lines; without it, the same IP scattered through the log would be counted more than once.
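A quick demonstration of the difference:

printf '1.1.1.1\n2.2.2.2\n1.1.1.1\n' | uniq -c          # 1.1.1.1 appears twice, counted once each
printf '1.1.1.1\n2.2.2.2\n1.1.1.1\n' | sort | uniq -c   # 1.1.1.1 correctly counted as 2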

for I in $(awk -v LIMIT=500 -F: '($3>=LIMIT) && ($3!=65534)' /etc/passwd | cut -f 1-1 -d ':' | xargs); do usermod -g YOURGROUP $I ; done
for i in `ip addr show dev eth1 | grep inet | awk '{print $2}' | cut -d/ -f1`; do echo -n $i; echo -en '\t'; host $i | awk '{print $5}'; done
cat /var/log/apache2/access_logs | cut -d ' ' -f 1 | uniq -c | sort -n
2013-09-02 13:04:47
User: basvdburg
Functions: cat cut sort uniq
1

Shows how many requests each IP made to the Apache webserver. Note that uniq -c here runs without a preceding sort, so only adjacent duplicate IPs are grouped (see the comment on the command above).

diff <(ssh-keygen -y -f ~/.ssh/id_rsa) <(cut -d' ' -f1,2 ~/.ssh/id_rsa.pub)
mogrify -format gif -auto-orient -thumbnail 250x90 '*.JPG'&&(echo "<ul>";for i in *.gif;do basename=$(echo $i|rev|cut -d. -f2-|rev);echo "<li style='display:inline-block'><a href='$basename.JPG'><img src='$basename.gif'></a>";done;echo "</ul>")>list.html
2013-08-25 20:45:49
User: ysangkok
Functions: cut echo
1

The input images are assumed to have the "JPG" extension. Mogrify will overwrite any gif images with the same name! Will not work with names containing spaces.
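A sketch of the same list-building loop using bash parameter expansion instead of the rev|cut|rev round-trip (identical result for space-free names):

for i in *.gif; do basename=${i%.*}; echo "<li style='display:inline-block'><a href='$basename.JPG'><img src='$basename.gif'></a>"; done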