
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - All commands - 11,605 results
for gz in `find . -type f -name '*.gz' -print`; do f=`basename $gz .gz` && d=`dirname $gz` && echo -n `ls -s $gz` "... " && gunzip -c $gz | bzip2 - -c > $d/$f.bz2 && rm -f $gz && echo `ls -s $d/$f.bz2`; done
2014-03-13 08:36:24
User: pdwalker
Functions: bzip2 echo gunzip rm
0

- recompresses all gz files to bz2 files from this point and below in the directory tree

- output shows the size of the original file, and the size of the new file. Useful.

- conceptually easier to understand than playing tricks with awk and sed.

- don't like the output? Use the following line:

for gz in `find . -type f -name '*.gz' -print`; do f=`basename $gz .gz` && d=`dirname $gz` && gunzip -c $gz | bzip2 - -c > $d/$f.bz2 && rm -f $gz ; done
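
The backtick-driven loop above breaks on filenames containing spaces. A minimal space-safe sketch of the same recompression, assuming GNU find and bash (no size reporting):

find . -type f -name '*.gz' -print0 | while IFS= read -r -d '' gz; do gunzip -c "$gz" | bzip2 -c > "${gz%.gz}.bz2" && rm -f "$gz"; done
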
tar -cJf myarchive.tar.xz /path/to/archive/
2014-03-13 03:34:18
User: Sepero
Functions: tar
1

Compress files or a directory into xz format. xz usually compresses better and faster than bzip2, and unlike the 7zip format, a tar.xz archive preserves file permissions and other metadata.
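
To unpack such an archive later, or to squeeze harder at the cost of time, the standard tar and xz options apply (the XZ_OPT environment variable is read by xz itself):

tar -xJf myarchive.tar.xz

XZ_OPT=-9 tar -cJf myarchive.tar.xz /path/to/archive/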

ls | xargs WHATEVER_COMMAND
2014-03-12 18:00:21
User: pdxdoughnut
Functions: ls xargs
-4

xargs will automatically determine how many args are too many and only pass a reasonable number of them at a time. In the example, 500,002 file names were split across 26 invocations of the command "echo". A null-delimited, space-safe variant is sketched below.

ls | grep ".txt$" | xargs -i WHATEVER_COMMAND {}
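
Parsing ls output breaks on unusual filenames; a null-delimited sketch of the same batching idea, assuming GNU find and xargs (WHATEVER_COMMAND remains a placeholder):

find . -maxdepth 1 -type f -name '*.txt' -print0 | xargs -0 WHATEVER_COMMAND
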
/usr/bin/lynx -dump -width 500 http://127.0.0.1/whm-server-status | grep GET | awk '{print $12 $14}' | sort | uniq -c | sort -rn | head
2014-03-12 13:24:40
User: copocaneta
Functions: awk grep sort uniq
0

List the busiest scripts/files running on a cPanel server, with the domain showing (column $12).

netstat -tn 2>/dev/null | grep ':80 ' | awk '{print $5}' |sed -e 's/::ffff://' | cut -f1 -d: | sort | uniq -c | sort -rn | head
2014-03-12 12:43:07
User: copocaneta
Functions: awk cut grep netstat sed sort uniq
2

IP addresses connected to port 80, sorted by the number of connections from each.
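
On distributions where netstat is deprecated, a roughly equivalent sketch using ss (the field layout can vary between ss versions, so treat the column number as an assumption):

ss -tn state established '( sport = :80 )' | awk 'NR>1 {sub(/:[0-9]+$/, "", $4); print $4}' | sed 's/::ffff://' | sort | uniq -c | sort -rn | head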

/usr/bin/lynx -dump -width 500 http://127.0.0.1/whm-server-status | awk 'BEGIN { FS = " " } ; { print $12 }' | sed '/^$/d' | sort | uniq -c | sort -n
/usr/bin/lynx -dump -width 500 http://127.0.0.1/whm-server-status | grep GET | awk '{print $12}' | sort | uniq -c | sort -rn | head
2014-03-12 12:31:34
User: copocaneta
Functions: awk grep sort uniq
0

Easiest way to obtain the busiest website list (sorted by the number of processes running).

find . -name "*.txt" -exec WHATEVER_COMMAND {} \;
num_errs=`grep ERROR /var/log/syslog | tee >(cat >&2) | wc -l`
2014-03-12 00:04:24
Functions: cat tee wc
0

Many circumstances call for capturing a summary result in a variable while still printing the original pipe's output. Inserting "tee >(cat >&2)" lets the command output print (via stderr) while the same output flows on to be captured in the variable.
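
A hypothetical sketch of the same pattern: the grep hits are still shown on stderr while their count lands in the variable (the log path is illustrative; the >( ) process substitution requires bash):

num_fails=`grep 'Failed password' /var/log/auth.log | tee >(cat >&2) | wc -l` && echo "failures: $num_fails"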

lsof|gawk '$4~/txt/{next};/REG.*\(deleted\)$/{printf ">/proc/%s/fd/%d\n", $2,$4}'
2014-03-11 10:40:32
User: wejn
Functions: gawk
Tags: awk lsof gawk
1

While the posted solution works, I'm a bit uneasy about the "%d" part. This would be the hyper-correct approach:

lsof|gawk '$4~/txt/{next};/REG.*\(deleted\)$/{sub(/.$/,"",$4);printf ">/proc/%s/fd/%s\n", $2,$4}'

Oh, and you gotta pipe the result to sh if you want it to actually trim the files. ;)

Btw, this approach also removes false negatives (OP's command skips any deleted files with "txt" in their name).
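
Putting the two remarks together, the end-to-end invocation is the corrected command piped to sh (review the generated redirections first by running it without the final | sh):

lsof|gawk '$4~/txt/{next};/REG.*\(deleted\)$/{sub(/.$/,"",$4);printf ">/proc/%s/fd/%s\n", $2,$4}' | sh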

lsof | grep -i deleted | grep REG | grep -v txt | ruby -r 'pp' -e 'STDIN.each do |v| a = v.split(/ +/); puts `:> /proc/#{a[1]}/fd/#{a[3].chop}`; end'
2014-03-11 06:02:09
User: jim80net
Functions: grep
0

Be careful, first run:

lsof | grep -i deleted | grep REG | grep -v txt

Then, give it the boot!

killall conky
find . \( -iname "*.doc" -o -iname "*.docx" \) -type f -exec ls -l --full-time {} +|sort -k 6,7
ffmpeg -i $video -c:v prores -profile:v 2 -c:a copy ${video}.mov
cat skype_log | sed -s 's/\(\[.*\]\) \(.*\): \(.*\)/<\2> \3/'
read -p "Please enter the 4chan url: "|egrep '//i.4cdn.org/[a-z0-9]+/src/([0-9]*).(jpg|png|gif)' - -o|nl -s https:|cut -c7-|uniq|wget -nc -i - --random-wait
trash-put junkfolder
2014-03-09 00:24:09
User: Sepero
Tags: trash
1

apt-get install trash-cli

Command-line program that lets you put folders or files in the standard KDE/Unity desktop trash.
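
The same package ships companion tools for the rest of the round trip:

trash-list # show what is currently in the trash

trash-restore # interactively restore a trashed item

trash-empty # permanently delete everything in the trash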

inotifywait -mr -e CREATE $HOME/bin/ | while read i; do chmod +x $(echo "$i" | sed 's/ \S* //'); done
/opt/homebrew-cask/Caskroom/vlc/2.1.0/VLC.app/Contents/MacOS/VLC --sout-avcodec-strict=-2 -I dummy $video :sout="#transcode{vcodec=h264,vb=1024,acodec=mpga,ab=256,scale=1,channels=2,audio-sync}:std{access=file,mux=mp4,dst=${video}.m4v}" vlc://quit
2014-03-08 13:53:13
User: brainstorm
0

Transcodes mpg2 files, which cannot be read by iMovie, into m4v files.

sudo lshw -html > /tmp/hardware.html && xdg-open /tmp/hardware.html
2014-03-08 10:40:21
User: Sadi
Functions: sudo
0

Running this command as root may give more complete results; writing to a temporary file and immediately opening it makes the report easy to view.
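
For a quick look without generating HTML, lshw also has a condensed plain-text mode:

sudo lshw -short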

grep -r --include=*.php "something" /foo/bar
2014-03-07 13:26:12
User: avpod2
Functions: grep
0

grep can actually do a recursive search filtered by file extension.
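
The --include flag can be given more than once to cover several extensions; a small sketch (pattern and path are illustrative):

grep -rn --include='*.php' --include='*.inc' "something" /foo/bar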

ffmpeg -re -i localFile.mp4 -c copy -f flv rtmp://server/live/streamName
2014-03-07 05:52:11
User: unni9946
0

This command streams a video file as live to a streaming server such as Wowza or Red5.
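
If the source codecs are not already FLV-compatible, a hedged re-encoding variant of the same stream (bitrates are illustrative):

ffmpeg -re -i localFile.mp4 -c:v libx264 -b:v 1M -c:a aac -b:a 128k -f flv rtmp://server/live/streamName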

sar -n DEV -f /var/log/sa/sa05 |awk '{if ($3 == "IFACE" || $3 == "eth0" || $2 == "eth0") {print}}'
2014-03-06 21:32:54
Functions: awk
0

Choose the /var/log/sa/saXX log based on what day you want to view. You can use ifconfig to find the name of the interface.

You can use the -s and -e flags to restrict the time period, e.g. -s 12:00:00 -e 14:00:00 (see the sketch below).
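
Combining the two, a concrete sketch restricted to a two-hour window on eth0:

sar -n DEV -f /var/log/sa/sa05 -s 12:00:00 -e 14:00:00 | awk '{if ($3 == "IFACE" || $3 == "eth0" || $2 == "eth0") {print}}'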

dsquery group -samid "group_name" | dsmod group "cn=group_name,dc=example,dc=com" -addmbr