
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they will appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - All commands - 12,075 results
crontest () { date +'%M %k %d %m *' |awk 'BEGIN {ORS="\t"} {print $1+2,$2,$3,$4,$5,$6}'; echo $1;}
2015-03-12 19:56:56
User: CoolHand
Functions: awk date echo
0

Usage: crontest "/path/to/bin"

This version of the function echoes back the entire crontab line so it can be copied and pasted into crontab. With a bit more work it could append itself to crontab automatically. Tested with bash and zsh on Linux, FreeBSD and AIX.
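As a rough sketch, assuming the function is already defined in your current shell, its output could be appended to your own crontab directly:

( crontab -l 2>/dev/null; crontest "/path/to/bin" ) | crontab -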

crontab -l -u USER | grep -v 'YOUR JOB COMMAND or PATTERN' | crontab -u USER -
2015-03-11 13:10:47
User: Koobiac
Functions: crontab grep
1

The "-u USER" is optional if root user is used

sudo iptables -A INPUT -m limit --limit 2000/sec -j ACCEPT && sudo iptables -A INPUT -j DROP
2015-03-09 20:16:17
User: qdrizh
Functions: iptables sudo
Tags: iptables
1

VPS hosts may suspect a DoS attack if the packets-per-second rate is too high. This limits packets at the interface level. Run "sudo apt-get install iptables-persistent" to make the rules persistent, or, if it is already installed, reconfigure with "sudo dpkg-reconfigure iptables-persistent".
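To confirm the two rules are in place and see how many packets each has matched so far, something like this should do:

sudo iptables -L INPUT -v -n --line-numbers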

echo 'export PROMPT_COMMAND="history -a; history -c; history -r; $PROMPT_COMMAND"' >> .bashrc
sqlite3 ~/.mozilla/firefox/*.[dD]efault/places.sqlite "SELECT strftime('%d.%m.%Y %H:%M:%S', dateAdded/1000000, 'unixepoch', 'localtime'),url FROM moz_places, moz_bookmarks WHERE moz_places.id = moz_bookmarks.fk ORDER BY dateAdded;"
2015-03-08 19:26:16
User: return13
2

Extracts your bookmarks out of places.sqlite in the format:

dateAdded|url
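If you would rather have a file you can open in a spreadsheet, sqlite3 can emit CSV directly (the output filename is just an example):

sqlite3 -csv ~/.mozilla/firefox/*.[dD]efault/places.sqlite "SELECT strftime('%d.%m.%Y %H:%M:%S', dateAdded/1000000, 'unixepoch', 'localtime'),url FROM moz_places, moz_bookmarks WHERE moz_places.id = moz_bookmarks.fk ORDER BY dateAdded;" > bookmarks.csv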

groups user1 user2|cut -d: -f2|xargs -n1|sort|uniq -d
2015-03-04 19:12:27
User: swemarx
Functions: cut groups uniq xargs
2

Updated according to flatcap's suggestion, thanks!

grep -xFf <(groups user1|cut -f3- -d\ |sed 's/ /\n/g') <(groups user2|cut -f3- -d\ |sed 's/ /\n/g')
install -m 0400 foo bar/
2015-03-02 13:20:38
User: op4
Functions: install
Tags: backup mv cp
3

Prior to working on or modifying a file, use 'install -m', which can copy files, create directories, and set their permissions all in one step. Useful when you are working in the public_html folder and need to keep the copied file hidden.
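A sketch of the same idea using GNU install's -D flag, which also creates any missing parent directories (the paths are made up):

install -m 0400 -D public_html/index.php backups/2015-03-02/index.php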

for f in input/*; do BN=$(basename "$f"); ffmpeg -i "$f" -vn "temp/$BN.flac"...
2015-03-01 02:48:19
Functions: basename
0

Full command:

for f in input/*; do BN=$(basename "$f"); ffmpeg -i "$f" -vn "temp/$BN.flac"; sox "temp/$BN.flac" "temp/$BN-cleaned.flac" noisered profile 0.3; ffmpeg -i "$f" -vcodec copy -an "temp/$BN-na.mp4"; ffmpeg -i "temp/$BN-na.mp4" -i "temp/$BN-cleaned.flac" "output/$BN"; done

This was over the 255 character limit and I didn't feel like deliberately obfuscating it.

1. Create 'input', 'output' and 'temp' directories.

2. Place the files that you want to remove the hiss/static/general noise from in the input directory.

3. Generate a noise reduction profile with sox using 'sox an_input_file.mp4 -n trim x y noiseprof profile', where x and y select a stretch of audio (in seconds) that contains only the noise you want to eliminate; see the example after this list.

4. Run the command.
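For example, step 3 might look like this if the first three seconds of a clip contain only the background noise (filename and timing values are hypothetical):

sox input/an_input_file.mp4 -n trim 0 3 noiseprof profile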

for i in /usr/share/cowsay/cows/*.cow; do cowsay -f $i "$i"; done
2015-02-26 20:56:45
User: wincus
4

There are lots of different cow templates to use; this loop displays every one of them.
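If you only want the template names rather than a rendering of each one, cowsay can list them itself:

cowsay -l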

truncate --size 1G bigfile.txt
2015-02-26 11:56:27
User: ynedelchev
2

If you want to quickly create a very big file for testing purposes and do not care about its content, this command creates a file of arbitrary size in well under a second. The content of the file will read as all zero bytes.

The trick is that the content is never actually written to disk: on filesystems that support it, the file is created as a sparse file, so only its apparent size is recorded and blocks are allocated when they are first written to. That is what makes the creation nearly instantaneous.

Instead of '1G' as in the example, you can use other suffixes such as 200K for kilobytes (1024 bytes), 500M for megabytes (1024 * 1024 bytes), 20G for gigabytes (1024^3 bytes), 30T for terabytes (1024^4 bytes), and P for petabytes (1024^5 bytes).

Command tested under Linux.
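A quick way to see that the space is not actually allocated is to compare the file's apparent size with the blocks it uses on disk:

truncate --size 1G bigfile.txt; ls -lh bigfile.txt; du -h bigfile.txt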

xsel -bc
2015-02-26 01:11:03
User: benjabean1
1

Clears your clipboard if xsel is installed on your machine.

If your version of xsel doesn't accept the combined short options, you can also use

xsel --clear --clipboard
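To confirm the clipboard really is empty, printing the clipboard selection afterwards should produce no output:

xsel -b
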
awk '!NF || !seen[$0]++'
2015-02-25 17:03:13
User: Soubsoub
Functions: awk
1

Remove duplicate lines whilst keeping order and empty lines
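A minimal usage sketch (filenames are just examples):

awk '!NF || !seen[$0]++' notes.txt > notes.dedup.txt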

sqlite3 ~/.mozilla/firefox/*.[dD]efault/places.sqlite "SELECT strftime('%d.%m.%Y %H:%M:%S', visit_date/1000000, 'unixepoch', 'localtime'),url FROM moz_places, moz_historyvisits WHERE moz_places.id = moz_historyvisits.place_id ORDER BY visit_date;"
2015-02-24 21:51:14
User: return13
7

This is the way to get access to your Firefox history...
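If you only want the most recent visits, the same query can be reversed and limited, e.g. to the last 20 entries:

sqlite3 ~/.mozilla/firefox/*.[dD]efault/places.sqlite "SELECT strftime('%d.%m.%Y %H:%M:%S', visit_date/1000000, 'unixepoch', 'localtime'),url FROM moz_places, moz_historyvisits WHERE moz_places.id = moz_historyvisits.place_id ORDER BY visit_date DESC LIMIT 20;"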

lame -v 2 -b 192 --ti /path/to/file.jpg audio.mp3 new-audio.mp3
sed -n '/url/s#^.*url=\(.*://.*\)#\1#p' ~/.mozilla/firefox/*.[dD]efault/SDBackups/*.speeddial | sort | uniq
2015-02-17 20:56:28
User: return13
Functions: sed sort
0

For all users of https://addons.mozilla.org/de/firefox/addon/speed-dial/

jshon -e addons -a -e defaultLocale -e name -u < ~/.mozilla/firefox/*.[dD]efault/extensions.json
wget -q -O - http://www.example.com/automation/remotescript.sh | bash /dev/stdin parameter1 parameter2
2015-02-16 16:55:09
User: paulera
Functions: bash wget
0

Use this command to execute the contents of http://www.example.com/automation/remotescript.sh in the local environment. The parameters are optional.

Alternatives using curl, w3m, or lynx:

curl -s http://www.example.com/automation/remotescript.sh | bash /dev/stdin param1 param2

w3m -dump http://www.example.com/automation/remotescript.sh | bash /dev/stdin [param1] [param2]

lynx -source http://www.example.com/automation/remotescript.sh | bash /dev/stdin [param1] [param2]
cygstart --hide -wa runas powershell -WindowStyle Hidden -Command '"&{wevtutil el | foreach{wevtutil cl $_}}"'
2015-02-15 22:56:20
User: lowjax
2

Efficiently clears all Windows Event Log entries from within a Cygwin terminal. Uses "cygstart" to launch a hidden PowerShell session, passing a PowerShell command that loops through and clears every Windows Event Log. Very useful for troubleshooting and debugging. The command should, in theory, elevate your session if needed.

One liner is based on the PowerShell command:

wevtutil el | foreach { wevtutil cl $_ }
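If you only need to clear a single log rather than all of them, the underlying wevtutil call can be run on its own from the Cygwin prompt (the Application log here is just an example):

wevtutil cl Application
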
sc query state= all | awk '/SERVICE_NAME/{printf"%s:",$2;getline;gsub(/DISP.*:\ /,"");printf"%s\n",$0}' | column -ts\:
2015-02-15 22:35:10
User: lowjax
Functions: awk column
1

Outputs the service name and display name of Windows services using "sc query", pipes the output to "awk" for processing, then to "column" for formatting.

List All Services:

sc query state= all | awk '/SERVICE_NAME/{printf"%s:",$2;getline;gsub(/DISP.*:\ /,"");printf"%s\n",$0}' | column -ts\:

List Started Services:

sc query | awk '/SERVICE_NAME/{printf"%s:",$2;getline;gsub(/DISP.*:\ /,"");printf"%s\n",$0}' | column -ts\:

List Stopped Services:

sc query state= inactive| awk '/SERVICE_NAME/{printf"%s:",$2;getline;gsub(/DISP.*:\ /,"");printf"%s\n",$0}' | column -ts\:
lsof -i -n -P | grep -e "$(ps aux | grep node | grep -v grep | awk -F' ' '{print $2}' | xargs | awk -F' ' '{str = $1; for(i = 2; i < NF; i++) {str = str "\\|" $i} print str}')"
2015-02-14 23:24:00
User: hochmeister
Functions: grep
0

Use lsof, then grep for any PID matching a given process name such as "node".

ip -o -4 a s | awk -F'[ /]+' '$2!~/lo/{print $4}'
2015-02-13 11:19:31
User: paulera
Functions: awk
2

To show IPv6 addresses instead, use -6 instead of -4:

ip -o -6 a s | awk -F'[ /]+' '$2!~/lo/{print $4}'

To show only the IP of a specific interface, in case you get more than one result:

ip -o -4 a s eth0 | awk -F'[ /]+' '$2!~/lo/{print $4}'

ip -o -4 a s wlan0 | awk -F'[ /]+' '$2!~/lo/{print $4}'
command foo bar | sudo tee /etc/write-protected > /dev/null
git rev-list --all|tail -n1|xargs git show|grep -v diff|head -n1|cut -f1-3 -d' '