
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).


News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!
Terminal - All commands - 12,331 results
sudo grub-install --recheck /dev/sda1
grep -lir "sometext" * > sometext_found_in.log
2009-08-31 23:48:45
User: shaiss
Functions: grep
Tags: find text
1

I find this format easier to read if you're going through lots of files. This way you can open the output file in any editor and easily review it.

cut -f N- file.dat
strace <name of the program>
awk '{print substr($0, index($0,$N))}'
2009-08-31 19:47:10
User: mstoecker
Functions: awk
0

This command prints each line of the given input from its Nth field through to the end of the line.
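A quick sketch with N=2 on made-up input. Note that this relies on index() finding the first occurrence of the field's text, so it can misfire if the same text appears earlier in the line:

```shell
# Hypothetical example: print everything from the 2nd field onward.
echo "one two three four" | awk '{print substr($0, index($0,$2))}'
# -> two three four
```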

touch /tmp/file ; $EXECUTECOMMAND ; find /path -newer /tmp/file
2009-08-31 18:47:19
User: matthewdavis
Functions: find touch
22

This has helped me numerous times when trying to find log files or tmp files that get created after a command executes, and it is really eye-opening as to how active a given process is. Play around with -anewer, -cnewer & -newerXY.
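A minimal sketch of the marker-file technique, using a throwaway directory instead of /tmp/file (all paths and file names here are examples):

```shell
# Sketch of the marker-file trick in a temporary directory.
workdir=$(mktemp -d)
touch "$workdir/marker"           # reference timestamp
sleep 1                           # give coarse-mtime filesystems a beat
echo data > "$workdir/new.log"    # stands in for whatever the command creates
find "$workdir" -newer "$workdir/marker" -type f   # prints only new.log
rm -rf "$workdir"
```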

wget -nv http://en.wikipedia.org/wiki/Linux -O- | egrep -o "http://[^[:space:]]*.jpg" | xargs -P 10 -r -n 1 wget -nv
2009-08-31 18:37:33
User: syssyphus
Functions: egrep wget xargs
10

xargs can be used in this manner to download multiple files at a time; in this case it runs up to 10 wget processes at once and starts a new one whenever the count falls below 10.
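The parallel-xargs behaviour can be seen with something lighter than wget; a small sketch (the sort only makes the nondeterministic completion order stable for display):

```shell
# Run up to 3 echo processes at a time, one argument each.
printf '%s\n' a b c d e | xargs -P 3 -n 1 echo | sort
# -> a b c d e, one per line
```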

dsh -M -c -f servers -- "command HERE"
2009-08-31 12:08:38
User: foob4r
Tags: ssh multiple
4

dsh - Distributed shell, or dancer's shell ;-)

You can put your servers into /etc/dsh/machines.list; then you don't have to separate them by commas. You can also group them in different files and run commands against only those groups:

dsh -M -c -a -- "apt-get update"
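As an illustration of the grouping the description mentions, a hypothetical layout (the hostnames and group name are made up; the paths match dsh's defaults on Debian):

```shell
# /etc/dsh/machines.list - default host list, one hostname per line (used by -a)
#   web1.example.com
#   db1.example.com
#
# /etc/dsh/group/web - a hypothetical named group with the same format,
# which you would then run with:  dsh -M -c -g web -- "uptime"
```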

scutil --dns
B <<< $(A)
yum --nogpgcheck install "examplePackage"
2009-08-30 18:18:30
User: iDen
Functions: install
-1

Same as:

1. rpm -ivh package.rpm

2. yum localinstall package.rpm

3. Edit /etc/yum.conf or the repository's .repo file and change the value of gpgcheck from 1 to 0 (dangerous!)
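For option 3, a hypothetical .repo fragment (the repo name and URL are made up; gpgcheck=0 disables signature verification for everything from that repo):

```shell
# /etc/yum.repos.d/example.repo
# [example]
# name=Example Repo
# baseurl=http://repo.example.com/el/os/
# gpgcheck=0
```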

awk '{print $1}' /var/log/httpd/access_log | sort | uniq -c | sort -rnk1 | head -n 10
httpd -S
*/15 * * * * /path/to/command
2009-08-30 14:53:08
User: sharfah
-9

Instead of using:

0,15,30,45 * * * * /path/to/command
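The */N step syntax generalizes beyond every-15-minutes; a few illustrative crontab lines (the schedules and path are examples):

```shell
# */N steps through a field's range; these two lines are equivalent:
# */15 * * * * /path/to/command
# 0,15,30,45 * * * * /path/to/command
#
# Steps also combine with ranges, e.g. every 5 minutes, 9am-5pm, Mon-Fri:
# */5 9-17 * * 1-5 /path/to/command
```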

xev
2009-08-30 14:41:16
User: linuxswords
3

For mouse events, move the mouse over the window and click/move it, etc.

Useful for getting mouse-button or key codes, and also useful for checking whether X receives those mouse events.

wget --reject html,htm --accept pdf,zip -rl1 url
2009-08-30 14:05:09
User: linuxswords
Functions: wget
16

If the site uses https, use:

wget --reject html,htm --accept pdf,zip -rl1 --no-check-certificate https-url
gpg --search-keys
mpg123 -s input.mp3 | faac -b 80 -P -X -w -o output.m4b -
2009-08-30 13:15:37
User: linuxswords
Functions: mpg123
1

To convert a whole directory, loop over the mp3 files (a glob is safer than parsing ls output):

for i in *.mp3; do mpg123 -s "$i" | faac -b 80 -P -X -w -o "${i%mp3}m4b" -; done
rkhunter --check
2009-08-30 12:53:33
User: unixbhaskar
Tags: Security shell
-2

rkhunter (Rootkit Hunter) is a Unix-based tool that scans for rootkits, backdoors and possible local exploits. rkhunter is a shell script which carries out various checks on the local system to try and detect known rootkits and malware. It also performs checks to see if commands have been modified, if the system startup files have been modified, and various checks on the network interfaces, including checks for listening applications.

chkrootkit -x | less
curl -s http://tinyurl.com/create.php?url=http://<website.url>/ | sed -n 's/.*\(http:\/\/tinyurl.com\/[a-z0-9][a-z0-9]*\).*/\1/p' | uniq
xmms2 pause && echo "xmms2 play" | at now +5min
2009-08-30 04:35:10
User: Vrekk
Functions: at echo
2

You can also background the pause with a single ampersand: xmms2 pause & echo "xmms2 play" | at now +5min

sw_vers
ZIP=48104; curl http://thefuckingweather.com/?zipcode=$ZIP 2>/dev/null|grep -A1 'div class="large"'|tr '\n' ' '|sed 's/^.*"large" >\(..\)/\1/;s/&d.* <br \/>/ - /;s/<br \/>//;s/<\/div.*$//'
2009-08-29 19:33:35
User: sleepynate
Functions: grep sed tr
1

Grab the weather, with a little expletive fun. Replace the 48104 with a US zip code, or with the name of your city (such as ZIP="oslo"), unless you want to know what the weather is like for me (and that's fine too).

httpd2 -V