All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Share Your Commands


Check These Out

Write comments to your history.
A null operation with the name 'comment', allowing comments to be written to HISTFILE. Prepending '#' to a command will *not* write the command to the history file, although it will be available for the current session; thus '#' is not useful for keeping track of comments past the current session.
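
The entry's command isn't reproduced here; a minimal sketch of such a named no-op, which lets a whole annotated line land in your history, would be:

$ comment() { :; }    # define once, e.g. in ~/.bashrc; any arguments are simply discarded
$ comment rebuilt the search index before running the migration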

How to back up a hard disk on a regular schedule?
'data' is the directory to back up; 'backup' is the directory that stores the snapshots. Back up files on a regular basis using hard links - very efficient and quick, and the backup data is directly browsable. Same as explained here: http://blog.interlinked.org/tutorials/rsync_time_machine.html, in one line. If you check the size of your backups with du, the first backup accounts for all the space, and each later backup accounts only for the files that changed.
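
The entry's one-liner isn't reproduced here; a minimal sketch in the spirit of the linked tutorial, assuming the 'data' and 'backup' directory names above and a dated snapshot layout, might be:

$ SNAP=backup/$(date +%Y-%m-%dT%H:%M:%S)                    # name the new snapshot after the current time
$ rsync -a --delete --link-dest=../current data/ "$SNAP"    # unchanged files become hard links into the previous snapshot
$ ln -nsf "${SNAP##*/}" backup/current                      # repoint 'current' at the snapshot just taken

On the very first run backup/current does not exist yet; rsync only warns about the missing --link-dest directory and copies everything in full.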

Show one-line summaries of all DEB packages installed on Ubuntu based on a pattern search
I sometimes want to know what packages are installed on my Ubuntu system. I still haven't figured out how to use aptitude effectively, so this is the next best thing. This allows finding packages by name. The grep '^ii' limits the display to installed packages only; if it is omitted, the listing also includes packages that are not currently installed (for example, removed but not purged ones).
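
The command itself isn't quoted above; consistent with the description it is along these lines (the 'vim' pattern is only an illustration):

$ dpkg -l '*vim*' | grep '^ii'    # 'ii' marks packages that are desired and currently installed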

Advanced Python tracing
Simultaneously trace Python statement execution and the system calls made along the way.
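
The original one-liner isn't reproduced here; one way to capture both views at once, combining the standard-library trace module with strace (script.py is a placeholder), would be:

$ strace -f -tt -o syscalls.log python -m trace --trace script.py > statements.log

Each executed Python line goes to statements.log, while the timestamped syscall trace of the interpreter and any children (via -f) goes to syscalls.log.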

List all files in a directory, sorted in reverse order by modification time, using file descriptors.
It's both silly and infinitely useful. Especially handy in log directories, where you want to know which file is being updated while troubleshooting.
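
The entry's exact command (and its file-descriptor twist) isn't shown here; the plain way to get this ordering is:

$ ls -ltr    # -t sorts by modification time, -r reverses it, so the most recently touched file is listed last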

Convert seconds to [DD:][HH:]MM:SS
Converts any number of seconds into days, hours, minutes and seconds.

sec2dhms() {
    declare -i SS="$1"
    D=$(( SS / 86400 ))
    H=$(( SS % 86400 / 3600 ))
    M=$(( SS % 3600 / 60 ))
    S=$(( SS % 60 ))
    [ "$D" -gt 0 ] && echo -n "${D}:"
    [ "$H" -gt 0 ] && printf "%02g:" "$H"
    printf "%02g:%02g\n" "$M" "$S"
}
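
For example, once the function above has been sourced into the current shell:

$ sec2dhms 176468
2:01:01:08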

Print every Nth line
Sometimes commands give you too much feedback; perhaps 1/100th might be enough. If so, every() is for you:

$ my_verbose_command | every 100

will print every 100th line of output - specifically lines 100, 200, 300, and so on. If you use a negative argument it prints the *first* line of each block instead:

$ my_verbose_command | every -100

prints lines 1, 101, 201, 301, and so on. The function wraps up this useful sed snippet:

$ ... | sed -n '0~100p'

Here -n means don't print anything by default, and '0~100p' means: starting at line 0, print every hundredth line (~100). There's also some bash magic to test whether the number is negative: ${N:0:1} is character 0, length 1, of variable N. If it *is* negative, strip off the first character - ${N:1} is character 1 onwards (the second actual character).
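
The function definition itself isn't reproduced above; a minimal sketch consistent with the description (GNU sed is assumed for the first~step address form) could be:

every() {
    N=$1
    if [ "${N:0:1}" = "-" ]; then
        sed -n "1~${N:1}p"    # negative argument: first line of each block (1, N+1, 2N+1, ...)
    else
        sed -n "0~${N}p"      # positive argument: every Nth line (N, 2N, 3N, ...)
    fi
}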

Get AWS temporary credentials ready to export based on an MFA virtual appliance
You might want to secure your AWS operations by requiring an MFA token. But then, to use the API or the CLI tools, you need to pass credentials generated with that MFA token. This command asks you for the MFA code and retrieves the temporary credentials using the AWS CLI. To print the exports, you can use: `awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'` You must adapt the command line to include:
* $MFA_ID, the ARN of the virtual MFA device or the serial number of the physical one
* the TTL (duration) of the credentials
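
The retrieval step isn't shown above; a sketch built on aws sts get-session-token (the prompt, the 12-hour duration and the $MFA_ID variable are illustrative and must be adapted) might be:

$ read -p "MFA code: " MFA_CODE
$ aws sts get-session-token --serial-number "$MFA_ID" --token-code "$MFA_CODE" \
      --duration-seconds 43200 --output text \
      --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' \
  | awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'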

Beep siren
Plays beeps indefinitely, with the pitch rising and falling sinusoidally. Ideal as an audible alarm for an event.
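
The original one-liner isn't reproduced here; an approximation that sweeps the pitch along a sine wave, assuming the beep utility is installed and the PC speaker is usable, could be:

$ while true; do for i in $(seq 0 19); do beep -l 30 -f $(awk -v i=$i 'BEGIN { printf "%d", 600 + 200 * sin(i / 20 * 6.283) }'); done; done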

Transfer large files/directories with no overhead over the network
This invokes tar on the remote machine, pipes the resulting tar stream over the network via ssh, and saves it on the local machine. It is useful for making a one-off backup of a directory tree with zero storage overhead on the source. Variations on this include compressing on the source (using 'tar czfp -' in place of 'tar cfp -') or compressing at the destination:

$ ssh user@host "cd dir; tar cfp - *" | gzip - > file.tar.gz
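
The base command isn't quoted in the description; from the wording it is essentially:

$ ssh user@host "cd dir; tar cfp - *" > file.tar    # tar runs remotely, the archive travels over ssh and lands locally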


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).
