All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


Check These Out

Alert on high ping to know if it's really laggy while playing
Online games have pretty good lag compensation nowadays. Sometimes, though, you really want some warning about your latency, e.g. while playing Diablo III in Hardcore mode, so you know when to carefully quit the game because your flatmate started downloading all his torrents at once. This is done on Darwin; on Linux/*nix you would need another suitable command instead of `say` to speak your latency. I used fping because it makes it a little easier to extract the latency value. Something similar with the regular ping command could look like this:
$ while :; do a=$(ping -c1 google.com | grep -o 'time.*' | cut -d\= -f2 | cut -d\ -f1 | cut -b1-4); [[ $a > 40 ]] && say "ping is $a"; sleep 3; done
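On Linux the same idea can use a desktop notification instead of `say`. A minimal sketch (assuming GNU ping, grep with PCRE support, and notify-send from libnotify; the 40 ms threshold is arbitrary):

while :; do
    # take the "time=" value (in ms) from a single ping
    ms=$(ping -c1 google.com | grep -oP 'time=\K[0-9.]+')
    # compare numerically; note the [[ $a > 40 ]] test above is a lexical comparison
    if [ -n "$ms" ] && [ "${ms%.*}" -gt 40 ]; then
        notify-send "High ping" "latency is ${ms} ms"
    fi
    sleep 3
done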

list files recursively by size
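No description accompanies this one; one way to list files recursively by size, smallest first, is a sketch like this (assuming GNU find, sizes in bytes; not necessarily the command submitted on the site):

$ find . -type f -printf '%s %p\n' | sort -n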

Get AWS temporary credentials ready to export based on a MFA virtual appliance
You might want to secure your AWS operations by requiring an MFA token. But then, to use the API or other tools, you need to pass credentials generated with that MFA token. This command asks you for the MFA code and retrieves these credentials using the AWS CLI. To print the exports, you can use: `awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'` You must adapt the command line to include:
* $MFA_ID, the ARN of the virtual MFA device or the serial number of the physical one
* the TTL for the credentials
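The command itself is not reproduced here; as a sketch (assuming the AWS CLI is configured, $MFA_ID is already set, and a one-hour TTL), fetching the temporary credentials and piping them into the awk above could look like:

read -p "MFA code: " MFA_CODE
# request temporary credentials tied to the MFA device, print only the three values
aws sts get-session-token \
    --serial-number "$MFA_ID" \
    --token-code "$MFA_CODE" \
    --duration-seconds 3600 \
    --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' \
    --output text |
awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'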

Limit memory usage per script/program
When I'm testing scripts or programs, they sometimes end up using more memory than anticipated. In that case the computer nearly halts due to swap usage, and sometimes I have to press Magic SysRq+REISUB to reboot. So I was looking for a way to limit memory usage per script and found out that ulimit can limit memory. If you run it this way:
$ ulimit -v 1000000
$ scriptname
then the new memory limit will be valid for that shell. I think changing the limit within a subshell is much more flexible, and it won't interfere with your current shell's ulimit settings.
note: -v 1000000 corresponds to approximately 1GB of RAM (the value is in kilobytes)
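The subshell approach mentioned above, as a minimal sketch (bash assumed; ./myscript is a placeholder):

# only the subshell is capped at ~1GB of virtual memory;
# the parent shell's limits stay untouched
( ulimit -v 1000000; ./myscript )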

notify yourself when a long-running command which has ALREADY STARTED is finished
If you want to be notified when a long-running command finishes, but you have already started it:
CTRL+Z
$ fg; echo "finished" | sendmail me@example.com
I use a script to post a tweet, which sends me a text message:
$ fg; echo "finished" | tweet
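The same Ctrl+Z / fg pattern works with any notifier. A desktop-only sketch (assuming notify-send from libnotify is available):

$ fg; notify-send "Long-running command finished"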

Convert a date to timestamp
Simple way to get a timestamp from a date
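The command is not shown here; with GNU date, a sketch would be (the date string is just an example):

$ date -d "2009-02-13 23:31:30" +%s

On BSD/macOS, date parses an input date with -j and -f instead of -d.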

disable caps lock
A quick one-line way to disable caps lock while running X.
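The submitted one-liner is not reproduced here; one common way to do it under X (a sketch using setxkbmap, which may differ from the original command) is:

$ setxkbmap -option caps:none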

wget with resume
I couldn't find this on the site and it's a useful switch. Great for large files.
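The switch is presumably wget's -c/--continue option, which resumes a partial download; a sketch with a placeholder URL:

$ wget -c http://example.com/large-file.iso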

Random unsigned integer
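No description accompanies this one; one way to read a random 32-bit unsigned integer from the kernel (a sketch, not necessarily the command submitted) is:

$ od -An -N4 -t u4 /dev/urandom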


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

Subscribe to the feed for: