commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
You can sign-in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:
Sometimes top/htop don't give the fine-grained detail on memory usage you might need. Sum up the exact memory types you want.
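For instance, selected fields of /proc/meminfo can be summed with awk; the fields chosen here (Active, Inactive) are just an illustration, so substitute the ones you care about:

```shell
# Sum selected memory types from /proc/meminfo (values are in kB).
# The Active/Inactive pair is an example; any field names work.
awk '/^(Active|Inactive):/ { sum += $2 } END { print sum " kB" }' /proc/meminfo
```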
Here's the idea: Submit a one-liner that returns a value or string usable for monitoring something. The more interesting/important, the better.
Tag your one-liners with CLFUContest to enter. Whether you're participating or not, be sure to vote on the other submissions. The top 5 contest entries by vote count will receive a $10 Amazon gift certificate. On top of that, we'll select our 3 favorite entries to receive $25 Amazon gift certificates. The prizes might even overlap! Feel free to enter as many times as you like. Check out the URL above for the fine print.
centos list directories sorted by size
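The original command isn't shown here, but a common sketch with GNU du and sort (both standard on CentOS) would be:

```shell
# Show the size of each subdirectory of the current directory, largest first
du -h --max-depth=1 . | sort -hr
```

Swap `.` for any directory you want to inspect.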
-A INPUT -p udp -m udp --dport 10000:65535 -m mac --mac-source 3E:D7:88:A6:66:8E -j ACCEPT
-A INPUT -p udp -m udp --dport 5060 -m mac --mac-source 3E:D7:88:A6:66:8E -j ACCEPT
-A INPUT -p tcp --dport 22 -m mac --mac-source 3E:D7:88:A6:66:8E -j ACCEPT
The difference between the original version and this one is that this one actually works, rather than producing a wget error.
A more efficient way, with the order reversed to put the focus on the big ones.
I always forget this one and end up finding all kinds of complex solutions on Google. It also works great when piping data, e.g. 'cat data | process-data | tr -d "\"" > processed-data-without-quotes'
I had to compress it a bit to meet the 255-character limit. See the sample for the full command (274 characters).
Supports three optional arguments:
ffgif filename seek_time time_duration scale
ffgif foo 10 5 320 will seek 10 seconds in, convert for 5 seconds at a scale of 320.
By default it converts the whole video to a GIF at a scale of 320.
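The function body isn't shown, but a minimal sketch matching that description could look like the following; the .mp4 input extension and the exact ffmpeg flags are assumptions, not the original:

```shell
# Hypothetical ffgif sketch: ffgif filename [seek_time] [duration] [scale]
ffgif() {
  local name=$1 seek=${2:-0} dur=$3 scale=${4:-320}
  # -t is only added when a duration is given; scale=N:-1 keeps aspect ratio
  ffmpeg -ss "$seek" ${dur:+-t "$dur"} -i "$name.mp4" \
    -vf "scale=$scale:-1" "$name.gif"
}
```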
For all lines, sum the columns following the first one, and then print the first column plus the sum of all the other columns.
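In awk, that could look like this (shown with inline sample data so it runs as-is):

```shell
# For each line: print field 1 followed by the sum of the remaining fields
printf 'a 1 2 3\nb 4 5\n' |
  awk '{ sum = 0; for (i = 2; i <= NF; i++) sum += $i; print $1, sum }'
```

Here `a 1 2 3` yields `a 6`, and `b 4 5` yields `b 9`.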
Save all output to a log.
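The original command isn't shown; one common pattern is tee, which saves both stdout and stderr to a file while still displaying them (the command name and log file below are placeholders):

```shell
# Run a command and save everything it prints (stdout + stderr) to run.log
some_command() { echo "output"; echo "an error" >&2; }  # placeholder command
some_command 2>&1 | tee run.log
```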
This command will disable the guest user's logon; this user doesn't have a password to log in to the system.
For times when netcat isn't available.
It will print a "Connection refused" message if the port is closed.
(: </dev/tcp/127.0.0.1/80) &>/dev/null && echo "OPEN" || echo "CLOSED"
Take the header line from a comma-delimited CSV file and enumerate the fields.
First, sed replaces all commas with newlines.
Then sed quits (q) after the first line.
Finally, nl numbers all the lines.
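Put together it looks like this (GNU sed, for the \n in the replacement; a printf stands in for the CSV file):

```shell
# Split the header on commas, keep only the first line, number the fields
printf 'name,age,city\nalice,30,nyc\n' | sed 's/,/\n/g;q' | nl
```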
uses , as the field separator
deletes all spaces
loops over all input fields and prints their index and value
exits after the first line
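Those steps translate to something like this (again with a sample header inlined):

```shell
# Number the fields of the first CSV line, stripping spaces
printf 'name, age ,city\nalice,30,nyc\n' |
  awk -F, '{ gsub(/ /, ""); for (i = 1; i <= NF; i++) print i, $i; exit }'
```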
Useful for identifying the field number in big CSV files with a large number of fields. The index is the reference to use when processing with commands like 'cut' or 'awk'.
This utilizes the Requests and BeautifulSoup libraries in Python to retrieve a user page on commandlinefu, parse it (tolerating malformed HTML) and extract all the lines of the following format:
To print them, a list comprehension is used to iterate over the values, and join() is called on a newline character.
Convert RAW files (e.g. .CR2) to JPEGs, PNGs and whatnot.
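The original command isn't reproduced here; one hedged sketch uses ImageMagick's convert, assuming a RAW delegate such as dcraw or ufraw is installed (the raw2jpg name is made up for illustration):

```shell
# Hypothetical helper: convert every .CR2 in the current directory to JPEG.
# Assumes ImageMagick with a RAW delegate (e.g. dcraw or ufraw) is available.
raw2jpg() {
  local f
  for f in *.CR2; do
    [ -e "$f" ] || return 0          # no .CR2 files, nothing to do
    convert "$f" "${f%.CR2}.jpg"
  done
}
```

Change the extension and output format to taste, e.g. .NEF to .png.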