commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:
This command prints the location of a pattern: it lists every file that contains the variable "$foo", along with the line containing that pattern. Specify the pattern after "grep".
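The command itself isn't reproduced above; a minimal sketch of the idea with grep -rn (the directory and file names below are made up for illustration):

```shell
# -r: recurse into the directory, -n: show line numbers; the pattern is
# quoted and escaped so the shell doesn't expand $foo itself:
mkdir -p demo_grep
printf 'x=1\necho "$foo"\n' > demo_grep/a.sh
grep -rn '\$foo' demo_grep
# → demo_grep/a.sh:2:echo "$foo"
```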
All folders, human-readable, no subfolder detail, with a total. Even shorter.
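The command isn't shown here; a common spelling of this description is du with -s, -h and -c (a sketch, with made-up directory names):

```shell
# -s: summarize each argument (no subfolder detail), -h: human-readable
# sizes, -c: append a grand total; */ expands to directories only:
mkdir -p proj_du/a proj_du/b
cd proj_du
du -sch */
```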
If you add the bookmarklet to your browser's bookmarks with, say, the keyword 'cfu', you can then type 'cfu hello' in the location bar and the %s gets replaced with 'hello'.
The bookmarklet will convert the search text to base64 for use with the commandlinefu website and will take you there. Tested with Firefox.
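For illustration, the encoding step the bookmarklet performs can be reproduced with the base64 tool (commandlinefu search URLs carry the query base64-encoded):

```shell
# Encode the search text exactly as typed (no trailing newline):
printf '%s' hello | base64
# → aGVsbG8=
```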
Removes all carriage returns from a given file (or from standard input, if used in a pipe) and replaces them with a space (or whatever character follows %s).
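The command itself is not quoted above; a sketch that matches the description, using awk's printf (the space after %s is the replacement character):

```shell
# Print each input line followed by a space instead of its line break:
printf 'one\ntwo\nthree\n' | awk '{printf "%s ", $0}'
# → one two three (with a trailing space, no final newline)
```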
A quick and easy way of validating a date in yyyy-mm-dd format and returning a boolean. The regex can easily be upgraded to handle ranges for mm or dd, or to validate other types of strings, e.g. an IP address.
Boolean output could easily be piped into a condition for a more complete one-liner.
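A minimal sketch of the idea, wrapped in a hypothetical is_date helper (the original one-liner may differ). Note this checks the format only, not that the date actually exists:

```shell
# Print "true" if the argument looks like yyyy-mm-dd, else "false":
is_date() {
    printf '%s\n' "$1" | grep -Eq '^[0-9]{4}-[0-9]{2}-[0-9]{2}$' \
        && echo true || echo false
}
is_date 2024-01-31   # → true
is_date 31/01/2024   # → false
```

The true/false output can then be fed straight into a condition, as the description suggests.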
This command adds the numbers 10, 12, 14, ... to a batch of mp3s in the current working directory. You can then run the command in another directory, replacing the initial i=10 with i=11, to add 11, 13, 15, ..., then mv the two groups of files together so the first set interleaves with the second. I used this to weave the backlog of one podcast with another so I didn't get sick of either while catching up. I started at 10 because printf blows up on the zero-padded numbers 08 and 09 (it reads them as invalid octal), which makes the printf somewhat redundant here, since its only job was padding 1-9 so they sort before the two-digit numbers.
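A sketch of the renaming step (file names made up; the original command may differ). For the second directory you would start at i=11 instead:

```shell
# Prefix each mp3 with 10, 12, 14, ... so two batches interleave when merged:
mkdir -p demo_mp3 && cd demo_mp3
touch a.mp3 b.mp3 c.mp3
i=10
for f in *.mp3; do
    mv "$f" "$(printf '%02d' "$i")-$f"
    i=$((i+2))
done
ls   # shows 10-a.mp3 12-b.mp3 14-c.mp3
```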
Returns the index of the last element in the array.
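In bash, assuming contiguous indices, this is simply the array length minus one (a sketch; the original one-liner may differ, and sparse arrays need the index list instead):

```shell
# Length of the array minus one gives the last index:
arr=(a b c d)
echo $(( ${#arr[@]} - 1 ))
# → 3
```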
Requires html2text. Print bad, but often funny commit messages from whatthecommit.com
If you have many screen sessions, it can be difficult to find the id of the one you just detached from so you can re-attach using `screen -x -S `
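One way to find the id is to list sessions with `screen -ls` and filter for "(Detached)". Demonstrated here on canned sample output so the sketch is self-contained:

```shell
# Simulated `screen -ls` output piped through awk; $1 of each matching
# line is the session id you can pass to `screen -x -S`:
printf 'There are screens on:\n\t12345.pts-0.host\t(Detached)\n\t23456.pts-1.host\t(Attached)\n' \
    | awk '/Detached/ {print $1}'
# → 12345.pts-0.host
```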
The only prerequisite is jq (and curl, obviously).
The other version used grep, but jq is far better suited to parsing JSON.
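To illustrate the point (the JSON below is made up, not the site's actual API response): pulling a field out of JSON is a one-liner with jq, where grep would need fragile regexes:

```shell
# -r prints the raw string instead of a quoted JSON value:
echo '{"command":"ls -la","votes":42}' | jq -r .command
# → ls -la
```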
This runs a command continuously, restarting it if it exits. Sort of a poor man's daemontools. Useful for running servers from the command line instead of inittab.
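A minimal sketch of such a restart loop. Here `sleep 0` stands in for a server that exits immediately, and the counter exists only so the demo terminates; drop both and substitute your real command for actual use:

```shell
# Restart the command every time it exits, pausing briefly between runs:
n=0
while true; do
    sleep 0                    # stand-in for your long-running command
    n=$((n+1))
    echo "command exited, restart #$n"
    if [ "$n" -ge 3 ]; then break; fi   # demo-only stop condition
    sleep 1
done
```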
This is a simple but useful command for searching for multiple terms in a file at once. It saves you from running multiple greps over the same file.
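A sketch of the idea using extended-regex alternation (the original command may spell it as egrep, the older name for grep -E):

```shell
# One pass over the input, matching any of the listed terms:
printf 'alpha\nbeta\ngamma\n' | grep -E 'alpha|gamma'
# → alpha
# → gamma
```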
One of my friends committed his code in the GB2312 encoding, which broke the build job. I had to find his files and convert them.
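The conversion step is typically done with iconv. Shown here as a round trip so the example is self-contained; for a real file you would run something like `iconv -f GB2312 -t UTF-8 in.java > out.java`:

```shell
# UTF-8 -> GB2312 -> UTF-8 round trip leaves the text unchanged:
printf '你好\n' | iconv -f UTF-8 -t GB2312 | iconv -f GB2312 -t UTF-8
# → 你好
```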
Sometimes you need two copies of data that lives in a tar archive. You could unpack it and then copy the result, but if IO is slow you can reduce it by writing the data twice (or more) in a single pass.
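A sketch of the single-pass trick with tee (archive and file names are made up): the archive is read once, a byte-for-byte copy is kept, and the contents are unpacked in the same pipeline. Give tee more filenames for more copies:

```shell
# Build a small sample archive, then duplicate-and-extract in one read:
mkdir -p demo_tar && echo data > demo_tar/file.txt
tar cf archive.tar demo_tar && rm -r demo_tar
cat archive.tar | tee copy.tar | tar xf -
cmp archive.tar copy.tar && cat demo_tar/file.txt
# → data
```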
Add -n to the last command to restrict it to the last num logins; otherwise it will pull all available history.