commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
This is what I came up with to generate an XKCD #936-style four-word password.
Since the first letter of every word is capitalized, it looks a bit more readable to my eyes.
Also strips single quotes.
And yes, the regex is a bit of a kludge, but it's the best I could think of.
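One possible shape for such a generator (a sketch, not the author's exact command; assumes GNU shuf and GNU sed — the \U in the replacement is a GNU extension — and a word list at /usr/share/dict/words, whose path varies by system):

```shell
# Pick four random words, strip single quotes, capitalize each word's
# first letter, and join them into a single string.
# WORDLIST path is an assumption; adjust for your system.
WORDLIST=/usr/share/dict/words
shuf -n 4 "$WORDLIST" | sed "s/'//g; s/^./\U&/" | tr -d '\n'; echo
```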
Outputs the real time it takes a Redis ping to run, in thousandths of a second, without any leading zeros. Useful for logging or scripted actions.
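The general pattern can be sketched in bash like this (redis-cli and a running server are assumptions; any command can be substituted for the ping):

```shell
# TIMEFORMAT='%3R' makes bash's time keyword print only the real time,
# with three decimal places; sed then strips the leading "0." so only
# the thousandths remain.
TIMEFORMAT='%3R'
elapsed=$( { time redis-cli ping > /dev/null; } 2>&1 )
echo "$elapsed" | sed 's/^0\.//'
```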
Normally, if you just want to see directories you'd use brianmuckian's command 'ls -d */', but I ran into problems trying to use that command in my script because there are often multiple directories per line. If you need to script something with directories and want to guarantee exactly one entry per line, this is the fastest way I know.
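find gives the same one-entry-per-line guarantee without parsing ls output at all; a sketch:

```shell
# List only directories, one per line; -maxdepth 1 stays in the current
# directory and -mindepth 1 excludes "." itself.
find . -mindepth 1 -maxdepth 1 -type d
```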
For instance, if people have signed your key, this will fetch the signers' keys.
If your version of curl does not support the --compressed option, use

curl -s http://funnyjunk.com | gunzip

instead of

curl -s --compressed http://funnyjunk.com
Using sed -i (in place), you can replace the beginning of the first line of a file without redirecting the output to a temporary location.
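For example (a sketch; the file name and text are illustrative, and the bare -i is GNU sed syntax — BSD sed needs -i ''):

```shell
# '1s/.../.../' anchors the substitution to line 1 only; -i rewrites
# the file in place instead of printing to stdout.
printf 'first line\nsecond line\n' > example.txt
sed -i '1s/^first/FIRST/' example.txt
cat example.txt
```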
Uses history to get the last n+1 commands (since this command itself will appear as the most recent), strips out the line numbers and this command using sed, and appends the remaining commands to a file.
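The sed half of the idea can be sketched on its own (history output is formatted as '  NNN  command'; the first expression strips the number, and '$d' drops the final entry, i.e. this command itself):

```shell
# Strip leading line numbers and drop the last (current) entry, as one
# would in: history | tail -n $((n + 1)) | sed ... >> saved_commands.txt
printf '  101  ls -l\n  102  pwd\n  103  history\n' \
  | sed -e 's/^ *[0-9]* *//' -e '$d'
```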
Using sed to extract lines in a text file
If you write bash scripts a lot, you are bound to run into a situation where you want to extract some lines from a file. Yesterday, I needed to extract the first line of a file, say named somefile.txt.
This specific task can be easily done with this:
head -1 somefile.txt
For a more complicated task, like extracting the second through third lines of a file, head alone is inadequate.
So, let's try extracting lines using sed: the stream editor.
My first attempt uses the p sed command (for print):
sed 1p somefile.txt
Note that it prints the whole file, with the first line printed twice. Why? The default output behavior is to print every line of the input file stream.
The explicit 1p command just tells it to print the first line .... again.
To fix it, you need to suppress the default output (using -n), making explicit prints the only way to print to default output.
sed -n 1p somefile.txt
Alternatively, you can tell sed to delete all but the first line.
sed '1!d' somefile.txt
'1!d' means: if a line is not (!) the first line, delete it.
Note that the single quotes are necessary; otherwise, the !d triggers history expansion and brings back the last command you executed that starts with the letter d.
To extract a range of lines, say lines 2 to 4, you can execute either of the following:
sed -n 2,4p somefile.txt
sed '2,4!d' somefile.txt
Note that the comma specifies a range (from the line before the comma to the line after).
What if the lines you want to extract are not in sequence, say lines 1 to 2, and line 4?
sed -n -e 1,2p -e 4p somefile.txt
Chrome only lets you export bookmarks in HTML format, with a lot of table junk; this command exports just the titles of the links and the links themselves, without all that extra junk.
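A sketch of the extraction step, assuming a Chrome-exported bookmarks.html (the filename is illustrative) and GNU sed (the I flag for case-insensitive matching and \t in the replacement are GNU extensions):

```shell
# Print "title<TAB>url" for every <A HREF="..."> tag, one per line.
sed -n 's/.*<A HREF="\([^"]*\)"[^>]*>\([^<]*\)<\/A>.*/\2\t\1/Ip' bookmarks.html
```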
urldecode files in the current directory
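One common bash idiom for the decode step (a sketch, not necessarily the original command; assumes bash and filenames without newlines): replace each % with \x, then let printf %b expand the escapes.

```shell
# Rename every file containing %XX escapes to its decoded form.
for f in *%*; do
  [ -e "$f" ] || continue
  mv -- "$f" "$(printf '%b' "${f//\%/\\x}")"
done
```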
The tcpdump arguments are just an example.
This command outputs a table of sighting opportunities for the International Space Station. Find the URL for your city here: http://spaceflight.nasa.gov/realdata/sightings/
Select a file/folder at random.
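One way to sketch it with shuf from GNU coreutils (-n 1 keeps a single entry; this assumes filenames without newlines):

```shell
# Shuffle the directory listing and keep one random entry.
ls -A | shuf -n 1
```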
The command squeezes runs of spaces within a file, leaving only a single space.
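tr -s (squeeze repeats) is the simplest way to express this; a sketch:

```shell
# Squeeze every run of spaces down to a single space.
printf 'too    many  spaces\n' | tr -s ' '
# prints: too many spaces
```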
Deletes capistrano-style release directories (except that there are dashes between the YYYY-MM-DD)
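A hedged sketch of the matching step with GNU find (-regextype is a GNU extension; the releases path is illustrative, and -print lets you review the matches before swapping in a delete):

```shell
# Match dated release directories named like 2013-05-01-120000.
find releases -mindepth 1 -maxdepth 1 -type d \
  -regextype posix-extended -regex '.*/[0-9]{4}-[0-9]{2}-[0-9]{2}.*' -print
```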
Same result with a simpler regular expression.
Generates the MD5 hash, without the trailing " -", and with the output "broken" into pairs of hex digits.
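The shape of the idea (a sketch, not necessarily the original command): md5sum appends " -" when reading stdin, the first sed expression drops it, and the second splits the digest into pairs.

```shell
printf 'hello' | md5sum | sed 's/ .*//' | sed 's/../& /g; s/ $//'
```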
This is just a slight alternative that wraps all of #7917 in a function that can be executed