commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
Renames files by eliminating a suffix; in this case everything after "-" is cut. Just replace "-" with the character you need.
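A minimal bash sketch of the same idea, assuming the delimiter is "-" and every matching file has an extension to preserve (the pattern and names are just examples, not the original command):

for f in *-*.*; do mv -- "$f" "${f%%-*}.${f##*.}"; done

So photo-20091231.jpg becomes photo.jpg.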
Chrome only lets you export bookmarks in HTML format, with a lot of table junk; this command exports just the link titles and the links themselves, without all that extra markup.
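A rough sketch of such an extraction, assuming a Chrome-exported bookmarks.html in the current directory (the filename and the "title URL" output format are assumptions):

sed -n 's/.*<A HREF="\([^"]*\)"[^>]*>\(.*\)<\/A>.*/\2 \1/p' bookmarks.html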
Processes the quantity of sold Biglion eBay coupons/bonus codes, so you can get an approximate count of users who bought the coupons and tell when sales come up again (a sketch of the polling loop follows below).
You can change the sleep parameter so the script works more slowly or faster (the default is 5 seconds).
Additional requirements: curl
Standard tools used: awk, date, cat, grep (bash)
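A hypothetical sketch of the polling loop; the URL and the "N sold" extraction pattern are assumptions, not the original script:

while true; do
    sold=$(curl -s 'http://example.com/deal-page' | grep -Eo '[0-9]+ sold' | awk '{print $1}')
    echo "$(date '+%Y-%m-%d %H:%M:%S')  sold=$sold"
    sleep 5    # raise or lower to poll more slowly or faster
done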
A bit shorter and parallelized. Depending on the speed of your CPU and your disk, this may run faster.
Parallel is from https://savannah.nongnu.org/projects/parallel/
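For illustration, a minimal way to farm work out to GNU Parallel (the md5sum workload is an assumed stand-in, not the original command):

find . -type f | parallel md5sum > checksums.md5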
This will show you the permissions on the directory you are currently in.
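One common way to do this, though not necessarily the command submitted:

ls -ld .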
Short and sweet command. It is also useful for other information, such as what IP address a particular user logged in from, how long they have been logged in, and what shell they use.
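For example, assuming the command under discussion is in the w/finger family (the user name alice is hypothetical):

w alice         # source host/IP, login time and idle time
finger alice    # adds the user's login shell and home directory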
Thanks for the submission! My alternative produces summaries only for directories. The original post additionally lists all files in the current directory; sometimes the files just clutter up the output. Once the big directory is located, *then* worry about which files are consuming so much space.
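A sketch of the directories-only variant, assuming GNU du and sort (the submitted alternative may use different flags):

du -sh -- */ | sort -h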
Replace the "-" with the character you wish. If you have multiple extensions, like jpeg, jpg and JPG, you could use
mmv "*-*.*" "#1.#3"
If you do not have shuf or an -R option in sort, you can fall back on awk. This provides maximum portability IMO. The command first collects words from the dictionary that match the criteria - in this case: lower case words with no punctuation that are 4 to 8 characters long. It then prints 4 random entries. I decided to print each word on a separate line to improve readability.
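A sketch of such an awk approach, assuming the word list lives at /usr/share/dict/words (not necessarily the exact command submitted):

awk 'length($0) >= 4 && length($0) <= 8 && /^[a-z]+$/ {w[n++]=$0} END {srand(); for (i=0; i<4; i++) print w[int(rand()*n)]}' /usr/share/dict/words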
The command was too long for the command box, so here it is:
echo $(( `wget -qO - http://i18n.counter.li.org/ | grep 'users registered' | sed 's/.*<font size=7>//g' | tr '>' ' ' | sed 's/<br.*//g' | tr -d ' '` + `curl --silent http://www.dudalibre.com/gnulinuxcounter?lang=en | grep users | head -2 | tail -1 | sed 's/.*<strong>//g' | sed 's/<\/strong>.*//g'` ))
This took me about an hour to do. It uses both wget and curl because dudalibre.com blocks wget, while wget worked nicely on the other site.
Grabs the current weather in your area (or their best guess of your area). Change the query to your zip code/location (e.g. google.com/search?q=weather+jakarta,+indonesia) to get weather somewhere else. Change google.com to google.ca or google.co.uk for metric.
List all disks and all of their partitions on OS X. http://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/man8/diskutil.8.html
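The verb in question is presumably:

diskutil list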
A slightly shorter version. It also doesn't put a trailing newline at the end of the password.
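One sketch of a generator with that property, assuming the goal is a random alphanumeric password (head -c 12 emits exactly 12 bytes and no trailing newline; the length is just an example):

tr -dc 'A-Za-z0-9' < /dev/urandom | head -c 12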
If you know the URL of a file on a SharePoint server, it's just a matter of logging in with your AD credentials to get the file with cURL.
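A sketch with assumed placeholders (DOMAIN, username and the file URL are all hypothetical; --ntlm is a common choice for AD-backed SharePoint, and curl will prompt for the password):

curl --ntlm -u 'DOMAIN\username' -O 'https://sharepoint.example.com/sites/team/Shared%20Documents/report.xlsx'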