Extract all href links from an HTML document with sed and grep
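The command itself is not reproduced in the listing, so here is a minimal sketch of the technique, assuming the hrefs are double-quoted; `page.html` is a placeholder file created just for the demonstration.

```shell
# Hypothetical input file for the demonstration.
cat > page.html <<'EOF'
<a href="http://example.com/a">A</a>
<p>no link here</p>
<a href="/relative/b">B</a>
EOF
# grep isolates the href="..." attributes; sed strips the wrapper.
grep -o 'href="[^"]*"' page.html | sed 's/^href="//; s/"$//'
```

This prints http://example.com/a and /relative/b, one per line.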
Linux only
- The last sed expression ensures the unicast/multicast bit is set to zero.
- The greedy space replacements are for portability across UNIX seds (note there are TWO spaces, not just one as this page renders it, in s/^  */).
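The command these notes refer to is not shown; a hedged reconstruction of the idea (random MAC address generation) might look like the following — the od/sed pipeline and exact patterns are assumptions, not the original:

```shell
# Sketch: 6 random bytes -> colon-separated hex pairs. The space-handling
# patterns each contain TWO spaces for portability across UNIX seds, and
# the last expression rewrites an odd second hex digit to 0, clearing the
# unicast/multicast bit of the first octet.
head -c6 /dev/urandom | od -An -tx1 \
  | sed 's/^  *//; s/  */:/g' \
  | sed 's/^\(.\)[13579bdf]/\10/'
```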
This command finds the processes with the highest context-switch counts on a server and gives you the process listing.
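The original command is not shown; one way to sketch the idea on Linux is via /proc (the field name and file layout are Linux-specific assumptions):

```shell
# For every process, print the line holding its voluntary context-switch
# count together with the /proc path that identifies it, highest first.
grep -H '^voluntary_ctxt_switches' /proc/[0-9]*/status 2>/dev/null \
  | sort -t: -k3 -rn | head -5
```

Each output line looks like /proc/PID/status:voluntary_ctxt_switches:COUNT; the PIDs can then be fed to ps for a fuller listing.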
This probably only works without modifications in RHEL/CentOS/Fedora.
Get the current CPU % usage on your system.
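The command itself is missing from the listing; a rough sketch that derives an overall CPU percentage from two /proc/stat samples taken one second apart (Linux-only, and only an approximation — it ignores iowait and the other fields):

```shell
# Sample cumulative user/nice/system/idle jiffies twice, one second
# apart, and report busy time as a percentage of the total delta.
cpu_pct() {
  read -r _ u1 n1 s1 i1 _ < /proc/stat
  sleep 1
  read -r _ u2 n2 s2 i2 _ < /proc/stat
  busy=$(( (u2 + n2 + s2) - (u1 + n1 + s1) ))
  total=$(( busy + (i2 - i1) ))
  echo $(( 100 * busy / total ))
}
cpu_pct
```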
This command renames all files in a folder, changing every dot in the filename to a dash while preserving the final dot before the extension.
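The renaming command is not reproduced here; a sketch of the same behaviour with a plain shell loop (the demo/ directory and filenames are invented for illustration):

```shell
mkdir -p demo
touch demo/one.two.three.txt demo/plain.txt
# Split off the extension, dash-ify the remaining dots, and reassemble.
for f in demo/*.*; do
  base=${f%.*} ext=${f##*.}
  new=$(printf '%s' "$base" | tr . -).$ext
  [ "$f" = "$new" ] || mv -- "$f" "$new"
done
ls demo
```

Here one.two.three.txt becomes one-two-three.txt, while plain.txt is left alone.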
No params
Good for when you're working on building a clean source install for RPM packaging or what have you. After testing, run this command to compare the original extracted source against your working source directory; it will remove the differences created by running './configure' and 'make'.
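The command itself is not shown; a hedged sketch of the idea follows, where "orig/" is a pristine extracted tree and "build/" the working tree (both placeholder names, created here for the demonstration; filenames containing spaces would need extra care):

```shell
mkdir -p orig build
touch orig/common.c build/common.c build/config.status
# diff -rq reports files present only in build/; awk rebuilds their
# paths and xargs removes them, leaving build/ matching orig/.
diff -rq orig build \
  | awk '/^Only in build/ { sub(/:$/, "", $3); print $3 "/" $4 }' \
  | xargs rm -f
```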
Tries to avoid the fragile nature of scrapers by looking for user input in the output rather than for markup or headers on the web site.
This fixes the extra lines you get when you request only one paragraph, using a little bit of grep. Just set p to the number of paragraphs you want.
Retrieves the current WAN IPv4 address via checkip.dyn.com.
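The listing doesn't include the command; the usual shape is a curl piped through sed. The sed expressions below are assumptions about the page's exact wording, demonstrated offline on a canned copy of the response:

```shell
# Real usage (needs network access):
#   curl -s http://checkip.dyn.com/ | sed -e 's/.*Current IP Address: //' -e 's/<.*$//'
# Offline demonstration on a canned response body:
echo '<html><body>Current IP Address: 93.184.216.34</body></html>' \
  | sed -e 's/.*Current IP Address: //' -e 's/<.*$//'
```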
Find files recursively from the current directory and list the extensions of the files uniquely.
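The command itself is absent; a minimal sketch with find, sed, and sort (the tree/ directory and filenames are invented for the demonstration):

```shell
# Demo tree with invented names.
mkdir -p tree/sub
touch tree/a.txt tree/b.txt tree/sub/c.log tree/sub/d
# Keep only files that have an extension, strip everything up to the
# last dot, and de-duplicate.
find tree -type f -name '*.*' | sed 's/.*\.//' | sort -u
```

For this tree it prints "log" and "txt", one per line.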
Take a screenshot after a pause of $1 seconds (to choose what to capture), then upload it and get the URI of the post on ompdlr.org.
This command downloads the current 20 most popular pictures from the website 500px. It uses random names because the pictures on 500px are stored under the same name. UPDATED: it doesn't work if no referrer is specified: --referer='http://500px.com/'
commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.