Commands by renan2112 (1)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

Get all URLs from webpage via Regular Expression
Gets all URLs from a website via a regular expression. You must have lynx installed on your computer to execute the command: $ lynx --dump "<website>" | egrep -o "<regex>" Substitute <website> with the path of the website you want to extract the URLs from, and <regex> with the regular expression you want to filter them with.
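For example, to list every absolute http/https link on a page (a hypothetical invocation, with example.com as a stand-in):

# lynx renders the page and appends a list of its links; egrep -o prints only the URL matches
$ lynx --dump "http://example.com" | egrep -o "https?://[^ ]+"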

Convert seconds to [DD:][HH:]MM:SS
Converts any number of seconds into days, hours, minutes and seconds.

sec2dhms() {
  declare -i SS="$1"           # force integer arithmetic on the argument
  D=$(( SS / 86400 ))          # whole days
  H=$(( SS % 86400 / 3600 ))   # leftover hours
  M=$(( SS % 3600 / 60 ))      # leftover minutes
  S=$(( SS % 60 ))             # leftover seconds
  [ "$D" -gt 0 ] && echo -n "${D}:"
  [ "$D" -gt 0 -o "$H" -gt 0 ] && printf "%02g:" "$H"   # always print hours when days are shown
  printf "%02g:%02g\n" "$M" "$S"
}
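For example, sec2dhms 100000 prints 1:03:46:40 (1 day, 3 hours, 46 minutes, 40 seconds), while sec2dhms 2800 prints just 46:40.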

Sort by IP address
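The command itself is missing from this listing; a minimal sketch using GNU sort, treating each octet of the address as its own numeric sort key (ips.txt is a hypothetical input file):

# -t . splits fields on dots; each -kN,Nn sorts the Nth octet numerically
$ sort -t . -k1,1n -k2,2n -k3,3n -k4,4n ips.txt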

Shows you how many hours of avi video you have.
midentify.sh is part of mplayer, but you might have to locate it on your box.
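One way to put it together (a sketch, assuming midentify.sh emits an ID_LENGTH=<seconds> line for each file):

# sum the length of every .avi in seconds, then convert to hours
$ find . -iname '*.avi' -exec midentify.sh {} + | awk -F= '/^ID_LENGTH/ {s += $2} END {print s/3600 " hours"}'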

HTTP GET requests in Wireshark, remotely
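The command is missing here; a common pattern (user@remotehost is hypothetical) is to stream a remote tcpdump capture into a local Wireshark over ssh, then isolate the GETs with the display filter http.request.method == "GET":

# tcpdump: -U flushes each packet, -s0 captures full packets, -w - writes the pcap to stdout
# (ssh traffic is excluded so the capture doesn't feed back on itself);
# wireshark: -k starts capturing immediately, -i - reads from stdin
$ ssh user@remotehost 'tcpdump -U -s0 -w - not port 22' | wireshark -k -i -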

Print the contents of $VARIABLE, six words at a time
Print out the contents of $VARIABLE, six words per line, ignoring any single or double quotes in the text. Useful when $VARIABLE contains a sentence that changes periodically, and may or may not contain quoted text.
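The command didn't survive in this dump; a minimal sketch (the quotes are deleted first, since xargs would otherwise try to parse them):

# xargs -n 6 runs echo with six words at a time, producing one line per batch
$ echo "$VARIABLE" | tr -d "'\"" | xargs -n 6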

Convert multiple flac files to mp3
Make sure that flac and lame are installed: sudo apt-get install lame flac
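The conversion itself could look like this (a sketch; it decodes each .flac to stdout and pipes it into lame, and does not carry tags over):

# flac: -c writes to stdout, -d decodes; lame reads the wav from stdin (-)
$ for f in *.flac; do flac -cd "$f" | lame -h - "${f%.flac}.mp3"; done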

Size (in bytes) of all RPM packages installed
This command outputs the size of every installed RPM package and strings them together into one enormous addition expression, which is then evaluated by echo $(( ))
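A sketch of that idea (%{SIZE} is rpm's per-package size tag, in bytes; the trailing 0 terminates the final +):

$ echo $(( $(rpm -qa --queryformat '%{SIZE}+')0 ))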

Find out how old a web page is
I used to use the Firefox "View page info" feature a lot to determine how stale the web page I was looking at was. Now that I mostly use Chrome I miss that feature, so here is a command-line alternative using wget. The -S flag displays the server response, and --spider says not to download any files/pages, just fetch the header. The output goes to stderr, so to grep it you use 2>&1 to combine the stderr stream with stdout, then pipe that to grep for Last-Modified. You can use curl instead if you have it installed, like this: $ curl --head -s http://osswin.sourceforge.net | grep Mod
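Assembled from the flags described above, the wget version would be (same URL as the curl example):

$ wget -S --spider http://osswin.sourceforge.net 2>&1 | grep Last-Modified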

Hardlink all identical files in the current directory (regain some disk space)
Meaning of the switches (see the man page too):
-v  verbose
-p  ignore mode (permissions)
-o  ignore owner and group
-t  ignore time of modification
Disadvantage: if you modify any of the linked files, the change propagates to every other file that shares the same data on disk.
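Putting those switches together on the current directory (a sketch, assuming the hardlink(1) utility the switches above belong to):

$ hardlink -v -p -o -t .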

