Commands by kyle0r (4)

  • In this example, where the user's GPG keyring has a passphrase, the user is prompted for it interactively; if the keyring has no passphrase, the command runs the same way, sans the prompt, making it suitable for cron jobs. ~/.gnupg/passwd/http-auth.gpg is the encrypted HTTP auth password for this particular wget use case. The approach has many use cases; ready-to-use example bash functions are shown after this entry.


    wget --input-file=~/downloads.txt --user="$USER" --password="$(gpg2 --decrypt ~/.gnupg/passwd/http-auth.gpg 2>/dev/null)"
    kyle0r · 2012-12-13 00:14:55
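
    A minimal sketch of the helper functions from the description, suitable for ~/.bashrc (the http-auth.gpg filename is the example used above):

      # Helper functions from the description; decrypt a stored password to stdout.
      http_auth_pass() { gpg2 --decrypt ~/.gnupg/passwd/http-auth.gpg 2>/dev/null; }
      decrypt_pass()   { gpg2 --decrypt ~/.gnupg/passwd/"$1" 2>/dev/null; }

      # Usage with the wget command above:
      wget --input-file=~/downloads.txt --user="$USER" --password="$(http_auth_pass)"
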
  • From the cwd, recursively find all rar files, extracting each archive into the directory where it was found rather than into the cwd. A nice time saver if you've used wget or similar to mirror something where each subdirectory contains a rar archive. It's likely this can be tuned to work with multi-part archives where all parts use ambiguous .rar extensions, but I didn't test this; perhaps unrar would handle it gracefully anyway? An untested sketch for that case follows this entry.


    find . -name '*.rar' -execdir unrar e {} \;
    kyle0r · 2012-09-27 02:27:03
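
    An untested sketch for multi-part sets, assuming GNU find, the unrar CLI, and volumes named *.part1.rar, *.part2.rar, and so on (that naming convention is an assumption, not from the original). Extracting only the first volume lets unrar pull in the remaining parts itself, so each set is extracted once:

      # Hypothetical and untested: extract each multi-part set via its first volume...
      find . -name '*.part1.rar' -execdir unrar e {} \;
      # ...then handle standalone archives separately, skipping all .partN. volumes.
      find . -name '*.rar' ! -name '*.part[0-9]*.rar' -execdir unrar e {} \;
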
  • The find -printf "%f\n" action prints just the file name (basename) from each matched path, so directory components that happen to contain dots are never counted as extensions. The pipeline then tallies how many files carry each extension. A reusable function wrapper follows this entry.


    find /some/path -type f -and -printf "%f\n" | egrep -io '\.[^.]*$' | sort | uniq -c | sort -rn
    kyle0r · 2012-04-02 19:25:35
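
    The same pipeline wrapped as a function for ~/.bashrc; the name ext_histogram is made up here, not from the original, and the path argument defaults to the cwd:

      # Count files per extension under a path (default: current directory).
      ext_histogram() {
          find "${1:-.}" -type f -printf "%f\n" | egrep -io '\.[^.]*$' | sort | uniq -c | sort -rn
      }
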
  • In this example, the command recursively finds files (-type f) under /some/path whose paths end in .mp3, case insensitive (-iregex). -print0 then emits the results as one unbroken stream, each result terminated by the null character (octal 000), suitable for piping to xargs -0. This type of output avoids issues with garbage in paths, like unclosed quotes. The tr command strips away everything but the null characters, and the final wc -c counts them, one per result. I have found this very useful to verify one is getting the right number of results before actually processing them through xargs or similar. Yes, one can issue the find without -print0 and use wc -l, but if you want to be 1000% sure your find command is giving you the expected number of results, this is a simple way to check. The approach can be made into a function and included in .bashrc or similar, e.g. count_chars() { tr -d -c "$1" | wc -c; } which, in this form, provides a versatile character counter for text streams :) A usage sketch follows this entry.


    find /some/path -type f -and -iregex '.*\.mp3$' -and -print0 | tr -d -c '\000' | wc -c
    kyle0r · 2012-03-31 21:57:33
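
    The count_chars function from the description, with a usage sketch that pairs it with the find above (counting one NUL per result):

      # From the description: delete everything except the given characters, then count.
      count_chars() { tr -d -c "$1" | wc -c; }

      # Verify the result count before handing the same find output to xargs -0:
      find /some/path -type f -iregex '.*\.mp3$' -print0 | count_chars '\000'
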


