Commands by jangdong (0)

  • bash: commands not found

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

Sort disk usage from directories
Sort disk usage from directories in the current directory
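One common way to do this (a sketch, not necessarily the exact command posted on the site) combines du with a human-readable sort:

$ du -sh */ | sort -h    # per-directory totals, smallest first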

Get the full path of a bash script's Git repository head.
Rather than using complicated and fragile paths relative to a script, like "../../other", this command retrieves the full path of the head of the Git repository containing the script. It is safe with spaces in directory names and works within a symlinked directory. Broken down: $cd "$(dirname "${BASH_SOURCE[0]}")" temporarily changes directories within this expansion; quoting both the outer "$(dirname ...)" and the inner "${BASH_SOURCE[0]}" keeps paths with spaces intact. $git rev-parse --show-toplevel then prints the full path of the repository head of the current working directory, which was temporarily changed by the "cd".
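A minimal sketch of the command the description walks through (the REPO_ROOT variable name is illustrative, not part of the original):

$ REPO_ROOT="$(cd "$(dirname "${BASH_SOURCE[0]}")" && git rev-parse --show-toplevel)"

Because the cd happens inside the command substitution's subshell, the script's own working directory is left unchanged.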

Find all videos under current directory
Uses the MIME type of files rather than relying on file extensions to find files of a certain type. This can obviously be extended to finding files of any other type as well, like plain text files, audio, etc. In reference to displaying the total hours of video (which was posted earlier on commandlinefu but relied on the user supplying all possible video file formats), we can now do better: $ find ./ -type f -print0 | xargs -0 file -iNf - | grep video | cut -d: -f1 | xargs -d'\n' /usr/share/doc/mplayer/examples/midentify | grep ID_LENGTH | awk -F "=" '{sum += $2} END {print sum/60/60; print "hours"}'
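The basic MIME-type search referenced above can be sketched with standard file(1) options (the 'video/' match and paths are illustrative):

$ find . -type f -exec file --mime-type {} + | grep ': video/'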

list files recursively by size
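A sketch using GNU find (the format string is an assumption, not necessarily the posted command):

$ find . -type f -printf '%s\t%p\n' | sort -n    # size in bytes, then path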

apt-get upgrade with bandwidth limit
In Debian-based systems, apt-get can be limited to a specified bandwidth in kilobytes using the APT configuration options (man 5 apt.conf, man apt-get). To quote man 5 apt.conf: "The used bandwidth can be limited with Acquire::http::Dl-Limit which accepts integer values in kilobyte. The default value is 0 which deactivates the limit and tries uses as much as possible of the bandwidth..." "HTTPS URIs. Cache-control, Timeout, AllowRedirect, Dl-Limit and proxy options are the same as for http..."
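For example, the limit can be passed on the command line with -o (the value 25 here is illustrative):

$ sudo apt-get -o Acquire::http::Dl-Limit=25 upgrade    # cap downloads at roughly 25 kB/s

or set permanently in a file under /etc/apt/apt.conf.d/ with the line: Acquire::http::Dl-Limit "25";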

Copy via tar pipe while preserving file permissions (cp does not by default; run this command as root!)
cp options: -p preserves the file mode, ownership, and timestamps; -r copies files recursively. Also, if you want to keep symlinks in addition to the above, use the -a/--archive option.
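A sketch of the tar pipe itself (the source and destination paths are placeholders):

$ (cd /source/dir && tar cf - .) | (cd /dest/dir && tar xpf -)

The -p on the extracting side preserves permissions; running as root also preserves ownership.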

Convert CSV to JSON
Replace 'csv_file.csv' with your filename.
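The conversion is typically a Python one-liner along these lines (a sketch; it emits a JSON array of row arrays):

$ python3 -c "import csv, json; print(json.dumps(list(csv.reader(open('csv_file.csv')))))"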

pretend to be busy in office to enjoy a cup of coffee
combination of several of the above

diff two unsorted files without creating temporary files
bash/ksh subshell redirection (as file descriptors) used as input to diff
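In bash this is process substitution, roughly (file1 and file2 are placeholder names):

$ diff <(sort file1) <(sort file2)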


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively, so that only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
