All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.



Check These Out

Encrypt and password-protect execution of any bash script
(Please see sample output for usage.) script.bash is your script, which will be encrypted to script.secure (script.bash --> script.secure). You can execute script.secure only if you know the password; if you die, your script dies with you. If you modify the startup line, be careful with the offset calculation of the encrypted block (the XX string). It wouldn't be difficult to make the script editable (an offset dd piped to gpg -d, piped to vim -, piped to gpg -c writing script.new), but there isn't enough space to do it in a one-liner.
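The submitted one-liner builds a self-decrypting script.secure; as a simpler illustration of the same idea (not the original command, and the file names are placeholders), the encrypt/run cycle with gpg looks roughly like this:

gpg -c --output script.secure script.bash    # symmetric encryption, prompts for a passphrase
gpg -d script.secure 2>/dev/null | bash      # decrypts and runs only if you know the passphrase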

Fastest segmented parallel sync of a remote directory over ssh
Mirror a remote directory using some tricks to maximize network speed.

lftp: coolest file transfer tool ever
  -u: username and password (the password is merely a placeholder if you have ~/.ssh/id_rsa)
  -e: execute internal lftp commands
  set sftp:connect-program: use a specific command instead of plain ssh

ssh:
  -a -x -T: disable agent forwarding, X11 forwarding and pseudo-tty allocation
  -c arcfour: use the most CPU-efficient cipher specification
  -o Compression=no: disable compression to save CPU

mirror: copy the remote directory subtree to a local directory
  -v: be verbose (a cool progress bar and speed meter for each file transferred in parallel)
  -c: continue interrupted file transfers if possible
  --loop: repeat the mirror until no differences are found
  --use-pget-n=3: transfer each file over 3 independent parallel TCP connections
  -P 2: transfer 2 files in parallel (6 TCP connections in total)

sftp://remotehost:22: use the sftp protocol on port 22 (you can give any other port if appropriate)

You can play with the values of --use-pget-n and/or -P to achieve maximum speed on your particular network. If the files are compressible, removing "-o Compression=no" can be beneficial. It's best to create an alias for the command.
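Putting the pieces described above together, the invocation looks roughly like this (remotehost, user/pwd and the directory paths are placeholders, and arcfour may be unavailable in recent OpenSSH builds):

lftp -u user,pwd -e 'set sftp:connect-program "ssh -a -x -T -c arcfour -o Compression=no"; mirror -v -c --loop --use-pget-n=3 -P 2 /remote/dir /local/dir; quit' sftp://remotehost:22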

Print stack trace of a core file without needing to enter gdb interactively
This does almost the same thing as the original, but it runs the full backtrace for _all_ the threads, which is important when reporting a crash in multithreaded software, since more often than not the signal handler runs in a different thread from the one where the crash happened.
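A typical non-interactive invocation matching this description (the binary and core file names are placeholders) is:

gdb --batch -ex 'thread apply all bt full' ./myprogram core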

Convert seconds to [DD:][HH:]MM:SS
Converts any number of seconds into days, hours, minutes and seconds.

sec2dhms() {
  declare -i SS="$1"
  D=$(( SS / 86400 ))
  H=$(( SS % 86400 / 3600 ))
  M=$(( SS % 3600 / 60 ))
  S=$(( SS % 60 ))
  [ "$D" -gt 0 ] && echo -n "${D}:"
  # print the hours field whenever days are shown, so the output stays unambiguous
  [ "$D" -gt 0 ] || [ "$H" -gt 0 ] && printf "%02d:" "$H"
  printf "%02d:%02d\n" "$M" "$S"
}
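For example, sec2dhms 93784 prints 1:02:03:04.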

Debug redirects between production reloads
Watches the headers returned by curl, following any redirects and printing only the HTTP status line and the Location header of each redirect.
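One way to do this (the URL is a placeholder, and this is a sketch rather than the submitted command):

curl -sIL http://example.com/ | grep -iE '^(HTTP|Location)'

Wrap it in watch -n1 '...' to keep an eye on the redirects between reloads.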

Install pip with Proxy
Installs pip packages through a proxy.
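pip's --proxy option covers this; a sketch with placeholder proxy host, port, credentials and package name:

pip install --proxy http://user:password@proxyserver:8080 somepackage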

list with full path
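The submitted command isn't reproduced here; one common way to list the contents of the current directory with absolute paths is:

ls -d "$PWD"/*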

bash-quine
This command prints itself out. It doesn't need to be stored in a file, and it isn't a cheap trick like $ echo $BASH_COMMAND. For information on quines, see http://en.wikipedia.org/wiki/Quine_(computing)
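One possible bash quine (not necessarily the one submitted) stores a printf format string in a variable and prints it through itself:

s='s=\47%s\47; printf "$s" "$s"\n'; printf "$s" "$s"

The \47 octal escapes reproduce the single quotes, and the %s argument is printed without escape processing, so the output is exactly the command itself.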

Paste command output to www.pastehtml.com in txt format.
paste(){ curl -s -S --data-urlencode "txt=$("$@")" "http://pastehtml.com/upload/create?input_type=txt&result=address"; echo; }
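For example, paste uname -a would post the output of uname -a and print the resulting URL; the pastehtml.com service may no longer be available, so this is illustrative only.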

Convert all .wav to .mp3
Audio Convert is a script that uses zenity and lame to transcode virtually any audio format to any other, provided you have the libraries installed to do so.
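Without the wrapper script, a plain bash loop over lame handles the .wav-to-.mp3 part (the VBR quality setting -V2 is an arbitrary choice here):

for f in *.wav; do lame -V2 "$f" "${f%.wav}.mp3"; done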


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that receive a minimum of 3 and of 10 votes, so that only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
