All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.



Check These Out

Clone all repos from a user with lynx
https://wuseman.github.io/wcloner/
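The command itself didn't survive into this listing; a minimal sketch in the spirit of the title, assuming a public GitHub profile and treating USERNAME as a placeholder (the linked wcloner project is the complete tool), might look like:

$ lynx -listonly -nonumbers -dump "https://github.com/USERNAME?tab=repositories" \
    | grep -E '^https://github\.com/USERNAME/[^/?#]+$' \
    | xargs -n1 git clone

The grep pattern depends on the page layout, and the repositories tab is paginated, so a real script would loop over the pages.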

m4a to mp3 conversion with ffmpeg and lame
A batch version of the same command, converting every .m4a in the current directory, would be: for f in *.m4a; do ffmpeg -i "$f" -acodec libmp3lame -ab 256k "${f%.m4a}.mp3"; done
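The single-file command that loop batches would look something like the following (filenames here are placeholders):

$ ffmpeg -i input.m4a -acodec libmp3lame -ab 256k output.mp3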

Brute force discover
Show the number of failed login attempts per account. If the user does not exist, it is marked with *.
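The command itself is missing from this listing; a minimal sketch, assuming sshd logs "Failed password" lines to a Debian-style /var/log/auth.log, could tally attempts per account and tag non-existent ("invalid") users with *:

$ sudo awk '/Failed password/ {
    if ($0 ~ /invalid user/) { for (i=1; i<NF; i++) if ($i == "user") { print $(i+1) " *"; break } }
    else { for (i=1; i<NF; i++) if ($i == "for") { print $(i+1); break } }
  }' /var/log/auth.log | sort | uniq -c | sort -rn

On Red Hat-style systems the log is /var/log/secure instead.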

Convert all .flac files in a folder subtree to 192 kbps MP3
find . -type f -iname '*.flac' | while read FILE; do FILENAME="${FILE%.*}"; flac -cd "$FILE" | lame -b 192 - "${FILENAME}.mp3"; done
find searches recursively from the current folder for .flac audio files; each matching path (relative to ./) is piped to the while loop. For each line, FILE holds the path with its .flac extension, FILENAME is FILE without the extension, and flac -cd decodes the file to stdout, which lame encodes to a 192 kbps MP3 named ${FILENAME}.mp3.

Create a new file
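No command is attached to this entry; the two classic ways to create an empty file (newfile is a placeholder name) are touch and output redirection:

$ touch newfile   # creates newfile if it does not exist, otherwise updates its timestamp
$ : > newfile     # redirection also creates the file (and truncates it if it already exists)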

Validate and pretty-print JSON expressions.
You can use a site like http://www.jsonlint.com/, or validate your long and complex JSON data from the command line. The simplejson.tool module is part of the simplejson package for Python (http://undefined.org/python/#simplejson). Example with an invalid JSON expression:
$ echo '{ 1.2:3.4}' | python -m simplejson.tool
Expecting property name: line 1 column 2 (char 2)
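For valid input the same tool pretty-prints instead of complaining, and on current systems the standard library's json.tool does the job without installing simplejson:

$ echo '{"foo": "lorem", "bar": "ipsum"}' | python -m json.tool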

The fastest remote directory rsync over ssh archival I can muster (40MB/s over 1gb NICs)
This creates an archive that does the following:

rsync:
(Everyone seems to like -z, but it is much slower for me)
-a: archive mode - recursive; preserves owner, permissions, modification times and group; copies symlinks as symlinks; preserves device files.
-H: preserves hard links
-A: preserves ACLs
-X: preserves extended attributes
-x: don't cross file-system boundaries
-v: increase verbosity
--numeric-ids: don't map uid/gid values by user/group name
--delete: delete extraneous files from destination dirs (differential clean-up during sync)
--progress: show progress during transfer

ssh:
-T: turn off pseudo-tty to decrease CPU load on the destination.
-c arcfour: use the weakest but fastest SSH encryption. Must specify "Ciphers arcfour" in sshd_config on the destination.
-o Compression=no: turn off SSH compression.
-x: turn off X forwarding if it is on by default.

Flip: rsync -aHAXxv --numeric-ids --delete --progress -e "ssh -T -c arcfour -o Compression=no -x" [source_dir] [dest_host:/dest_dir]
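The entry's main command is not shown above; since "Flip" pushes to a remote destination, the original presumably pulled from a remote source, along these lines (the bracketed paths are placeholders):

$ rsync -aHAXxv --numeric-ids --delete --progress -e "ssh -T -c arcfour -o Compression=no -x" [source_host:/source_dir] [dest_dir]

Note that the arcfour cipher has been removed from recent OpenSSH releases, so a current setup has to pick a different fast cipher.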

use jq to validate and pretty-print json output
The `jq` tool can also be used to validate JSON files and pretty-print the output: `cat file.json | jq`. It is available on several platforms, including newer Debian-based systems via `sudo apt install jq`, macOS via `brew install jq`, and from source at https://stedolan.github.io/jq/download/
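If your jq build insists on an explicit filter, the identity filter does the same job, and jq can read the file directly:

$ jq . file.json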

Run remote web page, but don't save the results
I have a remote PHP file that I want to run once an hour, so I set up cron to run this wget. I don't really care about what's in the file, though, and I don't want to save the results, so I pass -O and send the output to /dev/null.
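The wget line itself is not shown here; the invocation described would be roughly the following (the URL is a placeholder, and -q keeps cron from mailing the progress output):

$ wget -q -O /dev/null http://example.com/hourly-job.php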

Which processes are listening on a specific port (e.g. port 80)
swap out "80" for your port of interest. Can use port number or named ports e.g. "http"


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…).
