All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.



Check These Out

List latest 5 modified files recursively
The output format is given by the -printf parameter:

%T@ = modify time in seconds since Jan. 1, 1970, 00:00 GMT, with fractional part. Mandatory; it is used only for sorting and trimmed off at the end.
%TY-%Tm-%Td %TH:%TM:%.2TS = modify time as YYYY-MM-DD HH:MM:SS. Optional.
%p = file path

Refer to http://linux.die.net/man/1/find for more about -printf formatting.

sort -nr = sort numerically in reverse order (higher values, i.e. the most recent timestamps, first)
head -n 5 = keep only the first 5 lines (change 5 to whatever you want)
cut -f2- -d" " = trim the first field (the timestamp, used only for sorting)

Very useful for building scripts that detect malicious file uploads and malware injections.
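
Putting the pieces described above together (the command itself wasn't captured on this page), a reconstruction along these lines should work; the starting directory is an assumption:

$ find . -type f -printf '%T@ %TY-%Tm-%Td %TH:%TM:%.2TS %p\n' | sort -nr | head -n 5 | cut -f2- -d" "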

Detect illegal access to kernel space, potentially useful for Meltdown detection
Based on the capsule8 agent examples; not rigorously tested.
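
The command itself isn't shown on this page. As a rough illustration only: Meltdown probing shows up as userspace page faults on kernel addresses, so a perf-based sketch might look like the line below. The tracepoint name, the filter field and the x86_64 kernel address boundary are all assumptions, not the submitted command:

$ sudo perf record -e exceptions:page_fault_user --filter 'address >= 0xffff800000000000' -a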

To print a specific line from a file
Just one character longer than the sed version ('FNR==5' versus -n 5p). On my system, without using "exit" or "q", the awk version is over four times faster on a ~900K file, using the following timing comparison:

$ testfile="testfile"; for cmd in "awk 'FNR==20'" "sed -n '20p'"; do echo; echo $cmd; eval "$cmd $testfile"; for i in {1..3}; do time for j in {1..100}; do eval "$cmd $testfile" >/dev/null; done; done; done

Adding "exit" or "q" made the difference between awk and sed negligible, and produced a four-fold improvement over the awk timing without the "exit". For long files, an exit can speed things up:

$ awk 'FNR==5{print;exit}'
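
For comparison, the equivalent early-quit in sed (the filename here is illustrative):

$ sed -n '5{p;q}' file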

list files recursively by size
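
The command itself wasn't captured here; a sketch of one common approach using find, not necessarily the submitted command:

$ find . -type f -printf '%s %p\n' | sort -n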

Backup all MySQL Databases to individual files
Backs up all databases, excluding test, mysql, performance_schema and information_schema. Requires GNU parallel; install it on Ubuntu by running: sudo aptitude install parallel
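
The command itself wasn't captured on this page; a sketch of the described pipeline, assuming credentials are supplied via ~/.my.cnf:

$ mysql -s --skip-column-names -e 'show databases' | grep -Ev '^(test|mysql|performance_schema|information_schema)$' | parallel "mysqldump {} > {}.sql"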


Convert images to a multi-page pdf
The Linux package imagemagick is required for this command.
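
With ImageMagick installed, the typical form is a single convert invocation (the input filenames are illustrative):

$ convert *.jpg output.pdf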

Squish repeated delimiters into one
This can be particularly useful in conjunction with a following cut command, like

$ echo "hello::::there" | tr -s ':' | cut -d':' -f2

which prints 'there'. Much easier than guessing at -f values for cut. I know 'tr -s' is used in lots of commands here already, but I just figured out the -s flag and thought it deserved to be highlighted :)

Backup all starred repositories from Github
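
No description or command was captured for this entry; a hedged sketch using the GitHub API, where <user> is a placeholder, jq is assumed to be installed, and only the first page of results is fetched:

$ curl -s "https://api.github.com/users/<user>/starred?per_page=100" | jq -r '.[].clone_url' | xargs -n1 git clone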

List the biggest accessible files/dirs in current directory, sorted
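
The command itself wasn't captured here; a sketch of one common approach, where 2>/dev/null skips entries you can't access:

$ du -sh * 2>/dev/null | sort -h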


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
