The easiest way to obtain a list of the busiest websites, sorted by the number of processes running.
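The command itself isn't shown, but the idea can be sketched as a per-user process tally (on shared hosting each site typically runs under its own user). The site names below are made up; on a real server the input would come from something like `ps -eo user=`.

```shell
# Hedged sketch: tally processes per site user, busiest first. The printf
# stands in for real "ps -eo user=" output; the site names are made up.
printf 'siteA\nsiteB\nsiteA\nsiteA\nsiteC\nsiteB\n' |
sort | uniq -c | sort -rn | head
```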
List the busiest scripts/files running on a cPanel server, with the domain shown (column $12).
Add the date and time to the output within the current directory.
The 10 most frequently used commands.
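The usual form of this one-liner tallies the first word of each history entry. Since `history` only works in an interactive shell, a saved history is simulated with printf here; on a live shell you would pipe `history` through `awk '{print $2}'` instead.

```shell
# Sketch of the classic "top 10 commands" tally on simulated shell history.
printf 'ls -l\ncd /tmp\nls\ngit status\nls -a\ncd\n' |
awk '{ print $1 }' | sort | uniq -c | sort -rn | head -10
```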
If we have many files containing (?, ?, ?, ?, ? ) characters instead of ?, ?, ... etc., we can use this simple command line, which runs a sed command inside a for loop to search for files containing those characters. Hope you like it! Enjoy! ;)
This is a quick way to find what is hogging disk space when you get a full-disk alert from your monitoring system. This won't work as-is with filesystems that allow embedded spaces in user names or groups (read: Mac OS X attached to a Windows domain). In those cases, you will need to change the -k 5 to something that works in your situation.
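The original command isn't shown (it sorts some listing on field 5, hence the -k 5 caveat), but the general disk-hog hunt can be sketched with du. A throwaway directory tree is built here so the example is self-contained.

```shell
# Hedged sketch of the disk-hog hunt: biggest directories first. The mktemp
# tree and file sizes are made up for the example; on a real system you would
# run du against the full filesystem.
dir=$(mktemp -d)
mkdir -p "$dir/a" "$dir/b"
dd if=/dev/zero of="$dir/a/big"   bs=1024 count=64 2>/dev/null
dd if=/dev/zero of="$dir/b/small" bs=1024 count=8  2>/dev/null
du -sk "$dir"/* | sort -rn | head
```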
Good for when you download YouTube videos and want the MP3 for your MP3 player.
Filter entries in the openSUSE /var/log/messages that look like: timestamp servername kernel: [83242.108090] btrfs: checksum error at logical 1592344576 on dev /dev/sda5, sector 5223584, root 5, inode 2652, offset 282624, length 4096, links 1 (path: log/warn)
btrfs checksum errors console report.
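As an illustration (the exact command isn't shown), a grep/sed pair can pull the interesting fields out of such a line. The sample entry is the one from the description above; the sed regex is an assumption.

```shell
# Extract device, sector and path from a btrfs checksum error line. The
# sample line is fed in via a variable; on a real system you would grep
# /var/log/messages instead.
line='Nov  1 12:00:00 host kernel: [83242.108090] btrfs: checksum error at logical 1592344576 on dev /dev/sda5, sector 5223584, root 5, inode 2652, offset 282624, length 4096, links 1 (path: log/warn)'
result=$(printf '%s\n' "$line" |
    grep 'btrfs: checksum error' |
    sed -E 's/.*on dev ([^,]+), sector ([0-9]+).*\(path: ([^)]+)\).*/dev=\1 sector=\2 path=\3/')
printf '%s\n' "$result"
```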
To get the tickets as a comma-separated list, pipe the above into:
sort | uniq | perl -pe 's/\n/, /' | sed 's/, $//'
Original command:
cat "log" | grep "text to grep" | awk '{print $1}' | sort -n | uniq -c | sort -rn | head -n 100
This is a waste of multiple cats and greps, especially when awk is already being used. Accepts multiple files via logs... Substitute "text to grep" for your search string.
If you want to wrap this for reuse, note that bash aliases don't expand arguments (and ${@[@]:1} is zsh array syntax), so a shell function is more portable:
parse-logs() { pattern=$1; shift; awk "/$pattern/{print \$1}" "$@" | sort -n | uniq -c | sort -rn | head -n 100; }
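The single-awk form of the pipeline can be tried on simulated log lines: print field 1 of every line matching the pattern, then rank by frequency. The IPs and paths below are made up.

```shell
# One awk replaces cat|grep|awk: filter and extract in one pass, then tally.
printf '10.0.0.1 GET /a\n10.0.0.2 GET /b\n10.0.0.1 GET /a\n10.0.0.1 POST /c\n' |
awk '/GET/{ print $1 }' | sort -n | uniq -c | sort -rn | head -n 100
```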
Find the failed lines, reverse each line (because I only see 3 fields after the IP address, i.e. port, port#, ssh2 in my file), cut to the 4th field (yes, you could awk '{print $4}'), reverse back to normal, and then sort -u (for uniq, or sort | uniq).
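The rev|cut|rev trick picks the 4th field counted from the end of the line, which is the IP. Sample auth-log lines are simulated here; the IPs are invented.

```shell
# Pull the offending IPs out of "Failed password" lines: reverse each line,
# take field 4 (now counted from the end), reverse back, dedupe.
printf 'Failed password for root from 192.168.1.10 port 22 ssh2\nFailed password for admin from 10.0.0.5 port 22 ssh2\nFailed password for root from 192.168.1.10 port 23 ssh2\n' |
rev | cut -d' ' -f4 | rev | sort -u
```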
This command draws a small graph with a histogram of size blocks (5 MB in this example), not of individual files. Fine-tune the 4+5*int($1/5) block for your own size jumps: jump-1+jump*int($1/jump). Also tune the hist=hist-5 part for bigger or smaller graphs.
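A minimal sketch of the bucketing idea, with a 5 MB bucket width. The sizes in MB are simulated with printf; on a real system the input might come from `du -sm *`. The bucket labels and "#" bars are assumptions, not the original's exact output.

```shell
# Bucket sizes into w-MB bins and draw one "#" per file in each bin.
hist=$(printf '3 fileA\n7 fileB\n8 fileC\n12 fileD\n' |
awk -v w=5 '
    { b = w * int($1 / w); c[b]++; if (b > max) max = b }
    END {
        for (s = 0; s <= max; s += w) {
            printf "%4d-%d MB: ", s, s + w - 1
            bar = ""
            for (i = 0; i < c[s]; i++) bar = bar "#"
            print bar
        }
    }')
printf '%s\n' "$hist"
```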
If you want to see your top ten CPU-using processes from the browser (e.g. you don't want to ssh into your server every time just to check system load), you can run this command and browse to the machine's IP on port 8888, for example 192.168.0.100:8888.
Applies 'docker rm' to all container IDs that appear in 'docker ps -a' but not in 'docker ps', i.e. the ones that are not running.
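The set difference behind the cleanup can be sketched with comm, which keeps lines unique to the second sorted list. Fake container IDs stand in for real docker output so the example is self-contained; the real commands are shown as comments and assume Docker is installed.

```shell
# 'comm -13' prints lines only in the second file: all containers minus the
# running ones. The IDs are made up; real input would come from docker ps.
all_ids=$(mktemp); running=$(mktemp)
printf 'aaa111\nbbb222\nccc333\n' > "$all_ids"   # stand-in for: docker ps -aq | sort
printf 'bbb222\n'                > "$running"    # stand-in for: docker ps -q  | sort
comm -13 "$running" "$all_ids"
# Real cleanup (assumption: docker installed):
#   docker rm $(comm -13 <(docker ps -q | sort) <(docker ps -aq | sort))
```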
This command can be added to the crontab to run a nightly backup of directories and keep only the 10 most recent backup files.
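The rotation half can be sketched as: list backups newest-first, then delete everything from the 11th entry onward. The dated files are created empty here so the example is self-contained; the crontab line in the comment is an assumption, not the original command.

```shell
# Hedged sketch: keep only the 10 newest backup files. A real crontab entry
# might look like (assumption):
#   0 2 * * * tar czf /backup/backup-$(date +\%F).tar.gz /etc /home
backup_dir=$(mktemp -d)
for day in $(seq -w 1 13); do
    touch "$backup_dir/backup-2024-01-$day.tar.gz"   # simulate 13 nightly runs
done
ls -1t "$backup_dir" | tail -n +11 | while read -r f; do
    rm -- "$backup_dir/$f"
done
ls "$backup_dir" | wc -l    # 10 files remain
```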
I occasionally need to see whether a machine is hitting the ulimit for threads, and which process is responsible. This gives me the per-process thread counts, sorted low to high so the worst offender is at the end, followed by the total number of threads for convenience.
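On Linux, ps can report per-process thread counts via the nlwp column (a procps assumption). The summing step is shown on simulated counts so the example is self-contained; the live commands are in the comments.

```shell
# Sum per-process thread counts into a grand total. The printf simulates
# "ps -eo nlwp=" output with made-up numbers.
printf ' 2\n 3\n 7\n' | awk '{ total += $1 } END { print "total threads:", total }'
# On a real machine (assumption: Linux procps):
#   ps -eo nlwp,pid,comm --sort nlwp | tail    # worst offender last
#   ps -eo nlwp= | awk '{ t += $1 } END { print t }'
```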
Replace "Oct 2" in the first grep pattern with the date you want to view branch work from.