All commands (14,187)


  • 2
    curl -s http://standards.ieee.org/regauth/oui/oui.txt | grep $1
    prayer · 2010-04-29 23:52:01 27
  • Another way of counting the line output of tail over 10 seconds, not requiring pv. The cut trims the count to give a rough per-second average: tail -n0 -f access.log>/tmp/tmp.log & sleep 10; kill $! ; wc -l /tmp/tmp.log | cut -c-2 You can also enclose it in a loop and send stderr to /dev/null: while true; do tail -n0 -f access.log>/tmp/tmp.log & sleep 2; kill $! ; wc -l /tmp/tmp.log | cut -c-2; done 2>/dev/null A variant that computes the average arithmetically is sketched after this entry.


    1
    tail -n0 -f access.log>/tmp/tmp.log & sleep 10; kill $! ; wc -l /tmp/tmp.log
    dooblem · 2010-04-29 21:23:46 16
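
    A minimal sketch of the same idea that divides the line count by the sampling interval instead of chopping digits (assumes bash; /tmp/tmp.log is scratch space as above):
    tail -n0 -f access.log > /tmp/tmp.log & sleep 10; kill $!
    echo "$(( $(wc -l < /tmp/tmp.log) / 10 )) lines/sec on average"
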
  • Displays the realtime line output rate of a logfile. -l tells pv to count lines; -i10 makes it refresh every 10 seconds. The -l option is not in old versions of pv, so if the remote system has an old pv, run pv locally on the ssh output instead: ssh <remote_host> tail -f /var/log/apache2/access.log | pv -l -i10 -r >/dev/null


    10
    tail -f access.log | pv -l -i10 -r >/dev/null
    dooblem · 2010-04-29 21:02:01 7
  • or "Execute a command with a timeout" Run a command in background, sleep 10 seconds, kill it. ! is the process id of the most recently executed background command. You can test it with: find /& sleep10; kill $!


    6
    very_long_command& sleep 10; kill $!
    dooblem · 2010-04-29 20:43:13 4
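
    A minimal sketch wrapping the same pattern in a bash function (the function name and argument handling are an assumption, not part of the original command):
    run_with_timeout() {
        local seconds=$1; shift
        "$@" &                     # start the command in the background
        local pid=$!
        sleep "$seconds"
        kill "$pid" 2>/dev/null    # silence the error if it already exited
    }
    run_with_timeout 10 find /
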
  • kaffeine could be replaced by any player able to read an MMS stream.


    1
    kaffeine $(wget -qO- "http://questions-pour-un-champion.france3.fr/emission/index-fr.php?page=video&type_video=quotidiennes&video_courante=$(date +%Y%m%d)" | grep -o "mms.*wmv" | uniq)
    fbone · 2010-04-29 17:59:06 3

  • 3
    setterm -blength 0
    bandie91 · 2010-04-29 17:52:38 17
  • You need to have the RC ISO pre-downloaded before running the command.


    4
    mv ubuntu-10.04-rc-desktop-amd64.iso ubuntu-10.04-desktop-amd64.iso; i=http://releases.ubuntu.com/10.04/ubuntu-10.04-desktop-amd64.iso.zsync; while true; do if wget $i; then zsync $i; date; break; else sleep 30; fi; done
    stinkerweed999 · 2010-04-29 15:49:43 3
  • Tested with the 9.10 release. Choose whatever torrent client you prefer.


    1
    while true; do if wget http://releases.ubuntu.com/10.04/ubuntu-10.04-desktop-i386.iso.torrent; then ktorrent --silent ubuntu-10.04-desktop-i386.iso.torrent ; date; break; else sleep 5m; fi; done
    ppaschka · 2010-04-29 13:22:54 7

  • 2
    curl -s http://www.macvendorlookup.com/getoui.php?mac=$1 | sed -e 's/<[^>]\+>//g'; echo
    bandie91 · 2010-04-29 13:17:30 6
  • The cut should match the relevant timestamp part of the logfile; uniq -c then counts the number of occurrences within each time interval. A sketch with an assumed timestamp format follows this entry.


    5
    grep <something> logfile | cut -c2-18 | uniq -c
    buzzy · 2010-04-29 11:26:09 5
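
    A minimal sketch under an assumed log format: if lines start with a bracketed timestamp in Apache error_log style, e.g. "[Thu Apr 29 11:26:09 2010] ...", then characters 2-18 cover the date and time down to the minute, so uniq -c produces one-minute buckets (the grep pattern is only an example filter):
    grep 'File does not exist' error_log | cut -c2-18 | uniq -c
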
  • Change the cut range for hits per 10 seconds, per minute and so on. grep can be added to the pipeline to filter on URL or source IP.


    4
    tail -f access_log | cut -c2-21 | uniq -c
    buzzy · 2010-04-29 11:16:54 6
  • awk is evil!


    22
    ps hax -o user | sort | uniq -c
    buzzy · 2010-04-29 10:43:03 13
  • There's another version on here that uses GET, but some people don't have lwp-request, so here's an alternative. It's also a little shorter and should work with most YouTube URLs since it truncates the URL at the & separator.


    2
    url="[Youtube URL]"; echo $(curl ${url%&*} 2>&1 | grep -iA2 '<title>' | grep '-') | sed 's/^- //'
    rkulla · 2010-04-29 02:03:36 4
  • This is a very simple and lightweight way to play DI.FM stations. The command parses the HTML returned from http://di.fm and displays all radio stations in a nice graphical menu; after a radio is chosen, its URL is passed to mplayer so the music can start. For a more complete version with proper strings in the menu (it couldn't fit in the command field above), try: zenity --list --width 500 --height 500 --title 'DI.FM' --text 'Pick a Radio' --column 'radio' --column 'url' --print-column 2 $(curl -s http://www.di.fm/ | awk -F '"' '/href="http:.*\.pls.*96k/ {print $2}' | sort | awk -F '/|\.' '{print $(NF-1) " " $0}') | xargs mplayer Dependencies: X11 with a GTK environment; zenity, a simple app for displaying GTK menus (sudo apt-get install zenity on Ubuntu); mplayer, a simple audio player (sudo apt-get install mplayer on Ubuntu).


    16
    zenity --list --width 500 --height 500 --column 'radio' --column 'url' --print-column 2 $(curl -s http://www.di.fm/ | awk -F '"' '/href="http:.*\.pls.*96k/ {print $2}' | sort | awk -F '/|\.' '{print $(NF-1) " " $0}') | xargs mplayer
    polaco · 2010-04-28 23:45:35 14
  • Enumerates the number of processes for each user. ps BSD format is used here; for standard Unix format use: ps -eLf | awk '{$1} {++P[$1]} END {for(a in P) if (a !="UID") print a,P[a]}'


    6
    ps aux |awk '{$1} {++P[$1]} END {for(a in P) if (a !="USER") print a,P[a]}'
    benyounes · 2010-04-28 15:25:18 3
  • Useful command for getting thread information from a running Java process: kill -3 sends SIGQUIT, which makes the JVM dump all thread stacks without terminating the process. To see the dump, look in the default log (typically standard output) of your Java application. A usage sketch follows this entry.


    0
    kill -3 PID
    mrbyte · 2010-04-28 08:22:42 3
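
    A minimal usage sketch (the application name here is hypothetical; pgrep -f matches against the full command line):
    kill -3 $(pgrep -f MyApplication.jar)    # dump thread stacks of the matching JVM process
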

  • 1
    pmap $(pgrep [ProcessName] -n) | gawk '/total/ { a=strtonum($2); b=int(a/1024); printf b};'
    lv4tech · 2010-04-28 08:16:28 3
  • Gives you a nice quick summary of how many lines each of your files is comprised of. (In this example, we just check .c, .h, .php and .pl files.) Since we just use wc -l to count, you'll only get a very rough estimate of how many lines of actual code there are; use a more sophisticated tool if you need better numbers. A variant that also prints a grand total is sketched after this entry.


    2
    find . \( -iname '*.[ch]' -o -iname '*.php' -o -iname '*.pl' \) -exec wc -l {} \; | sort
    rkulla · 2010-04-28 07:18:21 4
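
    A minimal sketch of a variant (assuming GNU find and xargs) that handles filenames with spaces and hands all files to wc at once, so wc also prints a "total" line, sorted numerically:
    find . \( -iname '*.[ch]' -o -iname '*.php' -o -iname '*.pl' \) -print0 | xargs -0 wc -l | sort -n
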
  • The above command will set the setgid bit on all directories named .svn in the current directory, recursively. With the setgid bit set, files created inside a .svn folder inherit that folder's group ownership, no matter which user creates them. This is useful for me as the subversion working directory on my server is also the live website and needs to be auto-committed to subversion every so often via cron, as well as worked on by multiple users. Setting the setgid bit on the .svn folders makes sure the .svn metadata created by a slew of different users doesn't end up with a mix of group ownerships. A quick verification sketch follows this entry.


    1
    find . -type d -name .svn -exec chmod g+s "{}" \;
    mitzip · 2010-04-27 16:51:00 21
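
    A quick follow-up check, assuming GNU find (which accepts symbolic modes with -perm): list any .svn directory that is still missing the setgid bit after the command has run.
    find . -type d -name .svn ! -perm -g+s
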
  • This will search all directories and ignore the CVS ones. Then it will search all files in the resulting directories and act on them.


    2
    for dir in $(find -type d ! -name CVS); do for file in $(find $dir -maxdepth 1 -type f); do rm $file; cvs delete $file; done; done
    ubersoldat · 2010-04-27 16:03:33 9
  • The options -b binary and -m i386 are needed for disassembling raw machine code when it is not part of a full binary executable with proper headers. A sketch of preparing such a raw file from a hex string follows this entry.


    2
    objdump -b binary -m i386 -D shellcode.bin
    recursiverse · 2010-04-27 11:11:36 25
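
    A minimal sketch of producing a raw file to feed it, assuming you start from a plain hex string (the bytes below are just an illustrative x86 fragment); xxd -r -p converts the hex text back to raw bytes:
    printf '31c050c3' | xxd -r -p > shellcode.bin    # example bytes: xor eax,eax; push eax; ret
    objdump -b binary -m i386 -D shellcode.bin
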

  • 2
    find ~ -maxdepth 2 -name .git -print | while read repo; do cd $(dirname $repo); git pull; done
    l0b0 · 2010-04-27 08:51:52 5
  • This is "N5", sort of like ROT13 but for digits only. Encrypt: echo "$1" | xxd -p | tr '0-9' '5-90-6' Decrypt: echo "$1" | tr '0-9' '5-90-6' | xxd -r -p A round-trip sketch follows this entry.


    2
    echo "$1" | xxd -p | tr '0-9' '5-90-6'; echo "$1" | tr '0-9' '5-90-6' | xxd -r -p
    IsraelTorres · 2010-04-27 03:08:47 5
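
    A minimal sketch wrapping the two halves in bash functions so the round trip is explicit (the function names are an assumption):
    n5_encrypt() { printf '%s\n' "$1" | xxd -p | tr '0-9' '5-90-6'; }
    n5_decrypt() { printf '%s\n' "$1" | tr '0-9' '5-90-6' | xxd -r -p; }
    secret=$(n5_encrypt "hello")
    n5_decrypt "$secret"    # prints "hello" again
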
  • Opens file foo with the cursor on line X. Useful for long files where scrolling takes a while. If X is greater than the number of lines in foo, the cursor goes to the last existing line.


    6
    nano +X foo
    spiffwalker · 2010-04-27 01:57:58 10
  • A possible simplification of the egrep-awk-sort approach, using find, with xargs in place of -exec. A whitespace-safe variant is sketched after this entry.


    2
    find . -maxdepth 1 -type f | xargs stat
    asolkar · 2010-04-26 20:51:54 5
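
    A minimal sketch of a whitespace-safe variant (assuming GNU find and xargs, which support null-delimited output):
    find . -maxdepth 1 -type f -print0 | xargs -0 stat
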
