All commands (14,187)


  • 2
    curl -s http://standards.ieee.org/regauth/oui/oui.txt | grep $1
    prayer · 2010-04-29 23:52:01 27
  • Another way of counting the line output of tail over 10 seconds, without requiring pv. Add a cut to get a rough per-second rate: tail -n0 -f access.log>/tmp/tmp.log & sleep 10; kill $! ; wc -l /tmp/tmp.log | cut -c-2 You can also enclose it in a loop and send stderr to /dev/null: while true; do tail -n0 -f access.log>/tmp/tmp.log & sleep 2; kill $! ; wc -l /tmp/tmp.log | cut -c-2; done 2>/dev/null (a sketch for the exact per-second average follows below).


    1
    tail -n0 -f access.log>/tmp/tmp.log & sleep 10; kill $! ; wc -l /tmp/tmp.log
    dooblem · 2010-04-29 21:23:46 16
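    For the exact per-second average rather than the rough cut, a sketch that divides the line count by the sampling interval (10 seconds here):
    tail -n0 -f access.log >/tmp/tmp.log & sleep 10; kill $!; awk 'END {print NR/10 " lines/sec"}' /tmp/tmp.log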
  • Displays the realtime line output rate of a logfile. -l tells pv to count lines, -i to refresh every 10 seconds. The -l option is not in old versions of pv; if the remote system has an old pv version, pipe tail over ssh and run pv locally (replace <host> with the remote machine): ssh <host> tail -f /var/log/apache2/access.log | pv -l -i10 -r >/dev/null


    10
    tail -f access.log | pv -l -i10 -r >/dev/null
    dooblem · 2010-04-29 21:02:01 7
  • or "Execute a command with a timeout" Run a command in background, sleep 10 seconds, kill it. ! is the process id of the most recently executed background command. You can test it with: find /& sleep10; kill $!


    6
    very_long_command& sleep 10; kill $!
    dooblem · 2010-04-29 20:43:13 4
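    On systems with GNU coreutils, the timeout utility does the same job and returns as soon as the command exits (a sketch, assuming coreutils timeout is installed):
    timeout 10 very_long_command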
  • kaffeine could be replaced by any player able to read an MMS stream (for example mplayer, as sketched below).


    1
    kaffeine $(wget -qO- "http://questions-pour-un-champion.france3.fr/emission/index-fr.php?page=video&type_video=quotidiennes&video_courante=$(date +%Y%m%d)" | grep -o "mms.*wmv" | uniq)
    fbone · 2010-04-29 17:59:06 3
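    For example, with mplayer instead of kaffeine (a sketch; any MMS-capable player should work):
    mplayer $(wget -qO- "http://questions-pour-un-champion.france3.fr/emission/index-fr.php?page=video&type_video=quotidiennes&video_courante=$(date +%Y%m%d)" | grep -o "mms.*wmv" | uniq)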

  • 3
    setterm -blength 0
    bandie91 · 2010-04-29 17:52:38 17
  • You need to have the RC ISO pre-downloaded before running the command.


    4
    mv ubuntu-10.04-rc-desktop-amd64.iso ubuntu-10.04-desktop-amd64.iso; i=http://releases.ubuntu.com/10.04/ubuntu-10.04-desktop-amd64.iso.zsync; while true; do if wget $i; then zsync $i; date; break; else sleep 30; fi; done
    stinkerweed999 · 2010-04-29 15:49:43 3
  • Tested with the 9.10 release. Choose whatever torrent client you prefer.


    1
    while true; do if wget http://releases.ubuntu.com/10.04/ubuntu-10.04-desktop-i386.iso.torrent; then ktorrent --silent ubuntu-10.04-desktop-i386.iso.torrent ; date; break; else sleep 5m; fi; done
    ppaschka · 2010-04-29 13:22:54 7

  • 2
    curl -s http://www.macvendorlookup.com/getoui.php?mac=$1 | sed -e 's/<[^>]\+>//g'; echo
    bandie91 · 2010-04-29 13:17:30 6
  • The cut should match the relevant timestamp part of the logfile; uniq -c then counts the number of occurrences within that time interval.


    5
    grep <something> logfile | cut -c2-18 | uniq -c
    buzzy · 2010-04-29 11:26:09 5
  • Change the cut range for hits per 10 seconds, per minute, and so on. grep can be used to filter on a URL or source IP. A worked example of the character ranges follows this entry.


    4
    tail -f access_log | cut -c2-21 | uniq -c
    buzzy · 2010-04-29 11:16:54 6
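    A worked example of how the character ranges map to time buckets, assuming the log lines begin with a bracketed timestamp such as [29/Apr/2010:11:16:54 +0200] (with other log formats, adjust the offsets):
    cut -c2-18   ->  29/Apr/2010:11:16     (hits per minute)
    cut -c2-20   ->  29/Apr/2010:11:16:5   (hits per 10 seconds)
    cut -c2-21   ->  29/Apr/2010:11:16:54  (hits per second)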
  • awk is evil!


    22
    ps hax -o user | sort | uniq -c
    buzzy · 2010-04-29 10:43:03 13
  • There's another version on here that uses GET, but some people don't have lwp-request, so here's an alternative. It's also a little shorter and should work with most YouTube URLs since it truncates the query string at the '&' (see the note on the parameter expansion below).


    2
    url="[Youtube URL]"; echo $(curl ${url%&*} 2>&1 | grep -iA2 '<title>' | grep '-') | sed 's/^- //'
    rkulla · 2010-04-29 02:03:36 4
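    A note on the parameter expansion used above: ${url%&*} removes the shortest suffix matching '&*', i.e. it cuts at the last '&', while ${url%%&*} removes the longest such suffix and cuts at the first '&'. With a single '&' in the URL the two behave identically. A quick sketch with a made-up URL:
    url='http://www.youtube.com/watch?v=ID&feature=related&hd=1'
    echo "${url%&*}"     # http://www.youtube.com/watch?v=ID&feature=related
    echo "${url%%&*}"    # http://www.youtube.com/watch?v=ID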
  • This is a very simple and lightweight way to play DI.FM stations. It parses the HTML returned from http://di.fm and displays all radio stations in a nice graphical menu; after a radio is chosen, its URL is passed to mplayer so the music can start. For a more complete version of the command with proper strings in the menu (it couldn't fit in the command field above), try: zenity --list --width 500 --height 500 --title 'DI.FM' --text 'Pick a Radio' --column 'radio' --column 'url' --print-column 2 $(curl -s http://www.di.fm/ | awk -F '"' '/href="http:.*\.pls.*96k/ {print $2}' | sort | awk -F '/|\.' '{print $(NF-1) " " $0}') | xargs mplayer Dependencies: X11 with a GTK environment; zenity, a simple app for displaying GTK menus (sudo apt-get install zenity on Ubuntu); mplayer, a simple audio player (sudo apt-get install mplayer on Ubuntu).


    16
    zenity --list --width 500 --height 500 --column 'radio' --column 'url' --print-column 2 $(curl -s http://www.di.fm/ | awk -F '"' '/href="http:.*\.pls.*96k/ {print $2}' | sort | awk -F '/|\.' '{print $(NF-1) " " $0}') | xargs mplayer
    polaco · 2010-04-28 23:45:35 14
  • Enumerates the number of processes for each user. ps BSD format is used here; for standard Unix format use: ps -eLf | awk '{$1} {++P[$1]} END {for(a in P) if (a !="UID") print a,P[a]}' A commented sketch of the awk logic follows this entry.


    6
    ps aux |awk '{$1} {++P[$1]} END {for(a in P) if (a !="USER") print a,P[a]}'
    benyounes · 2010-04-28 15:25:18 3
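    A commented sketch of the same awk logic (the leading {$1} block is a no-op and can be dropped):
    ps aux | awk '
        { count[$1]++ }                                # tally one per process, keyed on column 1 (the user)
        END { for (u in count) if (u != "USER") print u, count[u] }   # skip the header line
    '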
  • Useful command to get information about a running Java process and its threads: kill -3 sends SIGQUIT, which makes the JVM print a thread dump. To see the output, look in the default log (or stdout) of your Java application. A usage sketch follows this entry.


    0
    kill -3 PID
    mrbyte · 2010-04-28 08:22:42 3
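    A usage sketch (MyJavaApp is a hypothetical process name; jstack ships with the JDK and prints the same kind of thread dump to its own stdout):
    kill -3 $(pgrep -f MyJavaApp)     # thread dump goes to the JVM's stdout/log
    jstack $(pgrep -f MyJavaApp)      # or capture the dump directly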

  • 1
    pmap $(pgrep [ProcessName] -n) | gawk '/total/ { a=strtonum($2); b=int(a/1024); printf b};'
    lv4tech · 2010-04-28 08:16:28 3
  • Gives you a nice quick summary of how many lines each of your files contains. (In this example, we just check .c, .h, .php and .pl files.) Since we just use wc -l to count, you'll only get a very rough estimate of how many lines of actual code there are; use a more sophisticated tool if you need to. A sketch for a single grand total follows this entry.


    2
    find . \( -iname '*.[ch]' -o -iname '*.php' -o -iname '*.pl' \) -exec wc -l {} \; | sort
    rkulla · 2010-04-28 07:18:21 4
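    To get a single grand total instead of a per-file listing, a sketch using the same find expression (assuming GNU find and xargs for the -print0/-0 pair):
    find . \( -iname '*.[ch]' -o -iname '*.php' -o -iname '*.pl' \) -print0 | xargs -0 cat | wc -l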
  • This command sets the setgid bit on all directories named .svn under the current directory, recursively. Files created in those folders then inherit the folder's group ownership, no matter which user creates them. This is useful for me because the Subversion working directory on my server is also the live website and needs to be auto-committed to Subversion every so often via cron, as well as worked on by multiple users. Setting the setgid bit on the .svn folders makes sure we don't end up with a mix of .svn metadata owned by a slew of different users. (A sketch that also fixes the group ownership follows this entry.)


    1
    find . -type d -name .svn -exec chmod g+s "{}" \;
    mitzip · 2010-04-27 16:51:00 21
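    The setgid bit only helps if the group ownership itself is the one you want, so you may need to set the group first; a sketch, where devteam is a hypothetical group name:
    find . -type d -name .svn -exec chgrp devteam {} \; -exec chmod g+s {} \;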
  • This will search all directories, ignoring the CVS ones, then search all files in the resulting directories and act on them (a whitespace-safe variant is sketched below).


    2
    for dir in $(find -type d ! -name CVS); do for file in $(find $dir -maxdepth 1 -type f); do rm $file; cvs delete $file; done; done
    ubersoldat · 2010-04-27 16:03:33 9
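    The nested $(find ...) loops break on file names containing whitespace; a whitespace-safe sketch of the same logic (assuming GNU find and bash):
    find . -type d ! -name CVS -print0 | while IFS= read -r -d '' dir; do
        find "$dir" -maxdepth 1 -type f -print0 | while IFS= read -r -d '' file; do
            rm "$file"; cvs delete "$file"
        done
    done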
  • The options -b binary and -m (the target architecture, i386 here) are needed for disassembling raw machine code when it is not part of a full binary executable with proper headers. A quick way to produce a test input follows this entry.


    2
    objdump -b binary -m i386 -D shellcode.bin
    recursiverse · 2010-04-27 11:11:36 25
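    A quick way to produce a test input, using a tiny x86 stub written byte by byte (31 c0 = xor eax,eax; 40 = inc eax; cd 80 = int 0x80; assumes bash's printf, which understands \x escapes):
    printf '\x31\xc0\x40\xcd\x80' > shellcode.bin
    objdump -b binary -m i386 -D shellcode.bin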

  • 2
    find ~ -maxdepth 2 -name .git -print | while read repo; do cd $(dirname $repo); git pull; done
    l0b0 · 2010-04-27 08:51:52 5
  • This is N5: sort of like ROT13, but rotating only the digits of the hex dump (by 5). Encrypt: echo "$1" | xxd -p | tr '0-9' '5-90-6' Decrypt: echo "$1" | tr '0-9' '5-90-6' | xxd -r -p A worked example follows this entry.


    2
    echo "$1" | xxd -p | tr '0-9' '5-90-6'; echo "$1" | tr '0-9' '5-90-6' | xxd -r -p
    IsraelTorres · 2010-04-27 03:08:47 5
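    A quick worked example (echo appends a trailing newline, which hex-encodes as 0a):
    $ echo "hello" | xxd -p
    68656c6c6f0a
    $ echo "hello" | xxd -p | tr '0-9' '5-90-6'
    13101c1c1f5a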
  • Starts the cursor on line X of file foo. Useful for long files that take a while to scroll through. If X is greater than the number of lines in file foo, it will go to the last existing line.


    6
    nano +X foo
    spiffwalker · 2010-04-27 01:57:58 10
  • A possible simplification of an egrep/awk/sort pipeline, using find with xargs. A whitespace-safe -exec variant follows this entry.


    2
    find . -maxdepth 1 -type f | xargs stat
    asolkar · 2010-04-26 20:51:54 5
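    Note that find | xargs splits on whitespace, so file names containing spaces break; a sketch that avoids the pipe entirely (POSIX find supports -exec ... +):
    find . -maxdepth 1 -type f -exec stat {} +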



Check These Out

Run a command only when load average is below a certain threshold
Good for one-off jobs that you want to run at a quiet time. The default threshold is a load average of 0.8, but this can be changed using atrun. (The command itself is not shown here; a generic sketch follows.)
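The command is not reproduced in this listing; a generic sketch of the usual approach (batch runs queued jobs once the load average drops below atrun's threshold):
  echo './my_long_job.sh' | batch    # my_long_job.sh is a placeholder for your command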

Convert spaces in file names to underscores

Which processes are listening on a specific port (e.g. port 80)
swap out "80" for your port of interest. Can use port number or named ports e.g. "http"

list folders containing less than 2 MB of data
This command will search all subfolders of the current directory and list the names of the folders that contain less than 2 MB of data. I use it to clean up my MP3 archive. To delete the found folders, pipe the output to a text file and run: $ while read -r line; do rm -Rv "$line"; done < textfile

Burn CD/DVD from an iso, eject disc when finished.
cdrecord -scanbus will tell you the (x,y,z) value of your CD writer (for example, mine is 3,0,0). The command itself is not shown here; a sketch follows.
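The command is not reproduced in this listing; a sketch using the example device address from the note (image.iso is a placeholder file name):
  cdrecord -v -eject dev=3,0,0 image.iso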

Run a command for blocks of output of another command
The given example collects the output of the tail command: whenever a line is emitted, further lines are collected until no more output arrives for one second. This group of lines is then sent as a notification to the user. You can test the example with: $ logger "First group"; sleep 1; logger "Second"; logger "group"

ASCII art of yourself
Use libcaca to render ascii chars on the webcam input... or don't.

batch convert Nikon RAW (nef) images to JPG
Converts RAW files from a Nikon DSLR to JPG for easy viewing, etc. Requires the ufraw package.

a function to create a box of '=' characters around a given string.
First argument: string to put a box around. Second argument: character to use for the box (default is '='). Same as command #4948, but shorter, and without the utility function. (The command itself is not reproduced here; a sketch of such a function follows.)
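The original command is not reproduced in this listing; a minimal sketch of such a function, with '=' as the default box character (not necessarily how #4948 does it):
  box() { local s="$1" c="${2:-=}"; local line; line=$(printf '%*s' $((${#s} + 4)) '' | tr ' ' "$c"); printf '%s\n%s %s %s\n%s\n' "$line" "$c" "$s" "$c" "$line"; }
  box "hello world"     # prints a three-line box of '=' around the text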

Backup all mysql databases to individual files on a remote server
It grabs all the database names granted to $MYSQLUSER and gzips the dumps to a remote host via SSH.

