All commands (14,187)


  • 5
    mencoder "mf://*.jpg" -mf fps=8 -o ./video.avi -ovc lavc
    valessiobrito · 2010-04-22 02:26:13 3
  • Request all information about my IP address in json format


    1
    curl ifconfig.me/all/json
    truemilk · 2010-04-21 20:47:17 5
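    The JSON is easy to pretty-print or pick fields out of; a small sketch using Python's bundled json.tool (assumes a Python install is available):
      curl -s ifconfig.me/all/json | python -m json.tool
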
  • Request all information about my IP address in xml format


    -1
    curl ifconfig.me/all/xml
    truemilk · 2010-04-21 20:45:17 3
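    Likewise, the XML can be pretty-printed with xmllint from libxml2, if it is installed (a sketch, not part of the original post):
      curl -s ifconfig.me/all/xml | xmllint --format -
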
  • I like to label my GRUB boot options with the correct kernel version/build. After building and installing a new kernel with "make install" I had to edit my grub.conf by hand. To avoid this, I wrote this little command line to: 1) read the version/build part of the filename that the kernel symlinks point to, and 2) replace the first label lines of grub.conf. The grub.conf label lines must be in this format: Latest [{name}-{version/build}] and Old [{name}-{version/build}]; only the {version/build} part is substituted. For instance, title Latest [GNU/Linux-2.6.31-gentoo-r10.201003] would turn into title Latest [GNU/Linux-2.6.32-gentoo-r7.201004].


    1
    LATEST=`readlink /boot/vmlinuz`; OLD=`readlink /boot/vmlinuz.old`; cat /boot/grub/grub.conf | sed -i -e 's/\(Latest \[[^-]*\).*\]/\1-'"${LATEST#*-}"]'/1' -e 's/\(Old \[[^-]*\).*\]/\1-'"${OLD#*-}"]'/1' /boot/grub/grub.conf
    algol · 2010-04-21 19:16:51 6
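    To preview what will be substituted before touching grub.conf, it may help to check what the symlinks resolve to and what the ${VAR#*-} expansion strips (a sketch; the symlink names depend on how your kernel was installed):
      LATEST=`readlink /boot/vmlinuz`    # e.g. vmlinuz-2.6.32-gentoo-r7.201004
      echo "${LATEST#*-}"                # removes everything up to the first '-': 2.6.32-gentoo-r7.201004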

  • 4
    dig +short NS org.
    dpoblador · 2010-04-21 15:10:47 3
  • curl ifconfig.me/ip -> IP address; curl ifconfig.me/host -> remote host; curl ifconfig.me/ua -> user agent; curl ifconfig.me/port -> port. Thanks to http://ifconfig.me/


    276
    curl ifconfig.me
    aajjk · 2010-04-21 13:10:33 81
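    The plain-text endpoints drop easily into scripts; a minimal sketch (assumes ifconfig.me is reachable):
      MYIP=$(curl -s ifconfig.me/ip)
      echo "external IP: $MYIP"
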
  • Find if $b is in $a in bash


    2
    if [ "x${a/$b/}" != "x$a" ]; then echo "'$b' is in '$a'"; fi
    raphink · 2010-04-21 12:37:26 4
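    A quick illustration with hypothetical values: removing $b from $a changes the string only if $b occurs in it.
      a='commandlinefu'; b='line'
      if [ "x${a/$b/}" != "x$a" ]; then echo "'$b' is in '$a'"; fi    # prints: 'line' is in 'commandlinefu'
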
  • Find if $b is in $a in bash


    2
    if grep -q "$b" <<<$a; then echo "'$b' was found in '$a'"; fi
    raphink · 2010-04-21 12:24:24 12
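    Note that grep treats $b as a regular expression; if $b should be matched literally, a hedged variant is to add -F:
      if grep -qF "$b" <<<"$a"; then echo "'$b' was found in '$a'"; fi
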
  • If you have used bash for any scripting, you've used the date command a lot. It's perfect for creating filenames dynamically within aliases, functions, and commands like those below. This is actually an update to my first alias, since a few commenters (below) had good observations on what was wrong with my first command.
    # creating a date-based ssh-key for askapache.github.com
    ssh-keygen -f ~/.ssh/`date +git-$USER@$HOSTNAME-%m-%d-%g` -C 'webmaster@askapache.com'
    # /home/gpl/.ssh/git-gplnet@askapache.github.com-04-22-10
    # create a tar+gzip backup of the current directory
    tar -czf $(date +$HOME/.backups/%m-%d-%g-%R-`sed -u 's/\//#/g' <<< $PWD`.tgz) .
    # tar -czf /home/gpl/.backups/04-22-10-01:13-#home#gpl#.rr#src.tgz .
    I personally find myself having to reference date --help quite a bit as a result, so this nice alias saves me a lot of time. This is one bdash mofo. It works in sh and bash (POSIX), but will likely need to be changed for other shells due to the parameter substitution going on. Just extend the sed command; I prefer sed to pretty much everything anyway, but it's always preferable to put in the extra effort to use as many builtins as you can. Otherwise it's not a top one-liner, it's a lazyboy recliner. Here's the old version:
    alias dateh='date --help|sed "/^ *%%/,/^ *%Z/!d;s/ \+/ /g"|while read l;do date "+ %${l/% */}_${l/% */}_${l#* }";done|column -s_ -t'
    This trick is from my bash_profile at http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html


    21
    alias dateh='date --help|sed -n "/^ *%%/,/^ *%Z/p"|while read l;do F=${l/% */}; date +%$F:"|'"'"'${F//%n/ }'"'"'|${l#* }";done|sed "s/\ *|\ */|/g" |column -s "|" -t'
    AskApache · 2010-04-21 01:22:18 16
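    For reference, a couple of the format codes the alias tabulates, used directly in a filename (a sketch; the backup path is hypothetical):
      date +%F                                          # 2010-04-21
      tar -czf "$HOME/backup-$(date +%F-%H%M).tgz" .    # date-stamped archive name
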
  • SH

    cat mod_log_config.c | shmore
    or
    shmore < mod_log_config.c
    Most pagers like less, more, most, and others require additional processes to be loaded and additional CPU time, and if that wasn't bad enough, most of them modify the output in ways that can be undesirable. What I wanted was a "more" pager that was basically the same as running cat file, without modifying the output and without additional processes being created, CPU used, etc. Normally, to scroll the output of cat file without modifying it, I would have to scroll back my terminal or screen buffer, because less modifies the output.
    After looking over many examples ranging from builtin cat functions created for csh, zsh, ksh, sh, and bash from the 80s, 90s, and more recent examples shipped with bash 4, and after much trial and error, I finally came up with something that satisfied my objective. It automatically adjusts to the size of your terminal window by using the LINES variable (or 80 lines if that is empty). This is a great function that will work as long as your shell works, so it will work just fine if you are booted in single-user mode and your /usr/bin directory is missing (where less and other pagers can be). Using builtins like this is fantastic and is comparable to how busybox works; as long as your shell works, this will work.
    One caveat/note: I always have access to a color terminal, and I always set up both the termcap and the terminfo packages for color terminals (and/or ncurses and slang), so for that reason I stuck the tput setab 4; tput setaf 7 command at the beginning of the function, so it only runs once, and that causes the -- SHMore -- prompt to have a blue background and bright white text.
    This is one of hundreds of functions in my .bash_profile at http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html (AskApache.com), but it actually won't be included till the next update. If you can improve this in any way at all please let me know, I would be very grateful! (One thing I want is to be able to continue to the next screen by pressing any key instead of having to press enter.)


    6
    shmore(){ local l L M="`echo;tput setab 4&&tput setaf 7` --- SHMore --- `tput sgr0`";L=2;while read l;do echo "${l}";((L++));[[ "$L" == "${LINES:-80}" ]]&&{ L=2;read -p"$M" -u1;echo;};done;}
    AskApache · 2010-04-21 00:40:37 30
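    On the 'press any key' wish at the end: one possible (untested) tweak is to have the inner read grab a single keystroke instead of a full line, e.g. replace read -p"$M" -u1 with something like:
      read -n1 -s -p"$M" -u1    # -n1: one key, -s: don't echo it
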
  • Copy the files listed in xx.m3u to the target folder


    1
    more xx.m3u |grep -v "^#" |xargs -i cp {} target
    lishuai860113 · 2010-04-20 23:49:16 3
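    The more is not needed here, since grep can read the playlist directly; an equivalent sketch:
      grep -v "^#" xx.m3u | xargs -i cp {} target
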
  • Taking a file with IP ranges, each on its own line, like:
    cat ipranges.txt
    213.87.86.160-213.87.86.193
    213.87.87.0-213.87.88.255
    91.135.210.0-91.135.210.255
    the command returns deaggregated IP ranges using ipcalc's deaggregate feature, like this:
    213.87.86.160/27
    213.87.86.192/31
    213.87.87.0/24
    213.87.88.0/24
    91.135.210.0/24
    Useful for configuring the nginx geo module.


    7
    /bin/grep - ipranges.txt | while read line; do ipcalc $line ; done | grep -v deag
    tf8 · 2010-04-20 21:13:00 4
  • Sometimes a program refuses to read a file and you're not sure why. You may have display_errors turned off for PHP, or something similar. In this example, fopen('/var/www/test/foo.txt') was called but doesn't have read access to foo.txt. strace can tell you what went wrong: e.g., if PHP doesn't have read access to the file, strace will say "EACCES (Permission denied)"; or, if the file path you gave doesn't exist, strace will say "ENOENT (No such file or directory)", etc. This works for any program you can run from the command line, e.g., strace python myapp.py -e open,access... Note: the above command uses php-cli, not mod_php, which is a different SAPI with different configs, etc.


    7
    strace php tias.php -e open,access 2>&1 | grep foo.txt
    rkulla · 2010-04-20 19:42:42 6
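    The same idea works for any interpreter; a hypothetical Python run (myapp.py and foo.txt are placeholders). Putting -e before the program name ensures strace itself, rather than the traced program, receives the filter:
      strace -e trace=open,access python myapp.py 2>&1 | grep foo.txt
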
  • not the best, uses 4 pipes!


    1
    tr -d "\n\r" | grep -ioEm1 "<title[^>]*>[^<]*</title" | cut -f2 -d\> | cut -f1 -d\<
    bandie91 · 2010-04-20 18:55:24 3
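    The pipeline expects the HTML on stdin; a hypothetical invocation against a live page (the URL is a placeholder):
      curl -s http://example.com/ | tr -d "\n\r" | grep -ioEm1 "<title[^>]*>[^<]*</title" | cut -f2 -d\> | cut -f1 -d\<
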
  • previous version leaves lots of blank lines


    1
    awk 'BEGIN{IGNORECASE=1;FS="<title>|</title>";RS=EOF} {print $2}' | sed '/^$/d' > file.html
    tamouse · 2010-04-20 13:27:47 4
  • This is a 'nocd' alternative :)


    2
    ssh user@<source_host> -- tar cz <path> | ssh user@<destination_host> -- tar vxzC <path>
    dranan · 2010-04-20 12:30:49 3
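    A concrete (hypothetical) run, streaming /var/www from hostA into /backup on hostB via the local machine (this relies on tar defaulting to stdout/stdin when no -f is given; pass f - explicitly if your tar does not):
      ssh user@hostA -- tar cz /var/www | ssh user@hostB -- tar vxzC /backup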

  • 2
    mkdir $(date +%F)
    pusakat · 2010-04-20 12:20:23 5
  • Case insensitive! And works even if the "<title>...</title>" spans multiple lines. Simple! :-)


    4
    awk 'BEGIN{IGNORECASE=1;FS="<title>|</title>";RS=EOF} {print $2}' file.html
    sata · 2010-04-20 10:54:03 5
  • Very handy if you have made a package-selection mistake in aptitude. Note that it's better to do a Ctrl+U (undo) in aptitude if possible, because keep-all will clear some package states (like the 'hold' state).


    1
    aptitude keep-all
    dooblem · 2010-04-20 09:24:20 7
  • You can do some boolean logic like "if A or B then C else D" using or (||) and and (&&), so you can do:
    # false || false && echo true || echo false
    false
    # true || false && echo true || echo false
    true
    # false || true && echo true || echo false
    true
    # true || true && echo true || echo false
    true
    and so on... I use it like:
    (ssh example.com 'test something') || $(ssh example.net 'test something') && echo ok || echo ko


    0
    true || false && echo true || echo false
    Sizeof · 2010-04-20 09:17:08 4
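    One caveat worth keeping in mind (my note, not the original poster's): cond && A || B is not a strict if/else, because B also runs when A itself fails:
      true && false || echo else    # prints "else" even though the condition was true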

  • 59
    strace -ff -e trace=write -e write=1,2 -p SOME_PID
    oernii2 · 2010-04-20 08:55:54 11
  • Zsync is an implementation of rsync over HTTP that allows updating of files from a remote Web server without requiring a full download. For example, if you already have a Debian alpha, beta or RC copy downloaded, zsync can just download the updated bits of the new release of the file from the server. This requires the distributor of the file to have created a zsync build control file (using zsyncmake).


    2
    zsync -i existing-file-on-disk.iso http://example.com/new-release.iso.zsync
    rkulla · 2010-04-20 07:02:37 8
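    On the publishing side, the control file is generated with zsyncmake; a minimal sketch (the filename is a placeholder):
      zsyncmake new-release.iso    # writes new-release.iso.zsync in the current directory
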
  • Run this command as root to get enough stats. It works on AMD and Intel machines, including desktops. If run on a laptop, it'll give you suggestions on extending your battery life. You'll need to install PowerTOP if you don't have it, via 'apt-get install powertop', etc. To grep the output use: sudo powertop -d | grep ... The many command suggestions PowerTOP gives you alone will increase your command-line fu!


    3
    sudo powertop
    rkulla · 2010-04-19 21:59:29 10
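    Per the note above, the dump mode is handy for grepping; for example (the pattern is just an illustration):
      sudo powertop -d | grep -i usb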

  • 1
    X='pattern'; vim +/"$X" `egrep -lr "$X" *`
    putnamhill · 2010-04-19 20:36:53 6
  • just a bit simpler


    4
    echo $ascii | perl -ne 'printf "%x", ord for split //'
    linuxrawkstar · 2010-04-19 11:57:08 4
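    A quick hypothetical run; echo -n keeps the trailing newline's 0a out of the hex:
      ascii=fu; echo -n "$ascii" | perl -ne 'printf "%x", ord for split //'    # 6675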