All commands (14,187)

  • Find all email addresses in a file, printing each match. The addresses do not have to be alone on a line; you can grab them from HTML-formatted emails, CSV files, and so on. Combine with ...|sort|uniq to filter out duplicates (a sketch of the full pipeline follows this entry).


    3
    grep -Eio '([[:alnum:]_.]+@[[:alnum:]_]+?\.[[:alpha:].]{2,6})' file.html
    wires · 2009-06-16 20:19:47 9
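    A minimal sketch of the full pipeline hinted at above, assuming duplicates should be dropped and the results saved to a hypothetical addresses.txt:
    grep -Eio '([[:alnum:]_.]+@[[:alnum:]_]+?\.[[:alpha:].]{2,6})' file.html | sort | uniq > addresses.txt
    (sort | uniq could equally be written as sort -u.)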
  • Search for the string "search" and replace it with the string "replace" in all files with the extension .php in the current folder. Also make a backup of each file with the extension ".bkp".


    4
    ruby -i.bkp -pe "gsub(/search/, 'replace')" *.php
    gustavgans · 2009-06-16 12:35:40 6
  • Good for summing numbers embedded in text - for example, a food journal entry with calories listed per food where you want the total calories. Use this to monitor and keep a running total of anything that outputs numbers.


    3
    perl -ne '$sum += $_ for grep { /\d+/ } split /[^\d\-\.]+/; print "$sum\n"'
    obscurite · 2009-06-16 06:39:08 6
  • This command will open up the two files in FileMerge on OS X. You can also compare two directories. opendiff directory1 directory2 NOTE: FileMerge is a part of the OS X Developer Tools, available on the install disc.


    -3
    opendiff <file1> <file2>
    claytron · 2009-06-16 03:22:52 7
  • Tells sort to ignore all characters before the Xth position in the first field of each line. If you have a list of items, one per line, and want to ignore the first two characters for sorting purposes, you would type "sort -k1.3" (a concrete example follows this entry). Change the "1" to change the field being sorted; the decimal value is the offset within the specified field to sort from.


    3
    sort -k1.x
    leper421 · 2009-06-16 00:04:21 7
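    A concrete example, assuming a hypothetical list.txt whose first two characters per line should be ignored when sorting:
    sort -k1.3 list.txt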
  • If your users have ever asked your script to email their reports as separate attachments instead of tar'ring them into one file, you can use this. You'll need the mailx package, of course. On Unix you'd want to add the additional "-m" parameter: (uuencode foo.txt foo.txt; uuencode /etc/passwd passwd.txt)|mailx -m -s "Hooosa!" someone@cmdfu.com


    1
    (uuencode foo.txt foo.txt; uuencode /etc/passwd passwd.txt)|mailx -s "Pandaren!" someone@cmdfu.com
    LrdShaper · 2009-06-15 11:34:51 7
  • Sends SIGINT (signal 2) to mpg321 so it skips to the next selection in the playlist.


    1
    killall -2 mpg321
    dattaway · 2009-06-15 03:04:00 6
  • Pressing ESC then * will insert into the command line the results of the autocompletion. It's hard to explain, but type echo, press ESC then *, and you will understand quickly (a short illustration follows this entry). By the way, a few reminders about ESC: holding ESC does the same thing as Tab Tab, and 'ESC .' inserts the last argument of the last command (it can be pressed repeatedly to walk back through the last arguments of earlier commands).


    67
    ESC *
    Josay · 2009-06-14 21:17:40 51
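    A rough illustration, assuming a typical /etc (the actual expansion depends on the files present). Type the partial word, then press ESC followed by *:
    $ echo /etc/pass      (press ESC then * here)
    $ echo /etc/passwd /etc/passwd-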
  • curl doesn't provide URL-encoding for 'GET' data; it has an option '--data-urlencode', but it's only for 'POST' data. That's why I needed to write this command line. With 'perl', 'php' or 'python' this is a one-liner, but I wrote it for fun. Works in Ubuntu and should work on all Linux variants (I hope it will work on Unix variants too). A note on newer curl releases follows this entry.


    -3
    (Command too long..See sample Output..)
    mohan43u · 2009-06-14 20:34:37 135
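    The author's one-liner is not reproduced here, but as a hedged aside: newer curl releases can combine -G with --data-urlencode to send URL-encoded parameters in a GET request. A sketch against a hypothetical endpoint:
    curl -G --data-urlencode "q=foo bar baz" http://example.com/search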
  • This will extract all of the URLs from a Firefox session (including URLs in a tab's history). The sessionstore.js file is in ~/.mozilla/firefox/{firefox profile}


    2
    sed -e "s/\[{/\n/g" -e "s/}, {/\n/g" sessionstore.js | grep url | awk -F"," '{ print $1 }'| sed -e "s/url:\"\([^\"]*\)\"/\1/g" -e "/^about:blank/d" > session_urls.txt
    birnam · 2009-06-14 15:08:31 10
  • Share your "now playing" Amarok song on Twitter!


    6
    curl -u <user>:<password> -d status="Amarok, now playing: $(dcop amarok default nowPlaying)" http://twitter.com/statuses/update.json
    caiosba · 2009-06-14 02:42:34 6
  • Simple way to achieve a colored SVN diff


    16
    svn diff <file> | vim -R -
    caiosba · 2009-06-13 22:00:49 69

  • 4
    mysql -uadmin -p` cat /etc/psa/.psa.shadow` -Dpsa -e"select mail_name,name,password from mail left join domains on mail.dom_id = domains.id inner join accounts where mail.account_id = accounts.id;"
    BADmd · 2009-06-13 21:19:18 4
  • Obviously, you can replace the 'man' command with any other command in this command line to do useful things. I just want to mention that there is a way to list all the commands you can execute directly without giving the full path: normally all important commands live in your PATH directories, and this command line uses that variable to find them. Works in Ubuntu and will work on any *nix system with man pages configured (an alternative sketch follows this entry).


    4
    find `echo "${PATH}" | tr ':' ' '` -type f | while read COMMAND; do man -f "${COMMAND##*/}"; done
    mohan43u · 2009-06-13 19:56:24 6
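    As a hedged alternative sketch (not the author's command): in bash the compgen builtin can also list every runnable command name, which can then be fed to man -f the same way; names without man pages will simply report nothing appropriate.
    compgen -c | sort -u | while read COMMAND; do man -f "$COMMAND"; done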
  • You can convert any UNIX man page to .txt


    15
    man ls | col -b > ~/Desktop/man_ls.txt
    vigo · 2009-06-13 11:49:33 18
  • It's also possible to delay the extraction (echo "unrar e ... fi" | at now+20 minutes), which is really convenient! The full delayed form is spelled out after this entry.


    4
    unrar e file.part1.rar; if [ $? -eq 0 ]; then rm file.part*.rar; fi
    mrttlemonde · 2009-06-13 11:11:43 7
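    Spelling out the delayed variant from the description, using the command above (the 20-minute delay is arbitrary):
    echo 'unrar e file.part1.rar; if [ $? -eq 0 ]; then rm file.part*.rar; fi' | at now+20 minutes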
  • Yes, it's useless.


    4
    a=`printf "%*s" 16`;b=${a//?/{0..1\}}; echo `eval "echo $b"`
    rhythmx · 2009-06-13 06:32:35 5
  • Chronic Bash function. Examples: chronic 3600 time (print the time in your shell every hour); chronic 60 updatedb > /dev/null (update slocate every minute). Note: use 'jobs' to list background tasks and fg/bg to take control of them.


    3
    chronic () { t=$1; shift; while true; do $@; sleep $t; done & }
    rhythmx · 2009-06-13 05:57:54 16
  • You can flexibly change the file pattern (*.tar.gz) and the uncompress command to do other jobs! For example, to remove all matching files: for i in *.tar.gz; do rm $i; done (just an example - if you really want to remove the files, simply use a wildcard: rm *.tar.gz).


    0
    for i in *.tar.gz; do tar -xzf $i; done
    kureikain · 2009-06-13 03:58:48 11

  • 0
    egrep -r '(render_message|multipart).*('`find app/views -name '*.erb' | grep mailer | sed -e 's/\..*//' -e 's/.*\///' | uniq | xargs | sed 's/ /|/g'`')' app/models
    foobarfighter · 2009-06-12 18:53:29 5
  • Find statistics for an eDirectory server from ldapsearch. We have a lot more examples at: http://ldapwiki.willeke.com/wiki/Ldapsearch%20Examples The full command was cut off; it is: ldapsearch -h ldapserver.willeke.com -p636 -e C:\mydata\treerootcert.der -b "" -s base -D cn=admin,ou=administration,dc=willeke,dc=com -w secretpwd "(objectclass=*)" chainings removeEntryOps referralsReturned listOps modifyRDNOps repUpdatesIn repUpdatesOut strongAuthBinds addEntryOps compareOps wholeSubtreeSearchOps modifyEntryOps searchOps errors simpleAuthBinds inOps oneLevelSearchOps inBytes abandonOps bindSecurityErrors securityErrors unAuthBinds outBytes extendedOps readOps dsaName directoryTreeName vendorVersion vendorName


    1
    ldapsearch -h ldapserver.willeke.com -p389 -b "" -s base -D cn=admin,ou=administration,dc=willeke,dc=com -w secretpwd "(objectclass=*)" chainings removeEntryOps referralsReturned listOps modifyRDNOps repUpdatesIn repUpdatesOut strongAuthBinds addEntryOps
    jwilleke · 2009-06-12 13:28:18 4
  • Performs a mysqldump and gzip-compresses the output, with a timestamp in the resulting dump filename. Inspect the file for integrity or fun with this command afterward, if you desire: zcat mysqldump-2009-06-12-07.41.01.gz | less


    1
    mysqldump [options] |gzip ->mysqldump-$(date +%Y-%m-%d-%H.%M.%S).gz
    linuxrawkstar · 2009-06-12 12:42:59 6
  • I've been auto-generating some complex GnuPlots; with multiplots, the first plot of each group needs to be a 'plot' whereas the others need to be 'replot' commands to allow overplotting/autoscaling/etc. to work properly. This is used to replace only the first instance of 'replot' (a filled-in example follows this entry).


    5
    sed '/MARKER/{N;s/THIS/THAT/}'
    mungewell · 2009-06-12 02:29:50 4
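    A hedged, filled-in example: assuming a hypothetical gnuplot script graph.gp where the line following 'set multiplot' should use 'plot' rather than 'replot', the placeholders become:
    sed '/set multiplot/{N;s/replot/plot/}' graph.gp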
  • command | my_irc - Pipe whatever you want to this function and, if everything goes well, it will be redirected to a channel or a user on an IRC server. Please note: I am not responsible for any flooding you might provoke. The function does not reply to PINGs from the server, which is why the input is first written to a temporary file - I don't want to wait for input while connected to the server; depending on the server's configuration and the length of your file, you may time out before finishing. The server variable must be of the form "irc.server.org 6667" (or any other port); for testing you can create a fake IRC server on "localhost 55555" with netcat -l -p 55555. The target can be a channel (beginning with a '#', like "#chan") or a user (like "user"). The other variables have obvious names. A usage sketch follows this entry.


    1
    function my_irc { tmp=`mktemp`; cat > $tmp; { echo -e "USER $username x x :$ircname\nNICK $nick\nJOIN $target"; while read line; do echo -e "PRIVMSG $target :$line"; done < $tmp; } | nc $server > /dev/null ; rm $tmp; }
    Josay · 2009-06-11 22:14:48 7
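    A usage sketch, assuming the variables read by the function are set first (the values here are hypothetical):
    server="irc.server.org 6667"; nick="fu-bot"; username="fu"; ircname="commandlinefu"; target="#chan"
    df -h | my_irc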
  • Nice reading in the morning on the way to work, but sadly the .tar.gz for the whole issue 66 is not on phrack's website yet. So use wget to download.


    2
    mkdir phrack66; (cd phrack66; for n in {1..17} ; do echo "http://www.phrack.org/issues.html?issue=66&id=$n&mode=txt" ; done | xargs wget)
    masterofdisaster · 2009-06-11 21:42:42 5



Check These Out

preserve disk; keep OS clean
if you use disk-based swap then it can defeat the purpose of this function.

Switch to a user with "nologin" shell
You need sudo privileges for this command. Replace username with actual username.

Force wrap all text to 80 columns in Vim
This is assuming that you're editing some file that has not been wrapped at 80 columns, and you want it to be wrapped. While in Vim, enter ex mode and set the textwidth to 80 columns: $ :set textwidth=80 Then press: $ gg to get to the top of the file, and: $ gqG to wrap every line from the top to the bottom of the file at 80 characters. Of course, this will lose any indentation blocks you've set up if typing up some source code or doing typesetting. You can modify this command as needed, as 'gq' is the formatting command you want; you could send the formatting to a specific line in the file rather than to the end of the file. $ gq49G will apply the format from your current cursor location to the 49th row. And so on.

A command line calculator in Perl
Once I wrote a command line calculator program in C, then I found this... and added to it a bit. For ease of use I normally use this in a tiny Perl program (which I call pc for 'Perl Calculator') #!/usr/bin/perl -w die "Usage: $0 MATHS\n" unless(@ARGV);for(@ARGV){s/x/*/g;s/v/sqrt /g;s/\^/**/g}; print eval(join('',@ARGV)),$/; It handles square roots, power, modulus: $ pc 1+2 (1 plus 2) 3 $ pc 3x4 (3 times 4) 12 $ pc 5^6 (5 to the power of 6) 15625 $ pc v 49 ( square root of 49 ) 7 $ pc 12/3 (12 divided by 3) 4 $ pc 19%4 (19 modulus 4) 3 (you can string maths together too) $ pc 10 x 10 x 10 1000 $ pc 10 + 10 + 10 / 2 25 $ pc 7 x v49 49

live netcat network throughput test
On the other machine, run this command: pv -r /dev/zero | nc 192.168.1.1 7777 It will show live throughput between the two machines; the destination machine's IP in this example is 192.168.1.1. Multiply by 8 for the network (bits-per-second) figure. You must install the pv and netcat packages to use this command. A sketch of the listening side follows this entry. kerim@bayner.com http://www.bayner.com/
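For reference, a minimal sketch of the listening side on 192.168.1.1, assuming a traditional netcat that accepts -l -p (BSD netcat would use nc -l 7777 instead):
nc -l -p 7777 > /dev/null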

Show log message including which files changed for a given commit in git.

Print the 16 most recent RPM packages installed in newest to oldest order

Testing php configuration

Get a stream feed from a Twitter user
*** CAREFULLY READ THE NOTES **** *** THIS DOES NOT WORK "OUT OF THE BOX" *** You'll need a few minutes of CAREFUL reading before making your own Twitter feed: In 2010 simple command line Twitter feed requests all stopped working because Twitter upgraded to SSL security. Https requests for a filtered Twitter stream feed now require a special header called "oauth_header". The benefit is that your stream feed and login info is securely encrypted. The bad news is that an "oauth_header" takes some work to build. Fortunately, four functions, imaginatively named step1, step2, step3 and step4 can be used to build a customized oauth_header for you in a few minutes. Now, go look at "step1" to start creating your own oauth_header!

VIM: Go back to the last place you were in a document
You're perhaps editing a line, or reading a certain line of code, you use page up and down or move through the file and now you wish to return to the last position the cursor was at. '' will get you there.

