All commands (14,187)

  • Find all email addresses in a file, printing each match. The addresses do not have to be alone on a line, so you can grab them from HTML-formatted emails, CSV files, etc. Pipe the output through sort | uniq to filter out duplicates (see the sketch after this entry).


    3
    grep -Eio '([[:alnum:]_.]+@[[:alnum:]_]+?\.[[:alpha:].]{2,6})' file.html
    wires · 2009-06-16 20:19:47 9
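
    A minimal sketch of the sort | uniq filtering the description mentions; file.html is
    just the example file name from the entry above:

    grep -Eio '([[:alnum:]_.]+@[[:alnum:]_]+?\.[[:alpha:].]{2,6})' file.html | sort | uniq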
  • Search for the string "search" and replace it with the string "replace" in all files with the extension .php in the current folder, keeping a backup of each file with the extension .bkp.


    4
    ruby -i.bkp -pe "gsub(/search/, 'replace')" *.php
    gustavgans · 2009-06-16 12:35:40 6
  • Good for summing the numbers embedded in text - a food journal entry, for example, with calories listed per food, where you want the total calories. Use this to monitor and keep a running total on anything that outputs numbers (see the sketch after this entry).


    3
    perl -ne '$sum += $_ for grep { /\d+/ } split /[^\d\-\.]+/; print "$sum\n"'
    obscurite · 2009-06-16 06:39:08 6
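
    A hypothetical food-journal example; the command prints a running total per line, so
    the last line is the grand total:

    printf 'oatmeal 150 cal\ncoffee 5 cal\nbanana 105 cal\n' | perl -ne '$sum += $_ for grep { /\d+/ } split /[^\d\-\.]+/; print "$sum\n"'
    # prints the running total: 150, then 155, then 260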
  • This command will open up the two files in FileMerge on OS X. You can also compare two directories: opendiff directory1 directory2. NOTE: FileMerge is part of the OS X Developer Tools, available on the install disc.


    -3
    opendiff <file1> <file2>
    claytron · 2009-06-16 03:22:52 7
  • Tells sort to ignore all characters before the Xth position in the first field of each line. If you have a list of items, one per line, and want to ignore the first two characters for sorting purposes, you would type "sort -k1.3". Change the "1" to change the field being sorted; the decimal value is the offset within that field at which the comparison starts (see the sketch after this entry).


    3
    sort -k1.x
    leper421 · 2009-06-16 00:04:21 7
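
    A hypothetical example: the "v-" prefix is skipped, so the items sort by what follows
    it (comparison starts at the third character of field 1):

    printf 'v-beta\nv-alpha\nv-gamma\n' | sort -k1.3
    # v-alpha, v-beta, v-gamma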
  • If your users have ever asked your script to email their reports as separate attachments instead of tar'ring them into one file, you can use this. You'll need the mailx package, of course. On Unix you'd want to add the additional parameter "-m"; that variant is set off after this entry.


    1
    (uuencode foo.txt foo.txt; uuencode /etc/passwd passwd.txt)|mailx -s "Pandaren!" someone@cmdfu.com
    LrdShaper · 2009-06-15 11:34:51 7
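
    The Unix variant from the description, set off for readability (file names and
    addresses are the author's placeholders):

    (uuencode foo.txt foo.txt; uuencode /etc/passwd passwd.txt) | mailx -m -s "Hooosa!" someone@cmdfu.com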
  • Sends signal 2 (SIGINT) to every running mpg321 process, which makes the player skip to the next selection in its playlist.


    1
    killall -2 mpg321
    dattaway · 2009-06-15 03:04:00 6
  • Pressing ESC then * inserts the results of the autocompletion into the command line. It's hard to explain, but if you look at the sample output or type echo and then press ESC *, you will understand quickly (see the sketch after this entry). A few other reminders about ESC: holding ESC does the same thing as Tab Tab, and 'ESC .' inserts the last argument of the last command (repeat it to walk back through the last arguments of earlier commands).


    67
    ESC *
    Josay · 2009-06-14 21:17:40 53
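
    A hypothetical illustration (the file names are made up); the expansion happens in
    place the moment you press ESC then *:

    $ echo                          # press ESC then * with the cursor here
    $ echo a.txt b.txt notes.md     # every completion for the current word is inserted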
  • curl doesn't provide URL encoding for 'GET' data; it has an option '--data-urlencode', but that is only for 'POST' data. That's why I needed to write this command line. With 'perl', 'php' or 'python' this would be a one-liner, but I wrote it this way for fun. Works in Ubuntu and should work on all Linux variants (I hope it will work on Unix variants too).


    -3
    (Command too long..See sample Output..)
    mohan43u · 2009-06-14 20:34:37 136
  • This will extract all of the URLs from a Firefox session (including URLs in each tab's history). The sessionstore.js file is in ~/.mozilla/firefox/{firefox profile}.


    2
    sed -e "s/\[{/\n/g" -e "s/}, {/\n/g" sessionstore.js | grep url | awk -F"," '{ print $1 }'| sed -e "s/url:\"\([^\"]*\)\"/\1/g" -e "/^about:blank/d" > session_urls.txt
    birnam · 2009-06-14 15:08:31 11
  • Share your "now playing" Amarok song on Twitter!


    6
    curl -u <user>:<password> -d status="Amarok, now playing: $(dcop amarok default nowPlaying)" http://twitter.com/statuses/update.json
    caiosba · 2009-06-14 02:42:34 6
  • Simple way to achieve a colored SVN diff


    16
    svn diff <file> | vim -R -
    caiosba · 2009-06-13 22:00:49 69

  • 4
    mysql -uadmin -p` cat /etc/psa/.psa.shadow` -Dpsa -e"select mail_name,name,password from mail left join domains on mail.dom_id = domains.id inner join accounts where mail.account_id = accounts.id;"
    BADmd · 2009-06-13 21:19:18 4
  • Obviously, you can replace the 'man' command with any other command in this command line to do useful things. I just want to mention that there is a way to list all the commands you can execute directly without giving a full path: normally all important commands are placed in your PATH directories, and this command line uses that variable to find them. Works in Ubuntu, and will work on any *nix system with man pages configured.


    4
    find `echo "${PATH}" | tr ':' ' '` -type f | while read COMMAND; do man -f "${COMMAND##*/}"; done
    mohan43u · 2009-06-13 19:56:24 6
  • You can convert any UNIX man page to .txt


    15
    man ls | col -b > ~/Desktop/man_ls.txt
    vigo · 2009-06-13 11:49:33 18
  • It's also possible to delay the extraction by piping the command to at, which is really convenient (see the sketch after this entry).


    4
    unrar e file.part1.rar; if [ $? -eq 0 ]; then rm file.part*.rar; fi
    mrttlemonde · 2009-06-13 11:11:43 7
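
    A sketch of the delayed variant the description mentions, using the same command and
    file names as above:

    echo 'unrar e file.part1.rar; if [ $? -eq 0 ]; then rm file.part*.rar; fi' | at now + 20 minutes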
  • Yes, it's useless: it abuses brace expansion to print all 65,536 16-digit binary strings.


    4
    a=`printf "%*s" 16`;b=${a//?/{0..1\}}; echo `eval "echo $b"`
    rhythmx · 2009-06-13 06:32:35 5
  • Chronic Bash function. Examples: chronic 3600 date # print the time in your shell every hour; chronic 60 updatedb > /dev/null # update slocate every minute (see the sketch after this entry). Note: use 'jobs' to list background tasks and fg/bg to take control of them.


    3
    chronic () { t=$1; shift; while true; do $@; sleep $t; done & }
    rhythmx · 2009-06-13 05:57:54 16
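
    A minimal variation of the same idea, quoting "$@" so arguments containing spaces
    survive word splitting (behaviour is otherwise the author's):

    chronic () { t=$1; shift; while true; do "$@"; sleep "$t"; done & }
    chronic 3600 date                 # print the date/time every hour
    chronic 60 updatedb > /dev/null   # refresh the locate database every minute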
  • You can change the file pattern (*.tar.gz) and the extraction command to do other jobs. Example, removing all the files: for i in *.tar.gz; do rm $i; done (just an example; if you really want to remove the files, simply use a wildcard: rm *.tar.gz).


    0
    for i in *.tar.gz; do tar -xzf $i; done
    kureikain · 2009-06-13 03:58:48 11

  • 0
    egrep -r '(render_message|multipart).*('`find app/views -name '*.erb' | grep mailer | sed -e 's/\..*//' -e 's/.*\///' | uniq | xargs | sed 's/ /|/g'`')' app/models
    foobarfighter · 2009-06-12 18:53:29 5
  • Find statistics for an eDirectory server from ldapsearch. We have a lot more examples at: http://ldapwiki.willeke.com/wiki/Ldapsearch%20Examples The full command got cut off; it is: ldapsearch -h ldapserver.willeke.com -p636 -e C:\mydata\treerootcert.der -b "" -s base -D cn=admin,ou=administration,dc=willeke,dc=com -w secretpwd "(objectclass=*)" chainings removeEntryOps referralsReturned listOps modifyRDNOps repUpdatesIn repUpdatesOut strongAuthBinds addEntryOps compareOps wholeSubtreeSearchOps modifyEntryOps searchOps errors simpleAuthBinds inOps oneLevelSearchOps inBytes abandonOps bindSecurityErrors securityErrors unAuthBinds outBytes extendedOps readOps dsaName directoryTreeName vendorVersion vendorName


    1
    ldapsearch -h ldapserver.willeke.com -p389 -b "" -s base -D cn=admin,ou=administration,dc=willeke,dc=com -w secretpwd "(objectclass=*)" chainings removeEntryOps referralsReturned listOps modifyRDNOps repUpdatesIn repUpdatesOut strongAuthBinds addEntryOps
    jwilleke · 2009-06-12 13:28:18 4
  • Performs a mysqldump and gzip-compresses the output into a file with a timestamp in its name. Afterwards, inspect the dump for integrity or fun if you desire (see the sketch after this entry).


    1
    mysqldump [options] |gzip ->mysqldump-$(date +%Y-%m-%d-%H.%M.%S).gz
    linuxrawkstar · 2009-06-12 12:42:59 6
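
    To inspect the dump afterwards (the file name carries whatever timestamp the dump ran
    with; this one is just the author's example, with the extension matching the command
    above):

    zcat mysqldump-2009-06-12-07.41.01.gz | less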
  • I've been auto-generating some complex gnuplot scripts; with multiplots the first plot of each group needs to be a 'plot' whereas the others need to be 'replot's to allow overplotting/autoscaling/etc. to work properly. This is used to replace only the first instance of 'replot' (see the sketch after this entry).


    5
    sed '/MARKER/{N;s/THIS/THAT/}'
    mungewell · 2009-06-12 02:29:50 4
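
    A hypothetical illustration: after the line matching the marker, only the first
    'replot' is rewritten; later ones are left alone:

    printf 'set multiplot\nreplot sin(x)\nreplot cos(x)\n' | sed '/multiplot/{N;s/replot/plot/}'
    # output: set multiplot / plot sin(x) / replot cos(x)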
  • command | my_irc Pipe whatever you want to this function and, if everything goes well, it will be redirected to a channel or a user on an IRC server (see the usage sketch after this entry). Please note that: - I am not responsible for any flood excesses you might provoke. - The function does not reply to PINGs from the server. That's the reason I first write to a temporary file: I don't want to wait for input while connected to the server. However, depending on the configuration of the server and the length of your file, you may time out before finishing. - Concerning the server, the variable content must be of the form "irc.server.org 6667" (or any other port). If you want to run some tests, you can also create a fake IRC server on "localhost 55555" by using netcat -l -p 55555. - Concerning the target, you can choose a channel (beginning with a '#', like "#chan") or a user (like "user"). - The other variables have obvious names.


    1
    function my_irc { tmp=`mktemp`; cat > $tmp; { echo -e "USER $username x x :$ircname\nNICK $nick\nJOIN $target"; while read line; do echo -e "PRIVMSG $target :$line"; done < $tmp; } | nc $server > /dev/null ; rm $tmp; }
    Josay · 2009-06-11 22:14:48 7
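
    A hedged usage sketch; every value is made up, but the variable names are the ones the
    function reads:

    server="irc.example.org 6667"   # or "localhost 55555" with a fake server: nc -l -p 55555
    nick="mynick"; username="myuser"; ircname="My Real Name"
    target="#chan"                  # a channel, or a nick for a private message
    uptime | my_irc                 # pipe anything you want to the channel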
  • Nice reading in the morning on the way to work, but sadly the .tar.gz for the whole of issue 66 is not on Phrack's website yet, so use wget to download the individual articles.


    2
    mkdir phrack66; (cd phrack66; for n in {1..17} ; do echo "http://www.phrack.org/issues.html?issue=66&id=$n&mode=txt" ; done | xargs wget)
    masterofdisaster · 2009-06-11 21:42:42 5


