All commands (14,187)

  • This is not recommended: transcoding lossy to lossy (MP3 to Ogg) only makes the result lossier. Still, you can do it! ;)


    0
    find . -iname '*.mp3' | while read -r song; do mpg321 "${song}" -w - | oggenc -q 9 -o "${song%.mp3}.ogg" -; done
    renich · 2010-03-14 11:34:35 3
  • Create a shortcut on your desktop and paste this command into it.


    -3
    mplayer http://38.100.101.69/CIDCFMAAC
    dtolj · 2010-03-13 17:42:54 3
  • This one uses dictionary.com


    13
    pronounce(){ wget -qO- $(wget -qO- "http://dictionary.reference.com/browse/$@" | grep 'soundUrl' | head -n 1 | sed 's|.*soundUrl=\([^&]*\)&.*|\1|' | sed 's/%3A/:/g;s/%2F/\//g') | mpg123 -; }
    matthewbauer · 2010-03-13 04:23:56 12
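    A usage sketch once the pronounce function above is defined in your shell (requires wget and mpg123; the word is arbitrary). Each call fetches the dictionary.com page, extracts the first soundUrl, and pipes the MP3 to mpg123:

        pronounce hello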
  • translate <phrase> <source-language> <output-language> works from the command line (see the function sketch below)


    2
    cmd=$( wget -qO- "http://ajax.googleapis.com/ajax/services/language/translate?v=1.0&q=$1&langpair=$2|${3:-en}" | sed 's/.*"translatedText":"\([^"]*\)".*}/\1\n/'; ); echo "$cmd"
    dtolj · 2010-03-13 01:09:00 50
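    The one-liner reads the positional parameters $1, $2 and $3, so it only works inside a script or function. A minimal sketch matching the translate <phrase> <source-language> <output-language> usage from the description (the function name comes from that description; the URL must stay quoted because of the & and |):

        translate(){ wget -qO- "http://ajax.googleapis.com/ajax/services/language/translate?v=1.0&q=$1&langpair=$2|${3:-en}" | sed 's/.*"translatedText":"\([^"]*\)".*}/\1\n/'; }
        translate hola es en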
  • Useful if you want to check whether something is attempting a dictionary or brute-force attack (a per-client variant is sketched below).


    -2
    grep "month" /var/log/auth.log | grep ipop | grep "failed" | wc -l
    efuoax · 2010-03-12 18:48:53 5
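    A hedged variant in the same spirit, counting failures per client instead of in total (which field holds the client address depends on the logging daemon, so the $NF here is an assumption to adjust for your auth.log format):

        grep "failed" /var/log/auth.log | grep ipop | awk '{print $NF}' | sort | uniq -c | sort -rn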
  • The original was a little bit too complicated for me. This one does not use any variables.


    5
    pronounce(){ wget -qO- $(wget -qO- "http://www.m-w.com/dictionary/$@" | grep 'return au' | sed -r "s|.*return au\('([^']*)', '([^'])[^']*'\).*|http://cougar.eb.com/soundc11/\2/\1|") | aplay -q; }
    matthewbauer · 2010-03-12 17:44:16 6
  • Merge files, joining them line by line horizontally. Very useful when you have a lot of files where each line represents info about an event and you want to join them into a single file where each line has all the info about the same event. See the example below for a better understanding.


    3
    paste file1 file2 fileN > merged
    polaco · 2010-03-12 16:34:48 6
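    An illustration of the merge (file contents are invented for the demo; paste joins corresponding lines with a tab by default, and -d selects another delimiter):

        $ cat file1
        event1
        event2
        $ cat file2
        OK
        FAILED
        $ paste file1 file2
        event1  OK
        event2  FAILED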
  • Looks up a word on merriam-webster.com, screen-scrapes the FIRST audio pronunciation and plays it.
    USAGE: put this one-liner into a shell script (e.g., ~/bin/pronounce, sketched below) and run it from the command line, giving it the word to say: pronounce lek
    If the word isn't found in merriam-webster, no audio is played and the script returns an error value. However, M-W is a fairly complete dictionary (better than howjsay.com, which won't let you hear how to pronounce naughty words).
    ASSUMPTIONS: GNU's sed (which supports -r for extended regular expressions) and Linux's aplay. aplay can be replaced by any program that can play .WAV files from stdin.
    KNOWN BUGS: only the FIRST pronunciation is played, which is problematic if you want a particular form (plural, adjectival, etc.) of the word. For example, if you run pronounce onomatopoetic you'll hear a voice saying "onomatopoeia". Playing the correct form of the word is possible, but doing so might make the screen scraper even more fragile than it already is. (The slightest change to the format of m-w.com could break it.)


    3
    cmd=$(wget -qO- "http://www.m-w.com/dictionary/$(echo "$@"|tr '[A-Z]' '[a-z]')" | sed -rn "s#return au\('([^']+?)', '([^'])[^']*'\);.*#\nwget -qO- http://cougar.eb.com/soundc11/\2/\1 | aplay -q#; s/[^\n]*\n//p"); [ "$cmd" ] && eval "$cmd" || exit 1
    hackerb9 · 2010-03-12 13:56:41 3
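    A sketch of the ~/bin/pronounce wrapper suggested in the USAGE note above (the same one-liner, just wrapped in a script):

        #!/bin/bash
        # Look up "$@" on m-w.com and play the first audio pronunciation via aplay.
        cmd=$(wget -qO- "http://www.m-w.com/dictionary/$(echo "$@"|tr '[A-Z]' '[a-z]')" | sed -rn "s#return au\('([^']+?)', '([^'])[^']*'\);.*#\nwget -qO- http://cougar.eb.com/soundc11/\2/\1 | aplay -q#; s/[^\n]*\n//p"); [ "$cmd" ] && eval "$cmd" || exit 1

    Then: chmod +x ~/bin/pronounce && pronounce lek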
  • AIX user administration without smitty


    0
    chsec -f /etc/security/lastlog -a "unsuccessful_login_count=0" -s 'aix user'
    snaguber · 2010-03-12 09:28:36 9
  • There is a limit to how many processes you can run at the same time for each user, especially with web hosts. If the maximum # of processes for your user is 200, then the following sets OPTIMUM_P to 100:
    OPTIMUM_P=$(( (`ulimit -u` - `find /proc -maxdepth 1 \( -user $USER -o -group $GROUPNAME \) -type d|wc -l`) / 2 ))
    This is very useful in scripts because it is such a fast, low-resource-intensive way (compared to ps, who, lsof, etc.) to determine how many processes are currently running for whichever user. The number of currently running processes is subtracted from the high limit set up for the account (see limits.conf, pam, initscript).
    An easy-to-understand example: this searches the current directory for shell scripts, and runs up to 100 'file' commands at the same time, greatly speeding up the command:
    find . -type f | xargs -P $OPTIMUM_P -iFNAME file FNAME | sed -n '/shell script text/p'
    I am using it in my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html especially for the xargs command. xargs has a -P option that lets you specify how many processes to run at the same time. For instance, if you had 1000 URLs in a text file and wanted to download all of them fast with curl, you could download 100 at a time (check ps output on a separate [pt]ty for proof) like this:
    cat url-list.txt | xargs -I '{}' -P $OPTIMUM_P curl -O '{}'
    I like to do things as fast as possible on my servers. I have several types of servers and hosting environments, some with very restrictive jail shells with a 20-process limit, some with 200, some with 8000, so for the jailed shells my xargs -P10 would kill my shell or dump core. Using the above I can set the -P value dynamically, so xargs always works, like this:
    cat url-list.txt | xargs -I '{}' -P $OPTIMUM_P curl -O '{}'
    If you were building a process-killer (very common for cheap hosting) this would also be handy. Note that if you are only allowed 20 or so processes, you should just use -P1 with xargs. (A reusable function version is sketched below.)


    1
    echo $(( `ulimit -u` - `find /proc -maxdepth 1 \( -user $USER -o -group $GROUPNAME \) -type d|wc -l` ))
    AskApache · 2010-03-12 08:42:49 6
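    A minimal sketch wrapping the computation in a reusable function (the name optimum_p is made up here; it assumes GROUPNAME is set and that ulimit -u returns a number rather than "unlimited"):

        optimum_p() { echo $(( ($(ulimit -u) - $(find /proc -maxdepth 1 \( -user $USER -o -group $GROUPNAME \) -type d | wc -l)) / 2 )); }
        cat url-list.txt | xargs -I '{}' -P $(optimum_p) curl -O '{}'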
  • It is helpful to know the current limits placed on your account, and using this shortcut is a quick way of figuring out which values to change for optimization or security.
    Alias is:
    alias ulimith="command ulimit -a|sed 's/^.*\([a-z]\))\(.*\)$/-\1\2/;s/^/ulimit /'|tr '\n' ' ';echo"
    Here's the result of this command:
    ulimit -c 0 -d unlimited -e 0 -f unlimited -i 155648 -l 32 -m unlimited -n 8192 -p 8 -q 819200 -r 0 -s 10240 -t unlimited -u unlimited -v unlimited -x unlimited
    and here is plain ulimit -a for comparison:
    core file size (blocks, -c) 0
    data seg size (kbytes, -d) unlimited
    scheduling priority (-e) 0
    file size (blocks, -f) unlimited
    pending signals (-i) 155648
    max locked memory (kbytes, -l) 32
    max memory size (kbytes, -m) unlimited
    open files (-n) 8192
    pipe size (512 bytes, -p) 8
    POSIX message queues (bytes, -q) 819200
    real-time priority (-r) 0
    stack size (kbytes, -s) 10240
    cpu time (seconds, -t) unlimited
    max user processes (-u) unlimited
    virtual memory (kbytes, -v) unlimited
    file locks (-x) unlimited


    3
    echo "ulimit `ulimit -a|sed -e 's/^.*\([a-z]\))\(.*\)$/-\1\2/'|tr "\n" ' '`"
    AskApache · 2010-03-12 06:46:54 4
  • Traverses e.g. "/data/myhost1.com/myrsyncshare" (layout sketched below); logs stderr and stdout. Useful with cron.


    0
    for host in *; do { if [ -d "$host" ]; then { cd "$host"; for share in *; do { if [ -d "$share" ]; then { cd "$share"; rsync -av --delete "rsyncuser@${host}::${share}" . 2>"../${share}.err" 1>"../${share}.log"; cd ..; }; fi; }; done; cd ..; }; fi; }; done;
    c3w · 2010-03-11 19:54:31 3
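    The directory layout the loop assumes, inferred from the example path in the description (host and share names are illustrative):

        /data/                              <- run the loop from here
        /data/myhost1.com/                  <- directory name doubles as the rsync host
        /data/myhost1.com/myrsyncshare/     <- directory name doubles as the rsync module

    Each share is mirrored from rsyncuser@<host>::<share>, with per-share .log and .err files written next to the share directory.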

  • 0
    function setTerm() { PROFILE=${1}; echo "tell app \"Terminal\" to set current settings of first window to settings set \"${PROFILE}\""|osascript; };
    c3w · 2010-03-11 17:35:26 10

  • 0
    pattern='regexp_pattern'; find . -type f -perm +220 ! -name '*.bak' -print0 | xargs -0 egrep -lZ $pattern | xargs -0 sed -i.bak -e "/$pattern/d"
    yblusseau · 2010-03-11 13:10:15 2

  • 2
    grep current_state= /var/log/nagios/status.dat|sort|uniq -c|sed -e "s/[\t ]*\([0-9]*\).*current_state=\([0-9]*\)/\2:\1/"|tr "\n" " "
    c3w · 2010-03-11 06:04:14 3
  • Find C/C++ source files and headers in the current directory.


    2
    find . -type f \( -name '*.[ch]pp' -o -name '*.[ch]' \)
    lucasrangit · 2010-03-11 01:22:06 16
  • This makes an alias for a command named 'busy'. The 'busy' command opens a random file in /usr/include at a random line with vim. Drop this in your .bash_aliases and make sure that file is sourced from your .bashrc.


    23
    alias busy='my_file=$(find /usr/include -type f | sort -R | head -n 1); my_len=$(wc -l < "$my_file"); let "r = $RANDOM % $my_len" 2>/dev/null; vim +$r "$my_file"'
    busybee · 2010-03-09 21:48:41 19

  • 0
    git log -g --pretty=oneline | grep '}: commit' | awk '{print $1}' | head -1 | xargs git checkout -f
    jimthunderbird · 2010-03-09 16:56:39 3

  • 5
    git reflog show | grep '}: commit' | nl | sort -nr | nl | sort -nr | cut --fields=1,3 | sed s/commit://g | sed -e 's/HEAD*@{[0-9]*}://g'
    jimthunderbird · 2010-03-09 07:44:05 4
  • Purely frivolous: prints a sine/cosine curve to the console, with the width varying as it progresses. Ctrl-C to halt.


    9
    ruby -e "i=0;loop{puts ' '*(29*(Math.sin(i)/2+1))+'|'*(29*(Math.cos(i)/2+1)); i+=0.1}"
    jaymcgavren · 2010-03-09 06:21:29 5
  • Use Ruby's standard Curses module to display a Lissajous curve in the console. Replace the "0.2" with different numbers for different curves.


    2
    ruby -rcurses -e"include Curses;i=0;loop{setpos 12*(Math.sin(i)+1),40*(Math.cos(i*0.2)+1);addstr'.';i+=0.01;refresh}"
    jaymcgavren · 2010-03-09 06:10:47 13
  • For example: var="echo hello"; $var will display "hello" in bash. In zsh, by default, this produces an error: "command not found: echo hello". The shwordsplit option makes zsh act like bash on this point (see the demo below).


    0
    setopt shwordsplit
    Gentux · 2010-03-08 20:52:55 4
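    A quick demonstration of the difference described above (zsh prompt shown as %):

        % var="echo hello"
        % $var
        zsh: command not found: echo hello
        % setopt shwordsplit
        % $var
        hello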
  • Make sure that flac and lame are installed: sudo apt-get install lame flac


    2
    for file in *.flac; do flac -cd "$file" | lame -h - "${file%.flac}.mp3"; done
    schmiddim · 2010-03-08 13:37:25 9
  • This command opens a Remote Desktop connection from the command line, authenticates using a default username and password (great for virtual machines; in the example above it's Administrator:Password), creates a shared folder between your machine and the remote machine, and sets the resolution to best fit your desktop (I don't like full screen because it makes the desktop panels disappear). The command runs in the background and expects parameters: pass a hostname or IP address, and optionally override the default parameters with your own (usage sketched below).


    11
    rdesktop -a24 -uAdministrator -pPassword -r clipboard:CLIPBOARD -r disk:share=~/share -z -g 1280x900 -0 $@ &
    tomer · 2010-03-08 11:51:58 9
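    A usage sketch, assuming the one-liner is saved as an executable script (the path ~/bin/rdp is hypothetical):

        ~/bin/rdp 192.168.1.50
        ~/bin/rdp winbox.example.com -g 1024x768

    Whatever you pass to the script lands where $@ sits, so the hostname and any overriding rdesktop options are handed straight through.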
  • Dumps a compressed svn backup to a file and emails it, with any messages (the command's stderr output) as the body of the email.


    1
    (svnadmin dump /path/to/repo | gzip --best > /tmp/svn-backup.gz) 2>&1 | mutt -s "SVN backup `date +\%m/\%d/\%Y`" -a /tmp/svn-backup.gz emailaddress
    max · 2010-03-08 05:49:01 6
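    The backslash-escaped % signs in the date format hint that this was written for cron, where a bare % starts a new line. A sketch of a matching crontab entry (the weekly 03:00 schedule is illustrative; emailaddress is the original's placeholder):

        0 3 * * 0 (svnadmin dump /path/to/repo | gzip --best > /tmp/svn-backup.gz) 2>&1 | mutt -s "SVN backup `date +\%m/\%d/\%Y`" -a /tmp/svn-backup.gz emailaddress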
