
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).



News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands require moderation before they will appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - Commands using cat - 428 results
cat frame/*.mpeg | ffmpeg -i $ID.mp3 -i - -f dvd -y track/$ID.mpg 2>/dev/null
2009-08-04 06:31:50
User: pamirian
Functions: cat
5

This is an extract from a larger script which formats the video for DVD. The videos I use have no audio track so I need to add one. Tweak as you like...

find . -type f -name "*.c" -exec cat {} \; | wc -l
2009-07-30 10:06:51
User: foremire
Functions: cat find wc
1

Use find to gather all the .c files under the target directory, cat them into one stream, then pipe it to wc to count the lines.
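
A roughly equivalent form, assuming GNU find and xargs, that avoids running cat once per file:

find . -type f -name "*.c" -print0 | xargs -0 cat | wc -l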

cat file | tee >> file
2009-07-30 07:34:03
User: GeckoDH
Functions: cat file tee
0

The command `cat file >> file` fails with the following error message:

cat: file: input file is output file

`tee` is a nice workaround without using any temporary files.
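
If the moreutils package is available, sponge offers a similar workaround: it soaks up stdin completely before writing to the file (hypothetical sed filter shown):

sed 's/foo/bar/g' file | sponge file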

cat /proc/net/ip_conntrack | grep ESTABLISHED | grep -c -v ^#
for file in *.001; do NAME=`echo $file | cut -d. -f1,2`; cat "$NAME."[0-9][0-9][0-9] > "$NAME"; done
2009-07-29 10:04:26
User: jaymzcd
Functions: cat cut file
2

If you use newsgroups then you'll have come across split files before. Joining together a whole batch of them can be a pain so this will do the whole folder in one.
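
For a single split set the same idea reduces to something like (hypothetical file name):

cat movie.avi.[0-9][0-9][0-9] > movie.avi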

buffer () { tty -s && return; tmp=$(mktemp); cat > "${tmp}"; if [ -n "$1" ] && ( ( [ -f "$1" ] && [ -w "$1" ] ) || ( ! [ -a "$1" ] && [ -w "$(dirname "$1")" ] ) ); then mv -f "${tmp}" "$1"; else echo "Can't write in \"$1\""; rm -f "${tmp}"; fi }
2009-07-27 20:21:15
User: Josay
Functions: cat echo mv rm tty
Tags: redirection
2

A common mistake in Bash is to write a command line where a command reads a file and its output is redirected to that same file.

It can usually be avoided thanks to:

1) warnings "-bash: file.txt: cannot overwrite existing file"

2) options (often "-i") that let the command directly modify the file

but I like having this small function, which does the trick by waiting for the first command to end before trying to write into the file.

Lots of things could probably be done in a better way; if you know one...
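
A hypothetical usage example, rewriting a file with its own filtered contents:

grep -v '^#' config.txt | buffer config.txt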

cat 1.mp3 2.mp3 > combined.mp3
2009-07-27 18:39:44
User: scottix
Functions: cat
1

This just combines multiple MP3s into one MP3 file. Basically it is an easy join for MP3s.
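
The same approach works for any number of tracks; ordering follows the argument order (hypothetical file names):

cat track01.mp3 track02.mp3 track03.mp3 > album.mp3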

man -P cat ls > man_ls.txt
2009-07-27 13:09:24
User: alvinx
Functions: cat ls man
0

Output manpage as plaintext using cat as pager: man -P cat commandname

And redirect its stdout into a file: man -P cat commandname > textfile.txt

Example: man -P cat ls > man_ls.txt
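
An alternative, assuming col is available, pipes through col -b to also strip any backspace/overstrike formatting:

man ls | col -b > man_ls.txt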

cat /var/log/secure.log | awk '{print substr($0,0,12)}' | uniq -c | sort -nr | awk '{printf("\n%s ",$0) ; for (i = 0; i<$1 ; i++) {printf("*")};}'
2009-07-24 07:20:06
User: knassery
Functions: awk cat sort uniq
15

Busiest seconds:

cat /var/log/secure.log | awk '{print substr($0,0,15)}' | uniq -c | sort -nr | awk '{printf("\n%s ",$0) ; for (i = 0; i<$1 ; i++) {printf("*")};}'
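
Narrowing the substr window changes the granularity; assuming standard syslog timestamps, the first 9 characters give per-hour counts:

cat /var/log/secure.log | awk '{print substr($0,0,9)}' | uniq -c | sort -nr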
cat `whereis mysqlbug | awk '{print $2}'` | grep 'CONFIGURE_LINE='
find . -name '*.html' -print0| xargs -0 -L1 cat |sed "s/[\"\<\>' \t\(\);]/\n/g" |grep "http://" |sort -u
2009-07-14 07:00:15
User: jamespitt
Functions: cat find grep sed sort xargs
4

Just a handy way to get all the unique links from inside all the HTML files inside a directory. Can be handy in scripts, etc.
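
A rougher single-grep variant, assuming GNU grep with -o support (the character class is only an approximation):

find . -name '*.html' -print0 | xargs -0 grep -hoE 'http://[^"<> ]+' | sort -u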

cat /dev/urandom|awk 'BEGIN{"tput cuu1" | getline CursorUp; "tput clear" | getline Clear; printf Clear}{num+=1;printf CursorUp; print num}'
2009-07-13 07:30:51
User: axelabs
Functions: awk cat printf
Tags: nawk awk clear tput
0

awk can clear the screen while displaying output. This is a handy way of seeing how many lines a tail -f has hit or how many files find has found. On Solaris, you may have to use 'nawk', and your machine needs 'tput'.
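
For example, to watch how many files find has located so far (hypothetical starting directory):

find /usr -type f | awk 'BEGIN{"tput cuu1" | getline CursorUp; "tput clear" | getline Clear; printf Clear}{num+=1;printf CursorUp; print num}'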

infile=$1; for i in $(cat $infile); do echo $i | tr "," "\n" | sort -n | tr "\n" "," | sed "s/,$//"; echo; done
2009-07-12 21:23:37
User: iframe
Functions: cat echo sed sort tr
Tags: cat bash sort sed tr
0

Save the script as: sort_file

Usage: sort_file < sort_me.csv > out_file.csv

This script was originally posted by Admiral Beotch in LinuxQuestions.org on the Linux-Software forum.

I modified this script to make it more portable.
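
A hypothetical run of the per-line pipeline on a single CSV row:

echo "9,3,7" | tr "," "\n" | sort -n | tr "\n" "," | sed "s/,$//"

which prints 3,7,9.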

cat /dev/clipboard; $(somecommand) > /dev/clipboard
2009-07-10 18:48:21
User: sud0er
Functions: cat
Tags: windows cygwin
12

I spent a bunch of time yesterday looking for the xsel package in Cygwin- turns out you can use the /dev/clipboard device to do the same thing.
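
Copying in the other direction works the same way (hypothetical file name):

cat some_notes.txt > /dev/clipboard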

cat <<.>> somefilename
2009-07-10 17:45:42
User: tomlouie
Functions: cat
Tags: text
4

If you just want to write or append some text to a file without having to run a text editor, run this command. After running the command, start typing away. To exit, type . on a line by itself.

Replacing the >> with a single > will let you overwrite your file.
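
The same idea with a more conventional here-document delimiter (hypothetical file name):

cat <<EOF >> notes.txt
first line of text
second line of text
EOF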

cat large.xml | xclip
2009-07-08 16:30:07
User: copremesis
Functions: cat
0

avoid mouse abuse and the constant struggle of balancing scroll velocity ... not to mention that burning sensation in your upper right shoulder ....
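
To paste the selection back out again (assuming the same xclip installation):

xclip -o > copy_of_large.xml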

test `uname` = Linux && lsb_release -a || ( test `uname` = SunOS && cat /etc/release || uname -rms )
2009-07-07 20:51:30
User: virtualshock
Functions: cat test uname
-7

Found in the comments section; works on most Linux flavors.

cat /var/log/auth.log | logtool -o HTML > auth.html
2009-07-03 18:17:22
Functions: cat
3

Logtool is a nice tool that can export a log file to various formats, but its strength lies in its ability to colorize logs. This command takes a log as input and colorizes it, then exports it to an HTML file for a more comfortable view. Logtool is part of the logtool package. Tested on Debian.

cat myfile.txt | tr -d '\n'
pdftk $* cat output $merged.pdf
find . -type f -name *.ext -exec cat {} > file.txt \;
2009-06-17 11:33:14
User: realgt
Functions: cat find
2

Useful if you have to put together multiple files into one and they are scattered across subdirectories. For example: You need to combine all .sql files into one .sql file that would be sent to DBAs as a batch script.

You do get a warning if you create a file with the same extension as the ones you're searching for.

find . -type f -name *.sql -exec cat {} > BatchFile.txt \;
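
A variant that sidesteps the warning by quoting the pattern and writing the output outside the search tree (hypothetical output path):

find . -type f -name '*.sql' -exec cat {} + > /tmp/BatchFile.sql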

mysql -uadmin -p` cat /etc/psa/.psa.shadow` -Dpsa -e"select mail_name,name,password from mail left join domains on mail.dom_id = domains.id inner join accounts where mail.account_id = accounts.id;"
function my_irc { tmp=`mktemp`; cat > $tmp; { echo -e "USER $username x x :$ircname\nNICK $nick\nJOIN $target"; while read line; do echo -e "PRIVMSG $target :$line"; done < $tmp; } | nc $server > /dev/null ; rm $tmp; }
2009-06-11 22:14:48
User: Josay
Functions: cat echo read rm
Tags: netcat irc nc
1
command | my_irc

Pipe whatever you want to this function, it will, if everything goes well, be redirected to a channel or a user on an IRC server.

Please note that :

- I am not responsible for any flood excesses you might provoke.

- That function does not reply to PINGs from the server. That's the reason why I first write to a temporary file: I don't want to wait for input while connected to the server. However, depending on the configuration of the server and the length of your file, you may time out before finishing.

- Concerning the server, the variable content must be in the form "irc.server.org 6667" (or any other port). If you want to make some tests, you can also create a fake IRC server on "localhost 55555" by using

netcat -l -p 55555

- Concerning the target, you can choose a channel (beginning with a '#' like "#chan") or a user (like "user")

- The other variables have obvious names.
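
A hypothetical test run against the fake local server described above (the variable names are the ones the function expects):

server="localhost 55555"; nick="testbot"; username="testbot"; ircname="just testing"; target="#test"

echo "hello from the shell" | my_irc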

ssh $HOST -l$USER cat /REMOTE/FILE | sdiff /LOCAL/FILE -
cat somefile.css | awk '{gsub(/{|}|;/,"&\n"); print}' >> uncompressed.css
2009-06-02 15:51:51
User: lrvick
Functions: awk cat
0

Ever compress a file for the web by replacing all newline characters with nothing so it makes one nice big blob?

It is a great idea; however, what about when you want to edit that file? ...Serious pain in the butt.

I ran into this today in that my only copy of a CSS file was "compressed" with no newlines.

I whipped this up and it converted back into nice human readable CSS :-)

It could be nicer, but it does the job.
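
A roughly equivalent sed form, assuming GNU sed (which interprets \n in the replacement as a newline):

sed 's/[{};]/&\n/g' somefile.css > uncompressed.css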