
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).


News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, and I'll be maintaining and improving clfu. Thanks to David for building such a great resource!
Terminal - Commands using tr - 279 results
chkconfig --list | fgrep :on | sed -e 's/\(^.*\)*0:off/\1:/g' -e 's/\(.\):on/\1/g' -e 's/.:off//g' | tr -d [:blank:] | awk -F: '{print$2,$1}' | ssh host 'cat > foo'
2009-05-13 21:17:39
User: catawampus
3

And then to complete the task:

Go to the target host:

ssh host

Turn everything off:

for i in `chkconfig --list | fgrep :on | awk '{print $1}'` ; do chkconfig --level 12345 $i off; done

Create duplicate config:

while read line; do chkconfig --level $line on; done < foo
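For reference, each line of foo ends up as the run levels followed by the service name (assuming typical chkconfig --list output), e.g.:

2345 crond

so each pass of the loop expands to something like chkconfig --level 2345 crond on.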
find / \( -name "*.log" -o -name "*.mylogs" \) -exec ls -lrt {} \; | sort -k6,8 | head -n1 | cut -d" " -f8- | tr -d '\n' | xargs -0 rm
2009-05-10 10:45:48
User: ghazz
Functions: cut find head ls sort tr xargs
1

This works on my Ubuntu/Debian machines.

I suspect other distros need some tweaking of sort and cut.

I am sure someone could provide a shorter/faster version.
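Here is a rough sketch of a shorter way to do the same thing, assuming GNU find's -printf and filenames without embedded newlines:

find / \( -name "*.log" -o -name "*.mylogs" \) -printf '%T@\t%p\n' | sort -n | head -n1 | cut -f2- | tr '\n' '\0' | xargs -0 rm

Sorting on the raw epoch timestamp sidesteps the locale-dependent parsing of ls -lrt with sort -k6,8 and cut.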

wget -q -O- http://www.gutenberg.org/dirs/etext96/cprfd10.txt | sed '1,419d' | tr "\n" " " | tr " " "\n" | perl -lpe 's/\W//g;$_=lc($_)' | grep "^[a-z]" | awk 'length > 1' | sort | uniq -c | awk '{print $2"\t"$1}'
2009-05-04 16:00:39
User: alperyilmaz
Functions: awk grep perl sed sort tr uniq wget
-4

This command might not be useful for most of us; I just wanted to share it to show the power of the command line.

It downloads the plain-text version of the novel David Copperfield from Project Gutenberg and then generates a single column of words, after which the occurrences of each word are counted by the sort | uniq -c combination.

The command also removes numbers and single characters from the count. I'm sure you can write a shorter version.
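One possible shorter sketch of the same idea, assuming GNU grep's -o (it skips the header trimming that sed '1,419d' performs above):

wget -q -O- http://www.gutenberg.org/dirs/etext96/cprfd10.txt | tr 'A-Z' 'a-z' | grep -oE '[a-z]{2,}' | sort | uniq -c | awk '{print $2"\t"$1}'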

dpkg-query -l| grep -v "ii " | grep "rc " | awk '{print $2" "}' | tr -d "\n" | xargs aptitude purge -y
2009-04-28 19:25:53
User: thepicard
Functions: awk grep tr xargs
-3

For an application that has already been removed but left its configuration behind, this will purge that configuration from the system. To test it out first, you can remove the trailing -y, and it will show you what it would purge without actually doing it. It never hurts to check first, "just in case." ;)
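Roughly the same thing can be written with a single awk match on the "rc" status column (a sketch, assuming GNU xargs' -r flag so nothing runs when there is nothing to purge; run as root):

dpkg-query -l | awk '/^rc/ {print $2}' | xargs -r aptitude purge -y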

echo "12345,12346" |sed -e's/ //'|tr "," "\n"| while read line; do echo -n $line" "; svn log -vr $line https://url/to/svn/repository/|grep "/"|head -1|cut -d"/" -f2; done
shuf -n1 /usr/share/dict/words | tee >(sed -e 's/./&\n/g' | shuf | tr -d '\n' | line) > /tmp/out
< /dev/urandom tr -dc A-Za-z0-9_ | head -c $((1024 * 1024)) | tee >(gzip -c > out.gz) >(bzip2 -c > out.bz) > /dev/null
date +%m/%d/%y%X|tr -d 'n' >>datemp.log&& sensors|grep +5V|cut -d "(" -f1|tr -d 'n'>> datemp.log && sensors |grep Temp |cut -d "(" -f1|tr -d 'n'>>datemp.log
2009-03-31 18:13:23
User: f241vc15
Functions: cut date grep sensors tr
0
cat datemp.log

04/01/0902:11:42

Sys Temp: +11.0°C

CPU Temp: +35.5°C

AUX Temp: +3.0°C

function headers { head -1 $* | tr ',' '\12' | pr -t -n ; }
2009-03-25 20:07:47
User: totoro
Functions: head pr tr
Tags: CSV headers
0

This little command (function) shows the CSV header fields (the field names separated by commas) as a numbered list, making the fields and their order easy to see.
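For example, given a hypothetical data.csv whose first line is name,age,city:

headers data.csv

prints name, age and city as a numbered list, one field per line.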

ls -1 static/images/ | while read line; do echo -n $line' '[; grep -rc $line *|grep -v ".svn"|cut -d":" -f2|grep -vc 0| tr -d "\n"; echo -n ]; echo ; done
2009-03-20 20:33:36
User: psytek
Functions: cut echo grep ls read tr
-5

This command greps the entire project for each file listed in static/images/ and prints, next to each filename, roughly how many files reference it. This is useful for cleaning out your project of old static files that are no longer in use. It also ignores .svn directories for accurate counts. Replace 'static/images/' with the directory containing the files you want to search for.
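With a hypothetical logo.png that is referenced in 3 files, an output line looks like:

logo.png [3]

A [0] generally marks a file that nothing references, i.e. a candidate for removal.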

alias path='echo $PATH | tr ":" "\n"'
2009-03-12 17:07:58
User: voyeg3r
Functions: alias tr
-1

Replaces each ":" in $PATH with a newline and binds the word path to it, so each directory in your $PATH is printed on its own line.
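For example (output naturally depends on your own $PATH):

path
/usr/local/bin
/usr/bin
/bin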

vlc --one-instance --playlist-enqueue -q $(while read netcast; do wget -q $netcast -O - |grep enclosure | tr '\r' '\n' | tr \' \" | sed -n 's/.*url="\([^"]*\)".*/\1/p'|head -n1; done <netcast.txt)
2009-03-03 04:26:01
User: tomwsmf
Functions: read sed tr wget
1

This is a quick line to stream in the latest offerings of your favorite netcasts/podcasts. You will need to have a file named netcast.txt in the directory you run this from. This file should have one, and only one, of your netcasts'/podcasts' URLs per line.

When run, the line grabs the offering on the top of each netcast/podcast stack and sends it over, quietly, to vlc.

Since I move around computers during the day, I wanted an easy way to listen to my daily dose of news and such without having to worry about downloading to whatever machine I am on. This is just a quick grab and stream of what's current.

Future plans: have the list of netcasts be read from the web, possibly from an RSS feed or similar. I use Google Reader, so there might be a way to use it as the source so as not to have to muck with multiple lists.
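A hypothetical netcast.txt simply lists one feed URL per line, e.g.:

http://example.com/podcast/feed.rss
http://example.org/news/feed.rss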

on="off"; off="on"; now=$(amixer get Master | tr -d '[]' | grep "Playback.*%" |head -n1 |awk '{print $7}'); amixer sset Master ${!now}
tr -dc "a-zA-Z0-9-_\$\?" < /dev/urandom | head -c 10 | gpg -e -r medha@nerdish.de > password.gpg
2009-02-25 08:48:26
User: hans
Functions: gpg head tr
2

Adjust the

head -c

part for password length.

I use filenames like "hans@commandlinefu.com.gpg" and a vim which automatically decrypts files with .gpg suffixes.
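For instance, a 16-character password written the same way and read back later with gpg -d (a sketch, using the same recipient as above):

tr -dc "a-zA-Z0-9-_\$\?" < /dev/urandom | head -c 16 | gpg -e -r medha@nerdish.de > password.gpg

gpg -d password.gpg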

< /dev/urandom tr -dc _A-Z-a-z-0-9 | head -c6
2009-02-24 09:43:40
User: Blackbit
Functions: head tr
11

If you want a password length longer than 6, changing the -c6 to read -c8 will give you 8 random characters instead of 6. To end up with a line-feed, use this with echo:

# echo `< /dev/urandom tr -dc _A-Z-a-z-0-9 | head -c6`

zcat a_big_file.gz | sed -ne "$(zcat a_big_file.gz | tr -d "[:print:]" | cat -n | grep -vP "^ *\d+\t$" | cut -f 1 | sed -e "s/\([0-9]\+\)/\1=;\1p;/" | xargs)" | tr -c "[:print:]\n" "?"
2009-02-24 02:57:37
User: DEinspanjer
Functions: sed tr zcat
1

Scans the file once to build a list of line numbers that contain non-printable characters.

Scans the file again, passing those line numbers to sed as two commands per line: print the line number, then print the line itself. The output is also passed through tr to replace the non-printable characters with a '?'.
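A single-pass sketch of the same idea, using grep -n to number the matching lines directly (grep's -a keeps it from treating the stream as binary; the output format differs slightly, with the line number prefixed to each line):

zcat a_big_file.gz | grep -an '[^[:print:]]' | tr -c '[:print:]\n' '?'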

echo -en "stats\r\n" "quit\r\n" | nc localhost 11211 | tr -s [:cntrl:] " "| cut -f42,48 -d" " | sed "s/\([0-9]*\)\s\([0-9]*\)/ \2\/\1*100/" | bc -l
sed 's/[ \t]*$//' < emails.txt | tr 'A-Z' 'a-z' | sort | uniq > emails_sorted.txt
find . -type d -execdir du -sh '{}' ';' | grep -E "[0-9]+K" | sed 's/^[0-9\.]\+K[\t ]\+//' | tr "\n" "\0" | xargs -0 rm -rf
tr -d "\n" < file1 > file2
strings /dev/urandom | grep -o '[[:alnum:]]' | head -n 30 | tr -d '\n'; echo
2009-02-16 00:39:28
User: jbcurtis
Functions: grep head strings tr
44

Find random strings within /dev/urandom. grep filters them down to just alphanumeric characters, then head prints the first 30 and tr removes the line feeds.

echo $PATH | tr : \\n
2009-02-13 15:54:58
User: piyo
Functions: echo tr
2

This works in bash and zsh.

You may also want to alias it, if you need to look at it often...

alias lpath="echo \$PATH | tr : \\\\n"

The "\$PATH" is escaped to make sure the alias looks at your current $PATH each time it runs, not the one in effect when the alias was defined.

echo 16i `echo "F" | tr '[a-f]' '[A-F]'` p | dc ; echo 16o "15" p | dc
echo "Xc" | tr "Xo" "\033\017"
mount | grep : | tr -s ' ' | cut -d' ' -f3 | xargs umount -v