
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the three Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands tagged unix - 59 results
utime() { date -d @$1; }
2010-05-12 12:21:15
User: deltaray
Functions: date
4

More recent versions of the date command can decode Unix epoch time into a human-readable date. This function makes that feature quick to use.
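
A minimal usage sketch (assuming GNU date; the timestamp is just an example):

utime 1273666875

This prints the timestamp as a human-readable date in your local timezone and locale.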

perl -i -pe 's/\r/\n/g' file
wget randomfunfacts.com -O - 2>/dev/null | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;" | while read FUNFACT; do notify-send -t $((1000+300*`echo -n $FUNFACT | wc -w`)) -i gtk-dialog-info "RandomFunFact" "$FUNFACT"; done
2010-04-02 09:43:32
User: mtron
Functions: grep read sed wc wget
2

Extension to tali713's random fact generator: it takes the output and sends it to notify-osd. Display time is proportional to the length of the fact (1000 ms plus 300 ms per word).

wget randomfunfacts.com -O - 2>/dev/null | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;"
2010-03-30 23:49:30
User: tali713
Functions: grep sed wget
13

Since the site's markup may change in the future, this may eventually stop working, but it still serves as a simple, straightforward starting point.

This uses the observation that the only item marked as strong on the page is the single logical line that includes the italicized fact.

If future revisions of the page cause failures, or intermittent failures, one may simply alter the above to read:

wget randomfunfacts.com -O - 2>/dev/null | tee lastfact | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;"

The file lastfact can then be examined whenever the command fails.

find ${PATH//:/ } -iname "*admin*" -executable -type f
2010-03-29 10:20:07
User: sanmiguel
Functions: find
Tags: bash find unix
1

While it seems (to me at least) a little counter-intuitive to filter on name first, this means less work for find: it can immediately discard files whose names don't match, straight from the directory listing, without touching them on disk. Testing file attributes (-executable, -type f) requires reading each file's attributes, which is only done for files that pass the name-based predicates.
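
The ${PATH//:/ } expansion (taken from the command above) replaces every colon in $PATH with a space, handing find a list of directories as its starting points. A quick way to see what find actually receives:

echo ${PATH//:/ }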

trickle -d 60 wget http://very.big/file
2010-03-29 06:55:30
Functions: wget
7

Trickle is a voluntary, cooperative bandwidth shaper. It works entirely in userland and is very easy to use.

The simplest application is to limit the bandwidth usage of individual programs.
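
Here -d 60 caps the download rate at roughly 60 KB/s; trickle also takes -u for an upload cap. A hedged example (the file and host are placeholders, and the program must be dynamically linked and not setuid for trickle to intercept it):

trickle -u 10 -d 60 scp bigfile user@host: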

lgrep() { string=$1; file=$2; awk -v String=${string} '$0 ~ String' ${file}; }
2010-01-19 09:42:19
User: dopeman
Functions: awk
1

This is a handy way to circumvent the "Maximum line length of 2048 exceeded" grep error.

Once you have run the above command (or put it in your .bashrc), files can be searched using:

lgrep search-string /file/to/search
rpl -R oldstring newstring folder
2009-12-09 03:15:31
User: johnraff
Tags: unix replace
3

If you can install rpl it's simpler to use and faster than combinations of find, grep and sed.

See man rpl for various options.

Time for the above operation: real 0m0.862s, user 0m0.548s, sys 0m0.180s

Using find + sed: real 0m3.546s, user 0m1.752s, sys 0m1.580s
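
For comparison, a rough find + sed equivalent of the rpl command above (a sketch; sed -i edits the files in place, so test on a copy first):

find folder -type f -exec sed -i 's/oldstring/newstring/g' {} +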

iconv -f utf8 -t utf16 /path/to/file
utime(){ perl -e "print localtime($1).\"\n\"";}
2009-11-06 12:58:10
User: MoHaG
Functions: perl
1

A shell function that uses perl to convert Unix time to human-readable text.

Put it in your ~/.bashrc or equivalent.

Tested on Linux / Solaris with the Bourne shell, bash and zsh, using perl 5.6 and higher.

(Does not require GNU date like some other commands.)
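
Usage mirrors the GNU date version above (the timestamp is only an example):

utime 1257512290

This prints the corresponding local time as returned by perl's localtime.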

sudo dscl localhost -append /Local/Default/Groups/admin GroupMembership username
2009-09-03 04:40:10
User: kulor
Functions: sudo
0

Adding users to groups on OS X is not a straightforward process; you need to use the built-in Directory Service command-line utility (dscl).
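
To verify the change, the group record can be read back (a sketch; it uses dscl's read verb with the same Directory Service path as above):

dscl localhost -read /Local/Default/Groups/admin GroupMembership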

find ~/Library/Application\ Support/Firefox/ -type f -name "*.sqlite" -exec sqlite3 {} VACUUM \;
find ~/.mozilla/firefox/ -type f -name "*.sqlite" -exec sqlite3 {} VACUUM \;
find /backup/directory -name "FILENAME_*" -mtime +15 -exec rm -vf {} \;
rm -vf /backup/directory/**/FILENAME_*(m+15)
find /backup/directory -name "FILENAME_*" -mtime +15 | xargs rm -vf
pgrep -u `id -u` firefox-bin || find ~/.mozilla/firefox -name '*.sqlite'|(while read -e f; do echo 'vacuum;'|sqlite3 "$f" ; done)
2009-08-22 10:36:05
User: kamathln
Functions: echo find read
11

SQLite databases collect cruft over time, which can be cleaned out with the 'vacuum;' command. This command vacuums all the sqlite files belonging to the Firefox profile of the user you are logged in as. It has to be run while Firefox is not running; otherwise it exits, displaying the PID of the running firefox-bin process.

:%s/^V^M//g
2009-08-19 11:59:22
User: slim
-1

Here ^V means pressing Ctrl-V, so ^V^M inserts a literal carriage return (^M) into the pattern.

Converts a DOS file to Unix line endings by removing the carriage return (0x0D) characters.
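
For the same conversion outside vim, a non-interactive sketch (the file names are placeholders):

tr -d '\r' < dosfile > unixfile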

watch -t -n1 "date +%T|figlet"
2009-06-21 01:02:37
User: dennisw
Functions: watch
43

This command displays a clock on your terminal which updates the time every second. Press Ctrl-C to exit.

A couple of variants:

A little bit bigger text:

watch -t -n1 "date +%T|figlet -f big"

You can try other figlet fonts, too.

Big sideways characters:

watch -n 1 -t '/usr/games/banner -w 30 $(date +%M:%S)'

This requires a particular version of banner and a 40-line terminal or you can adjust the width ("30" here).

while (( $i != 0 )) { smbstatus; sleep 5; clear }
2009-06-03 13:26:30
Functions: clear sleep
Tags: unix samba zsh
-4

Shows smbstatus output every 5 seconds (for monitoring SMB access). Uses zsh's short while-loop syntax.
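
A more portable sketch of the same monitor for bash or other shells (assuming smbstatus is in the PATH; watch -n 5 smbstatus also works where watch is installed):

while true; do clear; smbstatus; sleep 5; done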

lynx -dump randomfunfacts.com | grep -A 3 U | sed 1D
2009-05-05 07:52:10
User: xizdaqrian
Functions: grep sed
0

This is a working, though probably clumsy, version of the script submitted by felix001. It works on Ubuntu and Cygwin. It would be great as a bash function defined in .bashrc, and would also work as a script placed in the path.

sudo vi /etc/fstab; Go//smb-share/gino /mnt/place smbfs defaults,username=gino,password=pass 0 0<esc>:wq; mount //smb-share/gino
2009-04-02 16:04:35
User: GinoMan2440
Functions: mount sudo vi
4

The middle part, between the two semicolons, is the sequence of vi commands that appends the share definition as the last line of the file; <esc> means literally pressing the Escape key. You need the smbfs package installed for the mount to work. I use it to access my iTunes music on my Mac from my Linux PCs with Amarok so I can play the music anywhere in the house; among other things, it lets you access the files on that share from your computer any time you're on that network.

for i in $(seq 1 11) 13 14 15 16; do man iso-8859-$i; done
2009-03-31 19:40:15
User: penpen
Functions: man seq
Tags: Linux unix
-2

Depending on the installation, only some of these man pages are installed. 12 is left out on purpose because ISO/IEC 8859-12 does not exist. To also access the man pages that are not installed, use Opera (or any other browser that supports all the character sets involved) to display the online versions hosted at kernel.org:

for i in $(seq 1 11) 13 14 15 16; do opera http://www.kernel.org/doc/man-pages/online/pages/man7/iso_8859-$i.7.html; done
wget --server-response --spider http://www.example.com/
2009-03-31 18:49:14
User: penpen
Functions: wget
4

Let me suggest using wget for obtaining only the HTTP header as a last resort, because it generates considerable textual overhead. The first ellipsis of the sample output stands for:

Spider mode enabled. Check if remote file exists.

--2009-03-31 20:42:46-- http://www.example.com/

Resolving www.example.com... 208.77.188.166

Connecting to www.example.com|208.77.188.166|:80... connected.

HTTP request sent, awaiting response...

and the second one stands for:

Length: 438 [text/html]

Remote file exists and could contain further links,

but recursion is disabled -- not retrieving.
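
If curl happens to be installed (an assumed alternative, not part of the original tip), it fetches just the headers with far less noise; -I sends a HEAD request and -s suppresses the progress meter:

curl -sI http://www.example.com/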

lynx -dump -head http://www.example.com/
2009-03-31 18:41:36
User: penpen
-1

Without the -dump option, the header is displayed inside lynx. You can also use w3m; the command is then:

w3m -dump_head http://www.example.com/