
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get a minimum of 3 and 10 votes, so that only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).


News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Psst. Open beta.

Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:

  • The open beta is running a copy of the database that will not carry over to the final version. Don't post anything you don't mind losing.
  • If you wish to use your user account, you will probably need to reset your password.
Your feedback is appreciated via the form on the beta page. Thanks! -Jon & CLFU Team

Terminal - Commands using sed - 1,152 results
curl --silent search.twitter.com | sed -n '/div id=\"hot\"/,/div/p' | awk -F\> '{print $2}' | awk -F\< '{print $1}' | sed '/^$/d'
/sbin/ip -f inet addr | sed -rn 's/.*inet ([^ ]+).*(eth[[:digit:]]*(:[[:digit:]]+)?)/\2 \1/p' | column -t
find ./wp-content/themes/rotce2009/ -name '*.php' -type f | xargs sed -i 's/<? /<?php /g'
echo $LS_COLORS | sed 's/:/\n/g' | awk -F= '!/^$/{printf("%s \x1b[%smdemo\x1b[0m\n",$0,$2)}'
2009-12-15 01:17:46
User: bones7456
Functions: awk echo sed
8

This shows all LS_COLORS entries, each followed by a demo rendered in that entry's color.
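
Each entry is a type=attributes pair whose value is an ANSI SGR sequence, so you can also preview any single entry by hand; for example, with the common default di=01;34 (directories in bold blue):

printf '\x1b[01;34mdemo\x1b[0m\n'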

perl -lne 'print for /url":"\K[^"]+/g' $(ls -t ~/.mozilla/firefox/*/sessionstore.js | sed q)
2009-12-14 00:51:54
User: sputnick
Functions: ls perl sed
0

If you want all the URLs from all the sessions, you can use:

perl -lne 'print for /url":"\K[^"]+/g' ~/.mozilla/firefox/*/sessionstore.js

Thanks to tybalt89 (for the idea of the "for" statement).

For Perl purists, there are the JSON and File::Slurp modules, but they're not installed by default.
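
As a sketch of that module-based approach (assuming the classic windows/tabs/entries layout of sessionstore.js, and with both modules installed):

perl -MJSON -MFile::Slurp -le 'my $s = decode_json(read_file(shift)); print $_->{url} for map { @{$_->{entries}} } map { @{$_->{tabs}} } @{$s->{windows}}' $(ls -t ~/.mozilla/firefox/*/sessionstore.js | sed q)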

sed = <file> | sed 'N;s/\n/\t/'
2009-12-11 14:39:14
User: jgc
Functions: sed
Tags: sed
-1

Prints out the contents of a file with line numbers.

This version prints a number for every line, and separates the number from the line with a tab.
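
For comparison, coreutils can number lines directly; nl -ba numbers every line (including blanks, tab-separated by default) and cat -n does much the same:

nl -ba <file>
cat -n <file>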

sed '/./=' infile | sed '/./N; s/\n/ /'
2009-12-10 16:24:56
User: glaudiston
Functions: sed
-1

There are too many options for numbering lines; my curiosity forced me to do it using only sed.

Maybe useful... or not... :-S

unzip -p some.docx word/document.xml | sed -e 's/<[^>]\{1,\}>//g; s/[^[:print:]]\{1,\}//g'
sed -e 's/{"url":/\n&/g' ~/.mozilla/firefox/*/sessionstore.js | cut -d\" -f4
grep -oP '"url":"\K[^"]+' $(ls -t ~/.mozilla/firefox/*/sessionstore.js | sed q)
2009-12-09 20:34:32
User: sputnick
Functions: grep ls sed
0

Requires "grep -P" (PCRE).

If you don't have grep -P, use this instead:

grep -Eo '"url":"[^"]+' $(ls -t ~/.mozilla/firefox/*/sessionstore.js | sed q) | cut -d'"' -f4
wget -q http://dynamic.xkcd.com/comic/random/ -O-| sed -n '/<img src="http:\/\/imgs.xkcd.com\/comics/{s/.*\(http:.*\)" t.*/\1/;p}' | awk '{system ("wget -q " $1 " -O- | display -title $(basename " $1") -write /tmp/$(basename " $1")");}'
2009-12-09 13:41:25
User: laugg
Functions: awk sed wget
Tags: sed awk wget comic
0

Only the ImageMagick package needs to be installed.

Displays an xkcd comic with its title and saves it in the /tmp directory.

If you prefer to view the newest xkcd, use this command:

wget -q http://xkcd.com/ -O-| sed -n '/<img src="http:\/\/imgs.xkcd.com\/comics/{s/.*\(http:.*\)" t.*/\1/;p}' | awk '{system ("wget -q " $1 " -O- | display -title $(basename " $1") -write /tmp/$(basename " $1")");}'
find . -type f -exec sed -i s/oldstring/newstring/g {} +
2009-12-09 00:46:13
User: SlimG
Functions: find sed
Tags: sed find
15

This command finds all files in the current directory and its subdirectories, and replaces all occurrences of "oldstring" in every file with "newstring".
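
If you only want to rewrite files that actually contain the string (leaving every other file's timestamp untouched), a GNU grep/sed variant along these lines works:

grep -rlZ oldstring . | xargs -0 sed -i 's/oldstring/newstring/g'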

grep -Ri searchterm ~/.purple/logs/* | sed -e 's/<[^>]*>//g'
2009-12-07 19:38:18
User: Nostoc
Functions: grep sed
Tags: pidgin logs
2

Searches through Pidgin conversation logs for "searchterm" and outputs the matching lines with their HTML tags stripped. The sed command is optional if your logs are stored in plain-text format.

git clean -n | sed 's/Would remove //; /Would not remove/d;' | xargs mv -t stuff/
ipcalc $(ifconfig eth0 | grep "inet addr:" | cut -d':' -f2,4 | sed 's/.+Bcast:/\//g') | awk '/Network/ { print $2 } '
wget -O - -q http://www.azlyrics.com/lyrics/abba/takeachanceonme.html | sed -e 's/[cC]hance/dump/g' > ~/tdom.htm && firefox ~/tdom.htm
2009-12-04 22:56:00
User: tighe
Functions: sed wget
0

ABBA would be more entertaining if they sang this.

tr -dc 'a-zA-Z0-9' < /dev/urandom | fold -w 10 | sed 1q
cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 10 | sed 1q
cat ~/.viminfo | sed -n '/^:[0-9]\+,\([0-9]\+\|\$\)s/p'
2009-11-29 01:54:57
User: jyf
Functions: cat sed
1

I wanted to count how many regex commands I have used in Vim over a long period of time, so I made a directory on my svn host and post records to it.

Of course I don't want to post manually, so I wrote a script to do that, and this is the core of it.
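
To actually get the count, the same filter just needs to be piped into wc:

sed -n '/^:[0-9]\+,\([0-9]\+\|\$\)s/p' ~/.viminfo | wc -l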

egrep "<link>|<title>" recenttracks.rss | awk 'ORS=NR%2?" ":"\n"' | awk -F "</title>" '{print $2, $1}' | sed -e 's/\<link\>/\<li\>\<a href\=\"/' -e 's/\<\/link\>/\">/' -e 's/\<title\>//' -e 's/$/\<\/a\>\<\/li\>/g' -e '1,1d' -e 's/^[ \t]*//'
2009-11-28 13:19:05
User: HerbT
Functions: awk egrep sed
3

Quick and kludgy RSS parser for the recent-tracks RSS feed from last.fm. Extracts the artist and the track link.
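
For illustration, given a feed item containing <title>Artist - Track</title> followed by <link>http://www.last.fm/...</link>, each output line comes out roughly as:

<li><a href="http://www.last.fm/..."> Artist - Track</a></li>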

printf $(echo -n $1 | sed 's/\\/\\\\/g;s/\(%\)\([0-9a-fA-F][0-9a-fA-F]\)/\\x\2/g')
2009-11-25 04:27:39
User: infinull
Functions: echo printf sed
2

My version uses printf and command substitution ($()) instead of echo -e and xargs; it's a few characters shorter, but there's no real substantive difference.

It also supports lowercase hex letters, and a backslash (\) will make it through unescaped.
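
Wrapped in a function for illustration (the name urldecode is mine, not part of the original; note that printf will still choke on any stray % that isn't part of a %XX pair):

urldecode() { printf "$(echo -n "$1" | sed 's/\\/\\\\/g;s/\(%\)\([0-9a-fA-F][0-9a-fA-F]\)/\\x\2/g')"; }

urldecode 'hello%20world'   # prints: hello world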

ls -F | sed -n 's/@$//p'
awk '/q=/{print $11}' /var/log/httpd/access_log.4 | awk -F 'q=' '{print $2}' | sed 's/+/ /g;s/%22/"/g;s/q=//' | cut -d "&" -f 1
cat /var/log/httpd/access_log | grep q= | awk '{print $11}' | awk -F 'q=' '{print $2}' | sed 's/+/ /g;s/%22/"/g;s/q=//' | cut -d "&" -f 1 | mail [email protected] -s "[your-site] search strings for `date`"
2009-11-22 03:03:06
User: isma
Functions: awk cat grep sed strings
-2

It's not a big one-liner, and it *may not* work for everybody; I guess it depends on the details of the access_log configuration in your httpd.conf. I use it as a prerotate command for logrotate in the httpd section, so it executes before access_log rotation, every day at midnight.
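
For reference, that logrotate hookup might look something like this sketch (paths and schedule assumed to match your httpd setup):

/var/log/httpd/access_log {
    daily
    prerotate
        cat /var/log/httpd/access_log | grep q= | awk '{print $11}' | awk -F 'q=' '{print $2}' | sed 's/+/ /g;s/%22/"/g;s/q=//' | cut -d "&" -f 1 | mail [email protected] -s "[your-site] search strings for `date`"
    endscript
}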

for f in $(ls *.xml.skippy); do mv $f `echo $f | sed 's|.skippy||'`; done
2009-11-19 21:36:26
User: argherna
Functions: ls mv sed
Tags: sed ls mv for
-2

For this example, all files in the current directory that end in '.xml.skippy' will have the '.skippy' removed from their names.
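
A glob plus Bash parameter expansion achieves the same rename without parsing ls or spawning sed, and it copes with spaces in filenames:

for f in *.xml.skippy; do mv "$f" "${f%.skippy}"; done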