Commands using sed (1,319)


  • 4
    sed '1!G;h;$!d'
    grep · 2009-02-16 21:05:54 24
  • If run in bash, this lists the contents of every directory in your current $PATH, i.e. the executables available to you.


    4
    ls `echo $PATH | sed 's/:/ /g'`
    archlich · 2009-03-09 19:01:41 8
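    A rough equivalent that lists one directory per line instead (a sketch; output obviously depends on your own $PATH):
      echo "$PATH" | tr ':' '\n' | while read -r dir; do ls "$dir"; done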
  • Some commands (such as netcat) take a port option, but how do you know which ports are unused? This scans the ports netstat reports as in use and prints the first free one at or above 32768.


    4
    netstat -atn | awk ' /tcp/ {printf("%s\n",substr($4,index($4,":")+1,length($4) )) }' | sed -e "s/://g" | sort -rnu | awk '{array [$1] = $1} END {i=32768; again=1; while (again == 1) {if (array[i] == i) {i=i+1} else {print i; again=0}}}'
    mpb · 2009-03-27 20:38:43 8
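    A compact awk variant of the same idea, wrapped in a function for reuse (a sketch; the function name is made up, and the nc -l -p flags assume a traditional netcat):
      freeport() { netstat -atn | awk '/tcp/ {sub(/.*:/,"",$4); used[$4]} END {p=32768; while (p in used) p++; print p}'; }
      nc -l -p "$(freeport)"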
  • Here $url can be a working copy or the URL of an svn repository, and $revision any valid revision number for that branch; the output lists the paths added, modified, replaced or deleted in that revision.


    4
    svn log $url -r $revision -v | egrep " [RAMD] \/" | sed s/^.....//
    nitehawk · 2009-04-27 19:50:06 8
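    Example invocation (the repository URL and revision number are placeholders):
      url=http://svn.example.com/repo/trunk; revision=1234
      svn log $url -r $revision -v | egrep " [RAMD] \/" | sed s/^.....//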
  • Shows installed linux headers, images and modules, excluding the kernel you are currently running. Piece by piece: /^ii/!d keeps only installed packages; /'"$(uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d skips the running kernel; s/^[^ ]* [^ ]* \([^ ]*\).*/\1/ keeps only the package name; /[0-9]/!d drops entries without a version number.


    4
    dpkg -l 'linux-*' | sed '/^ii/!d;/'"$(uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d;s/^[^ ]* [^ ]* \([^ ]*\).*/\1/;/[0-9]/!d'
    plasticdoc · 2009-06-19 10:23:38 8
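    The inner substitution just trims the flavour suffix from the running kernel string, so every flavour of that version is protected. For example (the kernel version shown is hypothetical):
      uname -r                                    # e.g. 5.15.0-91-generic
      uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/"  # prints 5.15.0-91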
  • A handy way to pull all the unique http:// links out of every .html file under the current directory. Useful in scripts.


    4
    find . -name '*.html' -print0| xargs -0 -L1 cat |sed "s/[\"\<\>' \t\(\);]/\n/g" |grep "http://" |sort -u
    jamespitt · 2009-07-14 07:00:15 10
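    With GNU grep, -o can pull the URLs out a bit more directly (a sketch of the same idea, not a drop-in replacement; like the original it only catches http:// links):
      find . -name '*.html' -print0 | xargs -0 grep -hoE "http://[^\"'<>[:space:]();]+" | sort -u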
  • A limited but useful construct to extract text embedded in XML tags: it prints whatever sits between <foo> and </foo>. This only works if the element's content is all on one line. If nobody posts an alternative for the multiline sed version, I'll figure it out later...


    4
    sed -n 's/.*<foo>\([^<]*\)<\/foo>.*/\1/p'
    recursiverse · 2009-07-23 07:59:30 3
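    Quick test with a made-up one-line document:
      echo '<a><foo>bar</foo></a>' | sed -n 's/.*<foo>\([^<]*\)<\/foo>.*/\1/p'   # prints: bar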
  • Displays the total resident memory (RSS, in kilobytes) used by all the httpd processes. Great in case you are being Slashdotted!


    4
    ps -o rss -C httpd | tail -n +2 | (sed 's/^/x+=/'; echo x) | bc
    ricardoarguello · 2009-07-31 15:15:08 9
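    The same sum in a single awk, shown purely for comparison (a sketch; -C httpd assumes the daemon is actually named httpd, it is apache2 on Debian-style systems):
      ps -o rss= -C httpd | awk '{sum += $1} END {print sum}'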
  • Builds the base64 string for an SMTP "AUTH PLAIN" command. I use this as a (t)csh alias:
      alias authplain "printf '\!:1\0\!:1\0\!:2' | mmencode | tr -d '\n' | sed 's/^/AUTH PLAIN /'"
    then:
      # authplain someuser@somedomain.com secretpassword
      AUTH PLAIN c29tZXVzZXJAc29tZWRvbWFpbi5jb20Ac29tZXVzZXJAc29tZWRvbWFpbi5jb20Ac2VjcmV0cGFzc3dvcmQ=
      #


    4
    printf '\!:1\0\!:1\0\!:2' | mmencode | tr -d '\n' | sed 's/^/AUTH PLAIN /'
    vwal · 2009-08-04 05:04:50 4
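    If mmencode is not installed, coreutils base64 does the same job from a plain bash prompt (a sketch; the user and password are the same placeholders as above):
      printf '%s\0%s\0%s' 'someuser@somedomain.com' 'someuser@somedomain.com' 'secretpassword' | base64 | tr -d '\n' | sed 's/^/AUTH PLAIN /'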
  • Returns your internal IP address; set $devices to the interface you care about (e.g. eth0).


    4
    ifconfig $devices | grep "inet addr" | sed 's/.*inet addr:\([0-9\.]*\).*/\1/g'
    matthewbauer · 2009-08-06 21:43:22 3
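    On systems where ifconfig is no longer installed, iproute2 gives the same answer (a sketch; eth0 is a placeholder for your interface name):
      ip -4 -o addr show eth0 | awk '{print $4}' | cut -d/ -f1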
  • In bash, defining an alias usually loses the programmable completion attached to the underlying command (that completion is normally defined in /etc/bash_completion using the complete builtin). It is easy to reuse that work so the alias gets the same smart completion, which is what this command line does (it is only an example, but it is easy to adapt).
    Note 1: you can run it in a loop ("for old in apt-get apt-cache") if you want to define such aliases for many commands.
    Note 2: you can put the output of the command directly in your .bashrc (after the ". /etc/bash_completion" line) so the alias and its completion are always available.


    4
    old='apt-get'; new="su-${old}"; command="sudo ${old}"; alias "${new}=${command}"; $( complete | sed -n "s/${old}$/${new}/p" ); alias ${new}; complete -p ${new}
    Josay · 2009-08-10 00:15:05 4
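    On a Debian-style setup the generated completion line typically ends up looking something like the following (the _apt_get function name is illustrative; it depends on what your /etc/bash_completion actually defines):
      complete -F _apt_get su-apt-get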
  • The coolest way I've found to back up a WordPress MySQL database with encryption, pulling the credentials straight out of wp-config.php into local variables so you never have to type them (typing them would let anyone sniffing your terminal, or reading your shell history, see your info). I use a variation of this on servers with hundreds of WordPress installs and databases, using a find for wp-config.php piped through xargs into a function like this one.


    4
    eval $(sed -n "s/^d[^D]*DB_\([NUPH]\)[ASO].*',[^']*'\([^']*\)'.*/_\1='\2'/p" wp-config.php) && mysqldump --opt --add-drop-table -u$_U -p$_P -h$_H $_N | gpg -er AskApache >`date +%m%d%y-%H%M.$_N.sqls`
    AskApache · 2009-08-18 07:03:08 7
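    Restoring such a dump is the reverse pipe (a sketch; the file name is a placeholder following the date pattern above, and it assumes the _U/_P/_H/_N variables from the eval are still set):
      gpg -d 081509-0330.$_N.sqls | mysql -u$_U -p$_P -h$_H $_N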
  • Scrapes the National Weather Service's mobile site for a quick report.


    4
    weather() { lynx -dump "http://mobile.weather.gov/port_zh.php?inputstring=$*" | sed 's/^ *//;/ror has occ/q;2h;/__/!{x;s/\n.*//;x;H;d};x;s/\n/ -- /;q';}
    zude · 2009-10-17 23:47:47 3
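    Example call (the zip code is just an illustration; a "city, state" string should also work as the inputstring):
      weather 90210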

  • 4
    geoip(){ curl -s "http://www.geody.com/geoip.php?ip=${1}" | sed '/^IP:/!d;s/<[^>][^>]*>//g' ;}
    twfcc · 2009-10-19 05:48:07 17

  • 4
    mpg123 `curl -s http://blip.fm/all | sed -e 's#"#\n#g' | grep mp3$ | xargs`
    torrid · 2009-11-07 14:48:01 7

  • 4
    curl --silent search.twitter.com | sed -n '/div id=\"hot\"/,/div/p' | awk -F\> '{print $2}' | awk -F\< '{print $1}' | sed '/^$/d'
    allrightname · 2009-12-21 21:29:34 3
  • xargs deals badly with special characters (such as space, ' and "). To see the problem, try this:
      touch important_file
      touch 'not important_file'
      ls not* | xargs rm
    GNU Parallel (https://savannah.nongnu.org/projects/parallel/) does not have this problem.


    4
    grep -rl oldstring . | parallel sed -i -e 's/oldstring/newstring/'
    unixmonkey8046 · 2010-01-28 08:44:16 6
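    If GNU parallel is not installed, NUL separation buys the same safety with plain xargs (a sketch; assumes GNU grep, xargs and sed):
      grep -rlZ oldstring . | xargs -0 sed -i -e 's/oldstring/newstring/'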
  • Tracks your mouse position and appends it to a file every 10 seconds. You can graph the result with gnuplot:
      gnuplot -persist <(echo "unset key;unset border;unset yzeroaxis;unset xtics;unset ytics;unset ztics;plot './mouse-tracking' with points lt 1 pt 6 ps variable")


    4
    while true; do xdotool getmouselocation | sed 's/x:\(.*\) y:\(.*\) screen:.*/\1, \2/' >> ./mouse-tracking; sleep 10; done
    matthewbauer · 2010-02-27 04:00:13 5
  • Strips carriage returns (DOS line endings) in place. Here "^M" is NOT Shift+6 followed by M; press Ctrl-V then Ctrl-M to enter the literal character. It's the shortest and easiest way, and it's sed, which is available by default in all Linux flavours, so there is no need to install extra tools like fromdos.


    4
    sed -i 's/^M//' file
    sata · 2010-03-25 19:34:08 7
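    With GNU sed you can avoid typing the literal control character at all; \r matches the same carriage return (a sketch targeting the trailing CR of DOS line endings):
      sed -i 's/\r$//' file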
  • Ever gone to a site that has an MP3 embedded into a pesky flash player, but no download link? Well, this one-liner will yank the names of those tunes straight out of Firefox's cache in a nice, easy to read list. What you do with them after that is *ahem* no concern of mine. ;)


    4
    for i in `ls ~/.mozilla/firefox/*/Cache`; do file $i | grep -i mpeg | awk '{print $1}' | sed s/.$//; done
    BoxingOctopus · 2010-04-11 23:14:18 7

  • 4
    sed '/^$/d'
    er0k · 2010-04-18 00:52:00 5
  • In "a.html", finds all images referred to by a relative URI in the "src" attribute of an "img" element and replaces them with "data:" URIs. This is useful for creating a single HTML file holding all of its images, as a replacement for the IE-created .mht format. The generated HTML works fine in every browser except IE, as well as in many HTML editors like KompoZer, whereas .mht only works in IE. Compared to KDE's own single-file-web-page format ("war"), which only opens correctly in KDE, an HTML file with "data:" URIs is far more universally supported. The command has many bugs; my command-line fu is too limited to fix them:
    1. It assumes all URLs are relative URIs, so it works for <img src="images/logo.png"/> but not for <img src="http://www.my_web_site.com/images/logo.png" />. This may not be a bug, since full URIs should perhaps be ignored in many use cases.
    2. It only works for images whose file name ends in .jpg, .gif or .png, even though images with a .jpeg suffix, or with no extension at all, are legal in HTML.
    3. Image file names may not contain "(", although it is frequently used, as in "(copy of) my car.jpg". Neither single nor double quotes are allowed either.
    4. There is in fact a big flaw here: file names are used as regular expressions to be replaced with the base64-encoded content, which makes the script fail in many other cases, for example 'D:\images\logo.png', where the backslash has a different meaning in a regular expression. I don't know how to fix this; I don't know of any command that does full-text (non-regex) replacement the way basic editors like gedit do.
    5. The original a.html is not preserved, so make a copy first in case things go wrong.


    4
    grep -ioE "(url\(|src=)['\"]?[^)'\"]*" a.html | grep -ioE "[^\"'(]*.(jpg|png|gif)" | while read l ; do sed -i "s>$l>data:image/${l/[^.]*./};base64,`openssl enc -base64 -in $l| tr -d '\n'`>" a.html ; done;
    zhangweiwu · 2010-05-05 14:07:51 13
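    Given bug 5 above, a backup first is cheap insurance, and GNU coreutils base64 -w0 produces the same encoding as the openssl step (a sketch; the image path is the one from the example above):
      cp a.html a.html.orig
      base64 -w0 images/logo.png    # equivalent encoding to: openssl enc -base64 -in images/logo.png | tr -d '\n'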
  • This will remove all installed kernels on your Debian-based install except the one you're currently using. From: http://tuxtweaks.com/2009/12/remove-old-kernels-in-ubuntu/comment-page-1/#comment-1590


    4
    dpkg -l 'linux-*' | sed '/^ii/!d;/'"$(uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d;s/^[^ ]* [^ ]* \([^ ]*\).*/\1/;/[0-9]/!d' | xargs sudo apt-get -y purge
    mitzip · 2010-06-10 20:33:32 5
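    Since this purges packages, a dry run first is prudent; apt-get's -s flag only simulates:
      dpkg -l 'linux-*' | sed '/^ii/!d;/'"$(uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d;s/^[^ ]* [^ ]* \([^ ]*\).*/\1/;/[0-9]/!d' | xargs sudo apt-get -s purge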

  • 4
    curl -Is slashdot.org | sed -n '5p' | sed 's/^X-//'
    noqqe · 2010-09-26 12:09:35 3
  • Uploads each file in the current folder to imageshack.us. If the folder contains other file types, change "for files in *" to "for files in *.jpg" (to upload ONLY .jpg files); you can also try "for files in *.jpg *.png" (results may vary). The output URL is wrapped in BB [IMG] tags for use in a forum.


    4
    imageshack() { for files in *; do curl -H Expect: -F fileupload="@$files" -F xml=yes -# "http://www.imageshack.us/index.php" | grep image_link | sed -e 's/<image_link>/[IMG]/g' -e 's/<\/image_link>/[\/IMG]/g'; done; }
    operatinghazard · 2010-10-01 06:50:04 6
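    The jpg-only variant mentioned above, spelled out (a sketch; the function name is made up):
      imageshackjpg() { for files in *.jpg; do curl -H Expect: -F fileupload="@$files" -F xml=yes -# "http://www.imageshack.us/index.php" | grep image_link | sed -e 's/<image_link>/[IMG]/g' -e 's/<\/image_link>/[\/IMG]/g'; done; }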