Commands using sed (1,319)


  • 4
    sed '1!G;h;$!d'
    grep · 2009-02-16 21:05:54 24
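    This is the classic sed idiom for printing a file in reverse line order (like tac): G appends the hold space to every line but the first, h copies the result back into the hold space, and $!d suppresses output until the last line. A minimal usage sketch:

        printf 'one\ntwo\nthree\n' | sed '1!G;h;$!d'    # prints: three, two, one
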
  • If run in bash, this will display all the executables in your current $PATH. A variant that copes with directories containing spaces is sketched below.


    4
    ls `echo $PATH | sed 's/:/ /g'`
    archlich · 2009-03-09 19:01:41 8
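    A sketch of a variant that copes with directories containing spaces, splitting $PATH on colons via IFS instead of sed (bash syntax; run in a subshell so IFS is not changed globally):

        (IFS=:; for d in $PATH; do ls "$d"; done)
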
  • Some commands (such as netcat) take a port option, but how do you know which ports are unused? An ss-based sketch follows the command.


    4
    netstat -atn | awk ' /tcp/ {printf("%s\n",substr($4,index($4,":")+1,length($4) )) }' | sed -e "s/://g" | sort -rnu | awk '{array [$1] = $1} END {i=32768; again=1; while (again == 1) {if (array[i] == i) {i=i+1} else {print i; again=0}}}'
    mpb · 2009-03-27 20:38:43 8
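    A rough equivalent sketch using ss from iproute2 instead of netstat, following the same idea (collect the TCP ports in use, then print the first free port at or above 32768):

        ss -atn | awk 'NR>1 {sub(/.*:/, "", $4); used[$4]=1} END {p=32768; while (used[p]) p++; print p}'
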
  • $url can be a working copy or a URL to an SVN repository; $revision is any valid revision number for that branch. A usage sketch with placeholder values follows the command.


    4
    svn log $url -r $revision -v | egrep " [RAMD] \/" | sed s/^.....//
    nitehawk · 2009-04-27 19:50:06 8
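    A usage sketch with placeholder values (the repository URL and revision number here are purely illustrative):

        url=http://svn.example.com/repo/trunk; revision=1234
        svn log $url -r $revision -v | egrep " [RAMD] \/" | sed s/^.....//
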
  • Shows installed linux headers, image, or modules packages, excluding those for the current kernel. How the sed script breaks down: /^ii/!d keeps only installed packages; /'"$(uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d avoids the current kernel; s/^[^ ]* [^ ]* \([^ ]*\).*/\1/ keeps only the package names; /[0-9]/!d drops entries without a version number. An expanded step-by-step sketch follows the command.


    4
    dpkg -l 'linux-*' | sed '/^ii/!d;/'"$(uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d;s/^[^ ]* [^ ]* \([^ ]*\).*/\1/;/[0-9]/!d'
    plasticdoc · 2009-06-19 10:23:38 8
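    The same pipeline spread out step by step, as a sketch (behaviour should match the one-liner above; the kernel version in the comment is illustrative):

        current=$(uname -r | sed 's/\(.*\)-\([^0-9]\+\)/\1/')   # e.g. 5.4.0-42 from 5.4.0-42-generic
        # /^ii/!d keeps installed packages; /$current/d drops the running kernel;
        # the substitution keeps only the package name; /[0-9]/!d drops unversioned names
        dpkg -l 'linux-*' | sed '/^ii/!d' | sed "/$current/d" |
          sed 's/^[^ ]* [^ ]* \([^ ]*\).*/\1/' | sed '/[0-9]/!d'
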
  • A handy way to get all the unique links from inside all the HTML files in a directory. Can be useful in scripts etc.


    4
    find . -name '*.html' -print0| xargs -0 -L1 cat |sed "s/[\"\<\>' \t\(\);]/\n/g" |grep "http://" |sort -u
    jamespitt · 2009-07-14 07:00:15 10
  • Limited, but useful construct to extract text embedded in XML tags. This will only work if the <foo>bar</foo> element is all on one line. If nobody posts an alternative for the multiline sed version, I'll figure it out later... A quick usage sketch follows the command.


    4
    sed -n 's/.*<foo>\([^<]*\)<\/foo>.*/\1/p'
    recursiverse · 2009-07-23 07:59:30 3
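    A quick usage sketch (the input line is just an illustration):

        echo '<item><foo>bar</foo></item>' | sed -n 's/.*<foo>\([^<]*\)<\/foo>.*/\1/p'    # prints: bar
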
  • Display the amount of memory used by all the httpd processes. Great in case you are being Slashdotted! An awk variant is sketched below.


    4
    ps -o rss -C httpd | tail -n +2 | (sed 's/^/x+=/'; echo x) | bc
    ricardoarguello · 2009-07-31 15:15:08 9
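    A roughly equivalent sketch that sums the RSS column with awk instead of bc; "-o rss=" suppresses the header, so the tail is not needed (values are in kilobytes):

        ps -o rss= -C httpd | awk '{sum += $1} END {print sum}'
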
  • I use this as a csh-style alias: alias authplain "printf '\!:1\0\!:1\0\!:2' | mmencode | tr -d '\n' | sed 's/^/AUTH PLAIN /'". Then # authplain someuser@somedomain.com secretpassword prints: AUTH PLAIN c29tZXVzZXJAc29tZWRvbWFpbi5jb20Ac29tZXVzZXJAc29tZWRvbWFpbi5jb20Ac2VjcmV0cGFzc3dvcmQ= . A bash equivalent is sketched below.


    4
    printf '\!:1\0\!:1\0\!:2' | mmencode | tr -d '\n' | sed 's/^/AUTH PLAIN /'
    vwal · 2009-08-04 05:04:50 4
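    A sketch of the same idea as a bash function, with coreutils base64 standing in for mmencode (the function name and the credentials in the usage line are purely illustrative):

        authplain() { printf '%s\0%s\0%s' "$1" "$1" "$2" | base64 | tr -d '\n' | sed 's/^/AUTH PLAIN /'; echo; }
        authplain someuser@somedomain.com secretpassword
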
  • Will return your internal IP address.


    4
    ifconfig $devices | grep "inet addr" | sed 's/.*inet addr:\([0-9\.]*\).*/\1/g'
    matthewbauer · 2009-08-06 21:43:22 3
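    A modern sketch using iproute2 instead of ifconfig (the interface name eth0 is an assumption; adjust to your device):

        ip -4 addr show eth0 | sed -n 's/.*inet \([0-9.]*\).*/\1/p'
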
  • In Bash, when defining an alias, one usually loses the completion related to the command used in that alias (that completion is usually defined in /etc/bash_completion using the complete builtin). It's easy to reuse the work done for that completion in order to have smart completion for our alias. That's what this command line does (it's only an example, but it should be easy to adapt). Note 1: you can use the given command line in a loop "for old in apt-get apt-cache" if you want to define aliases like that for many commands (see the sketch below). Note 2: you can put the output of the command directly in your .bashrc file (after the ". /etc/bash_completion") to always have the alias and its completion.


    4
    old='apt-get'; new="su-${old}"; command="sudo ${old}"; alias "${new}=${command}"; $( complete | sed -n "s/${old}$/${new}/p" ); alias ${new}; complete -p ${new}
    Josay · 2009-08-10 00:15:05 4
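    The loop form mentioned in Note 1, as a sketch:

        for old in apt-get apt-cache; do
          new="su-${old}"; command="sudo ${old}"
          alias "${new}=${command}"
          $( complete | sed -n "s/${old}$/${new}/p" )
          alias ${new}; complete -p ${new}
        done
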
  • The coolest way I've found to back up a WordPress MySQL database with encryption, using local variables created directly from the wp-config.php file so that you don't have to type the credentials, which would allow someone sniffing your terminal or viewing your shell history to see them. I use a variation of this for my servers, which have hundreds of WordPress installs and databases, by using a find command to locate each wp-config.php file and passing the results to my function (a sketch of that variant follows the command).


    4
    eval $(sed -n "s/^d[^D]*DB_\([NUPH]\)[ASO].*',[^']*'\([^']*\)'.*/_\1='\2'/p" wp-config.php) && mysqldump --opt --add-drop-table -u$_U -p$_P -h$_H $_N | gpg -er AskApache >`date +%m%d%y-%H%M.$_N.sqls`
    AskApache · 2009-08-18 07:03:08 7
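    A sketch of the find-driven variant described above; wpbackup is a hypothetical wrapper name, /var/www an assumed web root, and the gpg recipient should be replaced with your own key:

        wpbackup() { eval $(sed -n "s/^d[^D]*DB_\([NUPH]\)[ASO].*',[^']*'\([^']*\)'.*/_\1='\2'/p" "$1") && mysqldump --opt --add-drop-table -u$_U -p$_P -h$_H $_N | gpg -er AskApache > "$(date +%m%d%y-%H%M).$_N.sqls"; }
        find /var/www -name wp-config.php -print0 | while IFS= read -r -d '' f; do wpbackup "$f"; done
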
  • Scrape the National Weather Service.


    4
    weather() { lynx -dump "http://mobile.weather.gov/port_zh.php?inputstring=$*" | sed 's/^ *//;/ror has occ/q;2h;/__/!{x;s/\n.*//;x;H;d};x;s/\n/ -- /;q';}
    zude · 2009-10-17 23:47:47 3

  • 4
    geoip(){curl -s "http://www.geody.com/geoip.php?ip=${1}" | sed '/^IP:/!d;s/<[^>][^>]*>//g' ;}
    twfcc · 2009-10-19 05:48:07 17

  • 4
    mpg123 `curl -s http://blip.fm/all | sed -e 's#"#\n#g' | grep mp3$ | xargs`
    torrid · 2009-11-07 14:48:01 7

  • 4
    curl --silent search.twitter.com | sed -n '/div id=\"hot\"/,/div/p' | awk -F\> '{print $2}' | awk -F\< '{print $1}' | sed '/^$/d'
    allrightname · 2009-12-21 21:29:34 3
  • xargs deals badly with special characters (such as space, ' and "). To see the problem, try this: touch important_file; touch 'not important_file'; ls not* | xargs rm. GNU Parallel (https://savannah.nongnu.org/projects/parallel/) does not have this problem. A null-delimited xargs alternative is sketched below.


    4
    grep -rl oldstring . | parallel sed -i -e 's/oldstring/newstring/'
    unixmonkey8046 · 2010-01-28 08:44:16 6
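    If GNU Parallel is not installed, a null-delimited sketch that also copes with spaces and quotes in file names (assumes GNU grep and GNU sed):

        grep -rlZ oldstring . | xargs -0 sed -i -e 's/oldstring/newstring/'
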
  • Will track your mouse and save it to a file. You can use gnuplot to graph it: gnuplot -persist <(echo "unset key;unset border;unset yzeroaxis;unset xtics;unset ytics;unset ztics;plot './mouse-tracking' with points lt 1 pt 6 ps variable")


    4
    while true; do xdotool getmouselocation | sed 's/x:\(.*\) y:\(.*\) screen:.*/\1, \2/' >> ./mouse-tracking; sleep 10; done
    matthewbauer · 2010-02-27 04:00:13 5
  • Here "^M" is NOT "SHIFT+6" and "M". Type CTRL+V+M to get it instead. Its shortest and easy. And its sed!, which is available by default in all linux flavours.. no need to install extra tools like fromdos.


    4
    sed -i 's/^M//' file
    sata · 2010-03-25 19:34:08 7
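    If you would rather not type the literal control character, GNU sed also understands the \r escape; a sketch that strips carriage returns at end of line:

        sed -i 's/\r$//' file
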
  • Ever gone to a site that has an MP3 embedded in a pesky Flash player, but no download link? Well, this one-liner will yank the names of those tunes straight out of Firefox's cache in a nice, easy-to-read list. What you do with them after that is *ahem* no concern of mine. ;)


    4
    for i in `ls ~/.mozilla/firefox/*/Cache`; do file $i | grep -i mpeg | awk '{print $1}' | sed s/.$//; done
    BoxingOctopus · 2010-04-11 23:14:18 7

  • 4
    sed '/^$/d'
    er0k · 2010-04-18 00:52:00 5
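    This deletes empty lines from its input. A minimal usage sketch:

        printf 'a\n\nb\n' | sed '/^$/d'    # prints: a, b (the blank line is removed)
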
  • in "a.html", find all images referred as relative URI in an HTML file by "src" attribute of "img" element, replace them with "data:" URI. This useful to create single HTML file holding all images in it, as a replacement of the IE-created .mht file format. The generated HTML works fine on every other browser except IE, as well as many HTML editors like kompozer, while the .mht format only works for IE, but not for every other browser. Compare to the KDE's own single-file-web-page format "war" format, which only opens correctly on KDE, the HTML file with "data:" URI is more universally supported. The above command have many bugs. My commandline-fu is too limited to fix them: 1. it assume all URLs are relative URIs, thus works in this case: <img src="images/logo.png"/> but does not work in this case: <img src="http://www.my_web_site.com/images/logo.png" /> This may not be a bug, as full URIs perhaps should be ignored in many use cases. 2. it only work for images whoes file name suffix is one of .jpg, .gif, .png, albeit images with .jpeg suffix and those without extension names at all are legal to HTML. 3. image file name is not allowed to contain "(" even though frequently used, as in "(copy of) my car.jpg". Besides, neither single nor double quotes are allowed. 4. There is infact a big flaw in this, file names are actually used as regular expression to be replaced with base64 encoded content. This cause the script to fail in many other cases. Example: 'D:\images\logo.png', where backward slash have different meaning in regular expression. I don't know how to fix this. I don't know any command that can do full text (no regular expression) replacement the way basic editors like gedit does. 5. The original a.html are not preserved, so a user should make a copy first in case things go wrong.


    4
    grep -ioE "(url\(|src=)['\"]?[^)'\"]*" a.html | grep -ioE "[^\"'(]*.(jpg|png|gif)" | while read l ; do sed -i "s>$l>data:image/${l/[^.]*./};base64,`openssl enc -base64 -in $l| tr -d '\n'`>" a.html ; done;
    zhangweiwu · 2010-05-05 14:07:51 13
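    Per bug 5 above, a trivial precaution before running the command:

        cp a.html a.html.bak
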
  • This will remove all installed kernels on your Debian-based install, except the one you're currently using. A simulate-first sketch follows the command. From: http://tuxtweaks.com/2009/12/remove-old-kernels-in-ubuntu/comment-page-1/#comment-1590


    4
    dpkg -l 'linux-*' | sed '/^ii/!d;/'"$(uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d;s/^[^ ]* [^ ]* \([^ ]*\).*/\1/;/[0-9]/!d' | xargs sudo apt-get -y purge
    mitzip · 2010-06-10 20:33:32 5
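    Since this purges packages, a cautious sketch is to simulate first and review the list (the same pipeline, with apt-get's -s flag and without -y):

        dpkg -l 'linux-*' | sed '/^ii/!d;/'"$(uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d;s/^[^ ]* [^ ]* \([^ ]*\).*/\1/;/[0-9]/!d' | xargs sudo apt-get -s purge
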

  • 4
    curl -Is slashdot.org | sed -n '5p' | sed 's/^X-//'
    noqqe · 2010-09-26 12:09:35 3
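    This prints the fifth response header with its X- prefix stripped; at the time this was posted, Slashdot sent joke X-Fry/X-Bender headers, which is presumably what it targets. A sketch that matches by header name instead of by position (the header names are an assumption; requires GNU sed for the \| alternation):

        curl -Is slashdot.org | sed -n 's/^X-\(Fry\|Bender\): //p'
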
  • Each file in the current folder is uploaded to imageshack.us. If the folder contains other file types, change "for files in *" to "for files in *.jpg" (to upload ONLY .jpg files); additionally you can try "for files in *.jpg *.png" (results may vary). The output URL is enclosed in BB image tags for use in a forum. The .jpg-only variant is sketched below.


    4
    imageshack() { for files in *; do curl -H Expect: -F fileupload="@$files" -F xml=yes -# "http://www.imageshack.us/index.php" | grep image_link | sed -e 's/<image_link>/[IMG]/g' -e 's/<\/image_link>/[\/IMG]/g'; done; }
    operatinghazard · 2010-10-01 06:50:04 6
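    The .jpg-only variant described above, as a sketch:

        imageshackjpg() { for files in *.jpg; do curl -H Expect: -F fileupload="@$files" -F xml=yes -# "http://www.imageshack.us/index.php" | grep image_link | sed -e 's/<image_link>/[IMG]/g' -e 's/<\/image_link>/[\/IMG]/g'; done; }
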