Commands using sed (1,319)


  • 4
    sed '1!G;h;$!d'
    grep · 2009-02-16 21:05:54 24
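    No description was given; this is the classic sed emulation of tac, printing input in reverse line order (1!G appends the hold space to every line but the first, h saves the growing result, $!d suppresses output until the last line). A quick check:
      printf 'one\ntwo\nthree\n' | sed '1!G;h;$!d'
      # three
      # two
      # one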
  • If run in bash, this will display all the executables in your current $PATH.


    4
    ls `echo $PATH | sed 's/:/ /g'`
    archlich · 2009-03-09 19:01:41 8
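    The sed stage simply turns the colon-separated $PATH into a space-separated list that the backticks splice into ls. The substitution in isolation:
      echo "$PATH" | sed 's/:/ /g'
      # e.g. /usr/local/bin /usr/bin /bin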
  • Some commands (such as netcat) take a port option, but how can you know which ports are unused? This collects every local TCP port in use, then walks upward from 32768 and prints the first free one.


    4
    netstat -atn | awk ' /tcp/ {printf("%s\n",substr($4,index($4,":")+1,length($4) )) }' | sed -e "s/://g" | sort -rnu | awk '{array [$1] = $1} END {i=32768; again=1; while (again == 1) {if (array[i] == i) {i=i+1} else {print i; again=0}}}'
    mpb · 2009-03-27 20:38:43 8
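    On systems where netstat has been replaced by iproute2, ss can feed the same gap-finding awk (a sketch, assuming ss is installed; the trailing awk from the original is unchanged):
      ss -atn | awk 'NR>1 {n=split($4,a,":"); print a[n]}' | sort -rnu
      # lists the local TCP ports currently in use, ready for the free-port search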
  • $url can be a working copy or the URL of an svn repository; $revision is any valid revision number for that branch.


    4
    svn log $url -r $revision -v | egrep " [RAMD] \/" | sed s/^.....//
    nitehawk · 2009-04-27 19:50:06 8
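    Usage, with hypothetical values for the repository URL and revision:
      url=http://svn.example.com/repo/trunk
      revision=1234
      svn log $url -r $revision -v | egrep " [RAMD] \/" | sed s/^.....//
      # prints the path of each file Added, Modified, Replaced or Deleted in that revision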
  • Will show installed linux headers, image, or modules. Breaking down the sed expressions:
      installed packages only: /^ii/!d
      avoiding the current kernel: /'"$(uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d
      only package names: s/^[^ ]* [^ ]* \([^ ]*\).*/\1/
      avoiding entries without a version number: /[0-9]/!d


    4
    dpkg -l 'linux-*' | sed '/^ii/!d;/'"$(uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d;s/^[^ ]* [^ ]* \([^ ]*\).*/\1/;/[0-9]/!d'
    plasticdoc · 2009-06-19 10:23:38 8
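    The inner sed strips the flavour suffix from the running kernel so that every package matching the current version is skipped; assuming a kernel like 2.6.32-24-generic:
      uname -r                                      # 2.6.32-24-generic
      uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/"    # 2.6.32-24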
  • Just a handy way to get all the unique links from inside all the HTML files in a directory. Can be handy in scripts etc.


    4
    find . -name '*.html' -print0| xargs -0 -L1 cat |sed "s/[\"\<\>' \t\(\);]/\n/g" |grep "http://" |sort -u
    jamespitt · 2009-07-14 07:00:15 10
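    With GNU grep, -o can pull the URLs out directly and also catch https links; a simpler alternative sketch:
      find . -name '*.html' -print0 | xargs -0 grep -hoE 'https?://[^"'\''<> ]+' | sort -u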
  • Limited, but useful construct to extract text embedded in XML tags. This will only work if the <foo>bar</foo> element is all on one line. If nobody posts an alternative for the multiline sed version, I'll figure it out later...


    4
    sed -n 's/.*<foo>\([^<]*\)<\/foo>.*/\1/p'
    recursiverse · 2009-07-23 07:59:30 3
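    For example:
      echo '<item><foo>bar</foo></item>' | sed -n 's/.*<foo>\([^<]*\)<\/foo>.*/\1/p'
      # bar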
  • Display the amount of memory used by all the httpd processes. Great in case you are being Slashdotted!


    4
    ps -o rss -C httpd | tail -n +2 | (sed 's/^/x+=/'; echo x) | bc
    ricardoarguello · 2009-07-31 15:15:08 9
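    The sed/echo pair rewrites each RSS value as a bc accumulation (x+=NNN) and the final x prints the total. The trick in isolation:
      printf '10\n20\n30\n' | (sed 's/^/x+=/'; echo x) | bc
      # 60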
  • I use this as an alias (csh syntax):
      alias authplain "printf '\!:1\0\!:1\0\!:2' | mmencode | tr -d '\n' | sed 's/^/AUTH PLAIN /'"
    then:
      # authplain someuser@somedomain.com secretpassword
      AUTH PLAIN c29tZXVzZXJAc29tZWRvbWFpbi5jb20Ac29tZXVzZXJAc29tZWRvbWFpbi5jb20Ac2VjcmV0cGFzc3dvcmQ=


    4
    printf '\!:1\0\!:1\0\!:2' | mmencode | tr -d '\n' | sed 's/^/AUTH PLAIN /'
    vwal · 2009-08-04 05:04:50 4
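    mmencode comes from the old metamail package; where it is missing, base64 from coreutils does the same job. A sketch with hypothetical credentials in shell variables:
      user='someuser@somedomain.com'; pass='secretpassword'
      printf '%s\0%s\0%s' "$user" "$user" "$pass" | base64 | tr -d '\n' | sed 's/^/AUTH PLAIN /'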
  • Will return your internal IP address.


    4
    ifconfig $devices | grep "inet addr" | sed 's/.*inet addr:\([0-9\.]*\).*/\1/g'
    matthewbauer · 2009-08-06 21:43:22 3
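    ifconfig's output format varies between systems and the tool is deprecated on modern Linux; an iproute2 sketch that yields the same addresses:
      ip -4 -o addr show | awk '{print $4}' | cut -d/ -f1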
  • In bash, when defining an alias, one usually loses the completion related to the command used in that alias (that completion is usually defined in /etc/bash_completion using the complete builtin). It's easy to reuse the work done for that completion in order to have smart completion for the alias as well. That's what this command line does (it's only an example, but it should be easy to adapt).
    Note 1: you can run the command in a loop, "for old in apt-get apt-cache", if you want to define aliases like that for many commands.
    Note 2: you can put the output of the command directly in your .bashrc file (after the ". /etc/bash_completion") to always have the alias and its completion.


    4
    old='apt-get'; new="su-${old}"; command="sudo ${old}"; alias "${new}=${command}"; $( complete | sed -n "s/${old}$/${new}/p" ); alias ${new}; complete -p ${new}
    Josay · 2009-08-10 00:15:05 4
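    The key step is the complete | sed substitution: it finds the completion registered for the old command and re-registers it under the new name. Assuming bash_completion had defined something like complete -F _apt_get apt-get, then:
      complete | sed -n "s/apt-get$/su-apt-get/p"
      # complete -F _apt_get su-apt-get   (the function name _apt_get is hypothetical)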
  • The coolest way I've found to back up a wordpress mysql database using encryption, with local variables created directly from the wp-config.php file so that you don't have to type them (typing them would allow someone sniffing your terminal or viewing your shell history to see your info). I use a variation of this for my servers that have hundreds of wordpress installs and databases, using a find command for the wp-config.php files and passing the results through xargs to my function.


    4
    eval $(sed -n "s/^d[^D]*DB_\([NUPH]\)[ASO].*',[^']*'\([^']*\)'.*/_\1='\2'/p" wp-config.php) && mysqldump --opt --add-drop-table -u$_U -p$_P -h$_H $_N | gpg -er AskApache >`date +%m%d%y-%H%M.$_N.sqls`
    AskApache · 2009-08-18 07:03:08 7
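    The sed turns the define('DB_*', ...) lines of wp-config.php into shell assignments for eval. With a hypothetical config containing define('DB_NAME', 'blog'); define('DB_USER', 'wp'); define('DB_PASSWORD', 'secret'); define('DB_HOST', 'localhost'); (one per line), the extraction step prints:
      sed -n "s/^d[^D]*DB_\([NUPH]\)[ASO].*',[^']*'\([^']*\)'.*/_\1='\2'/p" wp-config.php
      # _N='blog'
      # _U='wp'
      # _P='secret'
      # _H='localhost'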
  • Scrape the National Weather Service.


    4
    weather() { lynx -dump "http://mobile.weather.gov/port_zh.php?inputstring=$*" | sed 's/^ *//;/ror has occ/q;2h;/__/!{x;s/\n.*//;x;H;d};x;s/\n/ -- /;q';}
    zude · 2009-10-17 23:47:47 4

  • 4
    geoip(){ curl -s "http://www.geody.com/geoip.php?ip=${1}" | sed '/^IP:/!d;s/<[^>][^>]*>//g' ;}
    twfcc · 2009-10-19 05:48:07 17
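    No description was given; the sed keeps only the line starting with IP: and strips the HTML tags around it. The tag-stripping in isolation, with made-up values:
      echo 'IP: 8.8.8.8 <br>Location: <b>US</b>' | sed '/^IP:/!d;s/<[^>][^>]*>//g'
      # IP: 8.8.8.8 Location: US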

  • 4
    mpg123 `curl -s http://blip.fm/all | sed -e 's#"#\n#g' | grep mp3$ | xargs`
    torrid · 2009-11-07 14:48:01 7

  • 4
    curl --silent search.twitter.com | sed -n '/div id=\"hot\"/,/div/p' | awk -F\> '{print $2}' | awk -F\< '{print $1}' | sed '/^$/d'
    allrightname · 2009-12-21 21:29:34 3
  • xargs deals badly with special characters (such as space, ' and "). To see the problem, try this:
      touch important_file
      touch 'not important_file'
      ls not* | xargs rm
    Parallel (https://savannah.nongnu.org/projects/parallel/) does not have this problem.


    4
    grep -rl oldstring . | parallel sed -i -e 's/oldstring/newstring/'
    unixmonkey8046 · 2010-01-28 08:44:16 6
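    Plain xargs can also be made safe by switching both ends to NUL delimiters (GNU grep and xargs); a hedged alternative if parallel is not installed:
      grep -rlZ oldstring . | xargs -0 sed -i -e 's/oldstring/newstring/'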
  • Will track your mouse and save it to a file. You can use gnuplot to graph it:
      gnuplot -persist <(echo "unset key;unset border;unset yzeroaxis;unset xtics;unset ytics;unset ztics;plot './mouse-tracking' with points lt 1 pt 6 ps variable")


    4
    while true; do xdotool getmouselocation | sed 's/x:\(.*\) y:\(.*\) screen:.*/\1, \2/' >> ./mouse-tracking; sleep 10; done
    matthewbauer · 2010-02-27 04:00:13 5
  • Here "^M" is NOT "SHIFT+6" and "M". Type CTRL+V+M to get it instead. Its shortest and easy. And its sed!, which is available by default in all linux flavours.. no need to install extra tools like fromdos.


    4
    sed -i 's/^M//' file
    sata · 2010-03-25 19:34:08 7
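    With GNU sed the literal control character can be avoided entirely, since \r is understood inside the expression:
      sed -i 's/\r$//' file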
  • Ever gone to a site that has an MP3 embedded in a pesky flash player, but no download link? Well, this one-liner will yank the names of those tunes straight out of FF's cache in a nice, easy-to-read list. What you do with them after that is *ahem* no concern of mine. ;)


    4
    for i in ~/.mozilla/firefox/*/Cache/*; do file "$i" | grep -i mpeg | awk '{print $1}' | sed 's/.$//'; done
    BoxingOctopus · 2010-04-11 23:14:18 7

  • 4
    sed '/^$/d'
    er0k · 2010-04-18 00:52:00 5
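    No description was given; this deletes all empty lines. For example:
      printf 'one\n\ntwo\n' | sed '/^$/d'
      # one
      # two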
  • in "a.html", find all images referred as relative URI in an HTML file by "src" attribute of "img" element, replace them with "data:" URI. This useful to create single HTML file holding all images in it, as a replacement of the IE-created .mht file format. The generated HTML works fine on every other browser except IE, as well as many HTML editors like kompozer, while the .mht format only works for IE, but not for every other browser. Compare to the KDE's own single-file-web-page format "war" format, which only opens correctly on KDE, the HTML file with "data:" URI is more universally supported. The above command have many bugs. My commandline-fu is too limited to fix them: 1. it assume all URLs are relative URIs, thus works in this case: <img src="images/logo.png"/> but does not work in this case: <img src="http://www.my_web_site.com/images/logo.png" /> This may not be a bug, as full URIs perhaps should be ignored in many use cases. 2. it only work for images whoes file name suffix is one of .jpg, .gif, .png, albeit images with .jpeg suffix and those without extension names at all are legal to HTML. 3. image file name is not allowed to contain "(" even though frequently used, as in "(copy of) my car.jpg". Besides, neither single nor double quotes are allowed. 4. There is infact a big flaw in this, file names are actually used as regular expression to be replaced with base64 encoded content. This cause the script to fail in many other cases. Example: 'D:\images\logo.png', where backward slash have different meaning in regular expression. I don't know how to fix this. I don't know any command that can do full text (no regular expression) replacement the way basic editors like gedit does. 5. The original a.html are not preserved, so a user should make a copy first in case things go wrong.


    4
    grep -ioE "(url\(|src=)['\"]?[^)'\"]*" a.html | grep -ioE "[^\"'(]*.(jpg|png|gif)" | while read l ; do sed -i "s>$l>data:image/${l/[^.]*./};base64,`openssl enc -base64 -in $l| tr -d '\n'`>" a.html ; done;
    zhangweiwu · 2010-05-05 14:07:51 13
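    The heart of the replacement is building a data: URI from an image file; for a single hypothetical logo.png:
      echo "data:image/png;base64,$(openssl enc -base64 -in logo.png | tr -d '\n')"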
  • This will remove all installed kernels on your Debian-based install, except the one you're currently using. From: http://tuxtweaks.com/2009/12/remove-old-kernels-in-ubuntu/comment-page-1/#comment-1590


    4
    dpkg -l 'linux-*' | sed '/^ii/!d;/'"$(uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d;s/^[^ ]* [^ ]* \([^ ]*\).*/\1/;/[0-9]/!d' | xargs sudo apt-get -y purge
    mitzip · 2010-06-10 20:33:32 5
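    Prudence suggests previewing the package list before purging; inserting echo turns it into a dry run:
      dpkg -l 'linux-*' | sed '/^ii/!d;/'"$(uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d;s/^[^ ]* [^ ]* \([^ ]*\).*/\1/;/[0-9]/!d' | xargs echo sudo apt-get -y purge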

  • 4
    curl -Is slashdot.org | sed -n '5p' | sed 's/^X-//'
    noqqe · 2010-09-26 12:09:35 3
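    No description was given; this prints the fifth HTTP response header with its X- prefix stripped. Slashdot served joke X-Fry/X-Bender quote headers at the time, which line 5 presumably targeted.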
  • Each file in the current folder is uploaded to imageshack.us. If the folder contains other file types, change
      for files in *
    to
      for files in *.jpg
    to upload ONLY .jpg files. Additionally you can try (results may vary):
      for files in *.jpg *.png
    The output URL is enclosed in BB image tags for use in a forum.


    4
    imageshack() { for files in *; do curl -H Expect: -F fileupload="@$files" -F xml=yes -# "http://www.imageshack.us/index.php" | grep image_link | sed -e 's/<image_link>/[IMG]/g' -e 's/<\/image_link>/[\/IMG]/g'; done; }
    operatinghazard · 2010-10-01 06:50:04 6
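    Usage, once the function is defined (the imageshack.us XML upload endpoint is historical and may no longer work; the path below is illustrative):
      cd ~/Pictures/vacation && imageshack
      # [IMG]http://img.imageshack.us/.../example.jpg[/IMG] for each uploaded file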