Commands using sed (1,319)

  • Alternative using 'host'.


    -1
    host -t a dartsclink.com | sed 's/.*has address //'
    dartsclink · 2009-08-14 16:11:18 13
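    A similar lookup can be sketched with dig, assuming it is installed; +short prints just the A record(s) with no sed needed:
    dig +short dartsclink.com A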
  • Do this with caution.


    -1
    for kern in $(grep "initrd " /boot/grub/grub.conf|grep -v ^#|cut -f 2- -d-|sed -e 's/\.img//g'); do mkinitrd -v -f /boot/initrd-$kern.img $kern; done
    oernii2 · 2009-08-19 09:53:29 502
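    One way to be cautious is to preview what the loop would run before letting it touch /boot; a dry-run sketch of the same loop with mkinitrd prefixed by echo:
    for kern in $(grep "initrd " /boot/grub/grub.conf|grep -v ^#|cut -f 2- -d-|sed -e 's/\.img//g'); do echo mkinitrd -v -f /boot/initrd-$kern.img $kern; done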
  • Reverse DNS lookups from a file containing a list of IPs; here the file is called lookups.txt.


    -1
    sed 's/\([0-9]*\)\.\([0-9]*\)\.\([0-9]*\)\.\([0-9]*\).in-addr.arpa domain name pointer\(.*\)\./\4.\3.\2.\1\5/' lookups.txt
    hemanth · 2009-08-22 09:37:20 4
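    lookups.txt is assumed to hold raw 'host' reverse-lookup output; something along these lines could generate it from a plain list of addresses (ips.txt is a hypothetical file name):
    while read ip; do host "$ip"; done < ips.txt > lookups.txt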

  • -1
    for dnsREC in $(curl -s http://www.iana.org/assignments/dns-parameters |grep -Eo ^[A-Z\.]+\ |sed 's/TYPE//'); do echo -n "$dnsREC " && dig +short $dnsREC IANA.ORG; done
    commandlinefu · 2009-09-01 03:11:18 3
  • Searches for the argument in PATH; accepts grep expressions. Without arguments, it lists all binaries found in PATH.


    -1
    function sepath { echo $PATH |tr ":" "\n" |sort -u |while read L ; do cd "$L" 2>/dev/null && find . \( ! -name . -prune \) \( -type f -o -type l \) 2>/dev/null |sed "s@^\./@@" |egrep -i "${*}" |sed "s@^@$L/@" ; done ; }
    mobidyc · 2009-09-11 15:03:22 5
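    A possible invocation, assuming the function has been defined in the current shell; this lists every binary on PATH whose name starts with "vi" (case-insensitive):
    sepath '^vi'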
  • Uses curl to download the membership page of the US Congress, sed to strip the HTML, and perl to print the lines starting with two tabs (the lines listing a representative).


    -1
    curl "http://www.house.gov/house/MemberWWW.shtml" 2>/dev/null | sed -e :a -e 's/<[^>]*>//g;/</N;//ba' | perl -nle 's/^\t\t(.*$)/ $1/ and print;'
    drewk · 2009-09-24 23:37:36 15
  • From the Hong Kong Observatory WAP site ;)


    -1
    wget -q -O - 'http://wap.weather.gov.hk/' | sed -r 's/<[^>]+>//g;/^UV/q' | grep -v '^$'
    twfcc · 2009-09-25 02:21:05 6
  • "get Hong Kong weather infomation from HK Observatory From Hong Kong Observatory wap site ;)" other one showed alot of blank lines for me Show Sample Output


    -1
    wget -q -O - 'http://wap.weather.gov.hk/' | sed -r 's/<[^>]+>//g;/^UV/q' | tail -n4
    dakunesu · 2009-09-25 02:36:46 3
  • You'll run into trouble if you have files w/ missing newlines at the end. I tried to use PAGER='sed \$q' git blame and even PAGER='sed \$q' git -p blame to force a newline at the end, but as soon as the output is redirected, git seems to ignore the pager.


    -1
    git ls-files | while read i; do git blame $i | sed -e 's/^[^(]*(//' -e 's/^\([^[:digit:]]*\)[[:space:]]\+[[:digit:]].*/\1/'; done | sort | uniq -ic | sort -nr
    pipping · 2009-10-25 09:40:01 4
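    A rough alternative that counts lines per author via blame's porcelain output (a sketch; assumes a reasonably recent git and file names without spaces):
    git ls-files | xargs -n1 git blame --line-porcelain | grep '^author ' | sort | uniq -c | sort -nr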

  • -1
    getdji (){local url sedcmd;url='http://finance.yahoo.com/q?d=t&s=^DJI';sedcmd='/(DJI:.*)/,/Day.*/!d;s/^ *//g;';sedcmd="$sedcmd/Change:/s/Down / -/;/Change:/s/Up / +/;";sedcmd="$sedcmd/Open:/s//& /";lynx -dump "$url" | sed "$sedcmd"; }
    twfcc · 2009-10-26 09:00:18 5
  • Useful when you need to write e.g. an INSERT for a table with a large number of columns. This command will retrieve the column names and comma-separate them ready for INSERT INTO(...), removing the last comma.


    -1
    mysql -u <user> --password=<password> -e "SHOW COLUMNS FROM <table>" <database> | awk '{print $1}' | tr "\n" "," | sed 's/,$//g'
    maxmanders · 2009-10-29 13:42:17 3
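    A sketch of putting the result to use when building the INSERT itself; user, secret, mytable and mydb are placeholders here, and -N is added so the header row is not picked up as a column name:
    cols=$(mysql -N -u user --password=secret -e "SHOW COLUMNS FROM mytable" mydb | awk '{print $1}' | tr "\n" "," | sed 's/,$//')
    echo "INSERT INTO mytable ($cols) VALUES (...);"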
  • This command uses the top-voted "Get your external IP" command from commandlinefu.com to get your external IP address. Use this and you will always be using the community's favourite command. This is a tongue-in-cheek entry and not recommended for actual usage.


    -1
    eval $(curl -s http://www.commandlinefu.com/commands/matching/external/ZXh0ZXJuYWw=/sort-by-votes/plaintext|sed -n '/^# Get your external IP address$/{n;p;q}')
    jgc · 2009-11-04 16:58:31 6
  • Same thing as the other one.


    -1
    ipcalc $(ifconfig eth0 | grep "inet addr:" | cut -d':' -f2,4 | sed 's/ *Bcast:/\//g') | awk '/Network/ { print $2 } '
    solarislackware · 2009-12-05 15:00:32 3
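    On systems where ifconfig is missing, something along these lines may work instead, assuming iproute2's ip and the same ipcalc:
    ipcalc $(ip -o -f inet addr show eth0 | awk '{print $4}') | awk '/Network/ { print $2 }'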

  • -1
    sed -e 's/{"url":/\n&/g' ~/.mozilla/firefox/*/sessionstore.js | cut -d\" -f4
    cfajohnson · 2009-12-10 04:31:31 4
  • There are too many options for numbering lines; my curiosity forced me to do it using only sed. Maybe useful... or not... :-S


    -1
    sed '/./=' infile | sed '/^/N; s/\n/ /'
    glaudiston · 2009-12-10 16:24:56 6
  • Prints out the contents of a file with line numbers. This version prints a number for every line and separates the numbering from the line with a tab.


    -1
    sed = <file> | sed 'N;s/\n/\t/'
    jgc · 2009-12-11 14:39:14 3
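    For comparison, coreutils can produce much the same output directly; both of these tab-separate the numbers from the lines:
    cat -n <file>
    nl -ba <file>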
  • I needed to add a line to my crontab from within a script and didn't want to have to write my own temporary file. You may find you need to reload crond after this to make the change take effect, e.g.:

    if [ -x /sbin/service ]
    then
        /sbin/service crond reload
    else
        CRON_PID=`ps -furoot | awk '/[^a-z]cron(d)?$/{print $2}'`
        if [ -n "$CRON_PID" ]
        then
            kill -HUP $CRON_PID
        fi
    fi

    The reason I had CRON_HOUR and CRON_MINS instead of numbers is that I wanted to generate a random time between midnight and 6 AM to run the job, which I did with:

    CRON_HOUR=`/usr/bin/perl -e 'printf "%02d\n", int(rand(6))'`
    CRON_MINS=`/usr/bin/perl -e 'printf "%02d\n", int(rand(60));'`


    -1
    crontab -l | sed -e '$G;$s-$-'"$CRON_MINS $CRON_HOUR"' * * * /usr/bin/command >/dev/null 2>&1-' | crontab -
    JohnGH · 2010-01-07 11:00:05 6
  • Combines a few repetitive tasks when compiling source code. Especially useful when a hyphen in a file name breaks tab completion: 1) wget source.tar.gz 2) tar xzvf source.tar.gz 3) cd source 4) ls. From there you can run ./configure, make, etc.


    -1
    wtzc () { wget "$@"; foo=`echo "$@" | sed 's:.*/::'`; tar xzvf $foo; blah=`echo $foo | sed 's:,*/::'`; bar=`echo $blah | sed -e 's/\(.*\)\..*/\1/' -e 's/\(.*\)\..*/\1/'`; cd $bar; ls; }
    oshazard · 2010-01-17 11:25:47 3
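    A hypothetical invocation (the URL is only an example); the function downloads, extracts and changes into the unpacked directory in one step:
    wtzc http://example.com/project-1.2.3.tar.gz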

  • -1
    watch -n 7 -d 'uptime | sed -r "s/.*users?, //"'
    matthewbauer · 2010-01-17 18:45:52 3

  • -1
    find . -maxdepth 1 -type f| xargs sha1sum | sed 's/^\(\w*\)\s*\(.*\)/\2 \1/' | while read LINE; do mv $LINE; done
    foremire · 2010-01-25 20:21:01 11
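    A whitespace-safe sketch of the same idea (renames each regular file in the current directory to its bare SHA-1 hash):
    find . -maxdepth 1 -type f -exec sh -c 'mv "$1" "$(sha1sum "$1" | cut -d " " -f1)"' _ {} \;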
  • xargs deals badly with special characters (such as space, ' and "). To see the problem, try this:

    touch important_file
    touch 'not important_file'
    ls not* | xargs rm

    Parallel (https://savannah.nongnu.org/projects/parallel/) does not have this problem.


    -1
    ls -t1 | sed 1d | parallel -X rm
    unixmonkey8046 · 2010-01-28 12:28:18 3
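    With GNU xargs, a similar effect is possible without parallel (a sketch; copes with spaces in names but still breaks on embedded newlines):
    ls -t1 | sed 1d | xargs -d '\n' rm --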
  • Returns the temperature in Fahrenheit for a location (New York City in the example). Uses a Google API.


    -1
    curl -s "http://www.google.com/ig/api?weather=New%20York" | sed 's|.*<temp_f data="\([^"]*\)"/>.*|\1|'
    matthewbauer · 2010-02-08 23:06:48 5
  • Get the Google Reader unread count from the command line. You'll have to define your auth token with $auth, or use:

    curl -s -H "Authorization: GoogleLogin auth=$(curl -sd "Email=$email&Passwd=$password&service=reader" https://www.google.com/accounts/ClientLogin | grep Auth | sed 's/Auth=\(.*\)/\1/')" "http://www.google.com/reader/api/0/unread-count?output=json" | tr '{' '\n' | sed 's/.*"count":\([0-9]*\),".*/\1/' | grep -E ^[0-9]+$ | tr '\n' '+' | sed 's/\(.*\)+/\1\n/' | bc


    -1
    curl -s -H "Authorization: GoogleLogin auth=$auth" "http://www.google.com/reader/api/0/unread-count?output=json" | tr '{' '\n' | sed 's/.*"count":\([0-9]*\),".*/\1/' | grep -E ^[0-9]+$ | tr '\n' '+' | sed 's/\(.*\)+/\1\n/' | bc
    matthewbauer · 2010-02-11 00:42:57 7

  • -1
    sed 's/pattern/^[[1m&^[[0m/g'
    rmcb · 2010-02-12 14:05:34 3
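    The ^[ sequences above are literal escape characters (typed with Ctrl-V Esc), so every match is printed in bold on the terminal. A copy-and-paste-friendly sketch of the same idea using tput:
    sed "s/pattern/$(tput bold)&$(tput sgr0)/g"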
  • You WILL have problems if files share the same name. Use cases: consolidating a music library or unifying photos (especially if your camera separates images by date). After running the command and verifying that there were no name clashes, you can use

    ls -d */ | sed -e 's/^/\"/g' -e 's/$/\"/g' | xargs rm -r

    to remove the now-empty subdirectories.


    -1
    ls -d */* | sed -e 's/^/\"/g' -e 's/$/\"/g' | xargs mv -t $(pwd)
    leovailati · 2010-03-01 23:43:26 6
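    A variant that copes with arbitrary file names, assuming GNU find and mv; the second command then deletes the emptied directories:
    find . -mindepth 2 -type f -exec mv -t . {} +
    find . -mindepth 1 -type d -empty -delete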