Commands using sed (1,295)

  • Usage: translate <phrase> <source-language> <output-language>. Example: translate hello en es. See http://en.wikipedia.org/wiki/List_of_ISO_639-1_codes for a list of language codes.


    65
    translate(){ wget -qO- "http://ajax.googleapis.com/ajax/services/language/translate?v=1.0&q=$1&langpair=$2|${3:-en}" | sed 's/.*"translatedText":"\([^"]*\)".*}/\1\n/'; }
    matthewbauer · 2010-03-08 03:15:48 9
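    The sed simply plucks the translatedText field out of the JSON that the (since-retired) Google AJAX API returned. You can watch it work on a made-up response body:

        echo '{"responseData": {"translatedText":"hola"}, "responseStatus": 200}' | sed 's/.*"translatedText":"\([^"]*\)".*}/\1\n/'
        # hola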
  • Prints a graphical directory tree from your current directory.


    59
    ls -R | grep ":$" | sed -e 's/:$//' -e 's/[^-][^\/]*\//--/g' -e 's/^/ /' -e 's/-/|/'
    unixmonkey842 · 2009-02-15 20:43:21 3
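    It works by keeping only the directory headers from ls -R (the lines ending in a colon) and rewriting each path component as punctuation. For a hypothetical tree containing ./src/lib it prints roughly:

         .
         |-src
         |---lib

    Where available, tree(1) produces a similar listing with less trickery.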
  • This is the result of a several-week venture without X. I found myself totally happy without X (and by extension without Flash) and was able to do just about anything but watch YouTube videos... so this is the solution I came up with for that. I am sure this can be done better, but this does indeed work... and tends to work far better than YouTube's ghetto proprietary Flash player ;-) Replace $i with any YouTube ID you want and this will scrape the site for the _real_ URL to the full-quality .FLV file on YouTube's servers, then hand that over to mplayer (or vlc or whatever you want) to be streamed. In some browsers you can replace $i with just a %, or put this in a shell script so all YouTube IDs can be handed directly off to your media player of choice for true streaming without the need for Flash or a downloader like clive. (I do however fully recommend clive if you wish to archive videos instead of streaming them.) If any interest is shown I would be more than happy to provide similar commands for other sites; most streaming Flash players use similar logic to YouTube's. Edit: 05/03/2011 - Updated the line to work with current YouTube. It could be a lot prettier, but I will probably follow up with another update when I figure out how to get rid of that pesky grep. Sed should take that syntax... but it doesn't. Original (no longer working) command:

        mplayer -fs $(echo "http://youtube.com/get_video.php?$(curl -s $youtube_url | sed -n "/watch_fullscreen/s;.*\(video_id.\+\)&title.*;\1;p")")


    57
    i="8uyxVmdaJ-w";mplayer -fs $(curl -s "http://www.youtube.com/get_video_info?&video_id=$i" | echo -e $(sed 's/%/\\x/g;s/.*\(v[0-9]\.lscache.*\)/http:\/\/\1/g') | grep -oP '^[^|,]*')
    lrvick · 2009-03-09 03:57:44 15
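    As the description suggests, the line drops straight into a function so any video ID can be handed to your player (note that this 2011-era get_video_info endpoint has long since been removed by YouTube, so treat it as a historical sketch):

        yt(){ mplayer -fs $(curl -s "http://www.youtube.com/get_video_info?&video_id=$1" | echo -e $(sed 's/%/\\x/g;s/.*\(v[0-9]\.lscache.*\)/http:\/\/\1/g') | grep -oP '^[^|,]*'); }
        yt 8uyxVmdaJ-w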
  • Checks the Gmail ATOM feed for your account, parses it and outputs a list of unread messages. For some reason sed gets stuck on OS X, so here's a Perl version for the Mac:

        curl -u username:password --silent "https://mail.google.com/mail/feed/atom" | tr -d '\n' | awk -F '<entry>' '{for (i=2; i<=NF; i++) {print $i}}' | perl -pe 's/^<title>(.*)<\/title>.*<name>(.*)<\/name>.*$/$2 - $1/'

    If you want to see the name of the last person who added a message to the conversation, change the greediness of the operators like this:

        curl -u username:password --silent "https://mail.google.com/mail/feed/atom" | tr -d '\n' | awk -F '<entry>' '{for (i=2; i<=NF; i++) {print $i}}' | perl -pe 's/^<title>(.*)<\/title>.*?<name>(.*?)<\/name>.*$/$2 - $1/'


    46
    curl -u username:password --silent "https://mail.google.com/mail/feed/atom" | tr -d '\n' | awk -F '<entry>' '{for (i=2; i<=NF; i++) {print $i}}' | sed -n "s/<title>\(.*\)<\/title.*name>\(.*\)<\/name>.*/\2 - \1/p"
    postrational · 2009-09-07 21:56:40 6
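    The awk splits the feed into one <entry> per line and the final sed reorders each entry's title and sender. You can watch the sed work on a single synthetic entry (subject and name are made up):

        echo '<title>Lunch?</title><author><name>Alice</name></author>' | sed -n "s/<title>\(.*\)<\/title.*name>\(.*\)<\/name>.*/\2 - \1/p"
        # Alice - Lunch?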
  • Prints one specific line from a file (here, line 5). Very handy when you know exactly which line you want.


    41
    sed -n 5p <file>
    Waldirio · 2009-10-15 11:00:48 3
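    On a big file it is worth telling sed to quit once the line has printed, rather than reading on to the end; either of these is equivalent but stops at line 5:

        sed -n '5{p;q}' <file>
        sed '5q;d' <file>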
  • Nothing special required, just wget, sed & tr!


    36
    wget http://www.youtube.com/watch?v=dQw4w9WgXcQ -qO- | sed -n "/fmt_url_map/{s/[\'\"\|]/\n/g;p}" | sed -n '/^fmt_url_map/,/videoplayback/p' | sed -e :a -e '$q;N;5,$D;ba' | tr -d '\n' | sed -e 's/\(.*\),\(.\)\{1,3\}/\1/' | wget -i - -O surprise.flv
    Eno · 2011-01-25 04:19:06 10

  • 30
    sed -i 8d ~/.ssh/known_hosts
    prayer · 2010-07-10 14:22:34 1
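    This deletes line 8 of ~/.ssh/known_hosts in place, the usual fix when ssh aborts with a warning like "Offending key in ~/.ssh/known_hosts:8" after a host's key has changed. The same cleanup can be done by name rather than line number (example.com stands in for the host in question):

        ssh-keygen -R example.com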
  • (relies on 'imagemagick') This command will convert all .pdf files in a directory into 800px (wide or high, whichever is smaller) .jpg images, with the aspect ratio kept. If the file is named 'example1.pdf' it will be named 'example1.jpg' when it is complete. This is a VERY worthwhile command! People pay hundreds of dollars for this in the Windows world. My .jpg files average between 150kB and 300kB, but yours may differ.


    29
    for file in `ls *.pdf`; do convert -verbose -colorspace RGB -resize 800 -interlace none -density 300 -quality 80 $file `echo $file | sed 's/\.pdf$/\.jpg/'`; done
    brettalton · 2009-02-15 23:27:43 4
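    Parsing the output of ls breaks on filenames containing spaces; a slightly more robust sketch of the same loop (identical convert flags) lets the shell glob and rename:

        for file in *.pdf; do convert -verbose -colorspace RGB -resize 800 -interlace none -density 300 -quality 80 "$file" "${file%.pdf}.jpg"; done

    Note that convert writes one numbered image per page for multi-page PDFs.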
  • Recursively traverse the directory structure from . down, look for the string "oldstring" in all files, and replace it with "newstring" wherever found. Also possible with Perl:

        grep -rl oldstring . | xargs perl -pi~ -e 's/oldstring/newstring/'


    25
    grep -rl oldstring . | xargs sed -i -e 's/oldstring/newstring/'
    netfortius · 2009-03-03 20:10:19 9
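    Filenames containing spaces or newlines will break that pipe; GNU grep and xargs can pass NUL-delimited names instead:

        grep -rlZ oldstring . | xargs -0 sed -i 's/oldstring/newstring/g'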

  • 24
    sed -r "s/\x1B\[([0-9]{1,2}(;[0-9]{1,2})?)?[m|K]//g"
    Cowboy · 2009-09-23 12:06:33 6
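    Strips ANSI colour and escape sequences from a stream, e.g. to store colourised output as plain text. Note that [m|K] is a bracket expression, so it matches m, K or a literal |; typical use:

        ls --color=always | sed -r "s/\x1B\[([0-9]{1,2}(;[0-9]{1,2})?)?[m|K]//g"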
  • Sed stops parsing at the match and so is much more efficient than piping head into tail or similar. Grab a line range using:

        sed '999995,1000005!d' < my_massive_file


    21
    sed '1000000!d;q' < massive-log-file.log
    root · 2009-01-26 11:50:00 2
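    The range variant above still reads to end-of-file after printing; appending a quit command at the last line of the range restores the early exit:

        sed -n '999995,1000005p;1000005q' < my_massive_file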
  • If you have used bash for any scripting, you've used the date command a lot. It's perfect for creating filenames dynamically within aliases, functions, and commands like those below. (This is actually an update to my first alias, since a few commenters had good observations on what was wrong with it.)

        # creating a date-based ssh-key for askapache.github.com
        ssh-keygen -f ~/.ssh/`date +git-$USER@$HOSTNAME-%m-%d-%g` -C 'webmaster@askapache.com'
        # /home/gpl/.ssh/git-gplnet@askapache.github.com-04-22-10

        # create a tar+gzip backup of the current directory
        tar -czf $(date +$HOME/.backups/%m-%d-%g-%R-`sed -u 's/\//#/g' <<< $PWD`.tgz) .
        # tar -czf /home/gpl/.backups/04-22-10-01:13-#home#gpl#.rr#src.tgz .

    I personally find myself having to reference date --help quite a bit as a result, so this nice alias saves me a lot of time. This is one bdash mofo. It works in sh and bash (POSIX), but will likely need to be changed for other shells due to the parameter substitution going on. Just extend the sed command; I prefer sed to pretty much everything anyway, but it's always preferable to put in the extra effort to use as many builtins as you can. Otherwise it's not a top one-liner, it's a lazyboy recliner. Here's the old version:

        alias dateh='date --help|sed "/^ *%%/,/^ *%Z/!d;s/ \+/ /g"|while read l;do date "+ %${l/% */}_${l/% */}_${l#* }";done|column -s_ -t'

    This trick is from my bash_profile: http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html


    21
    alias dateh='date --help|sed -n "/^ *%%/,/^ *%Z/p"|while read l;do F=${l/% */}; date +%$F:"|'"'"'${F//%n/ }'"'"'|${l#* }";done|sed "s/\ *|\ */|/g" |column -s "|" -t'
    AskApache · 2010-04-21 01:22:18 5
  • Use sed to color human-readable dmesg output: everything up to and including the timestamp (which ends with the current year and a closing bracket) is captured and wrapped in blue.


    21
    dmesg -T|sed -e 's|\(^.*'`date +%Y`']\)\(.*\)|\x1b[0;34m\1\x1b[0m - \2|g'
    jlaunay · 2012-07-31 22:21:07 13
  • Delete a range of lines in place.


    18
    sed -i <file> -re '<start>,<end>d'
    Zulu · 2012-02-02 17:59:18 1
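    For example, to delete lines 5 through 10 of file.txt (file name and range chosen for illustration):

        sed -i file.txt -re '5,10d'

    GNU sed accepts the options after the filename; the more conventional spelling is sed -ri '5,10d' file.txt. Either way -i edits with no undo, so try it without -i first.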
  • Original author unknown (I believe it came from a wifi hacking forum). Used in conjunction with ifconfig and cron, it can be handy (especially for spoofing APs).


    17
    MAC=`(date; cat /proc/interrupts) | md5sum | sed -r 's/^(.{10}).*$/\1/; s/([0-9a-f]{2})/\1:/g; s/:$//;'`
    vaporub · 2009-02-16 07:09:43 2
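    Note that the regex keeps ten hex digits, giving five octets rather than the six a full MAC needs. One plausible way to use the result (interface name and leading octet are assumptions, not from the original) is to supply the missing first octet when assigning it:

        sudo ifconfig wlan0 hw ether "00:$MAC"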

  • 15
    sed 's/\o0/\n/g' /proc/INSERT_PID_HERE/environ
    unixmonkey3402 · 2009-04-23 22:58:57 0
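    This prints the environment of a running process: /proc/PID/environ is a NUL-separated list, and the sed (\o0 is GNU sed's octal escape for the NUL byte) turns each NUL into a newline. tr does the same job with less typing:

        tr '\0' '\n' < /proc/INSERT_PID_HERE/environ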
  • This command finds all files in the current dir and subdirs, and replaces all occurrences of "oldstring" in every file with "newstring".


    15
    find . -type f -exec sed -i 's/oldstring/newstring/g' {} +
    SlimG · 2009-12-09 00:46:13 6
  • Use the following variation for FreeBSD: openssl rand 6 | xxd -p | sed 's/\(..\)/\1:/g; s/:$//'


    15
    openssl rand -hex 6 | sed 's/\(..\)/\1:/g; s/.$//'
    putnamhill · 2010-09-23 02:31:12 0

  • 15
    check(){ curl -sI $1 | sed -n 's/Location: *//p';}
    putnamhill · 2010-09-30 12:29:02 1
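    check prints the Location: response header of a URL, i.e. where a redirect or link shortener points, without following it. Hypothetical usage:

        check http://bit.ly/some-short-link
        # prints the URL the short link redirects to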

  • echo "http%3A%2F%2Fwww.google.com" | sed -e's/%\([0-9A-F][0-9A-F]\)/\\\\\x\1/g' | xargs echo -e http://www.google.com Works under bash on linux. just alter the '-e' option to its corresponding equivalence in your system to execute escape characters correctly.


    13
    sed -e's/%\([0-9A-F][0-9A-F]\)/\\\\\x\1/g' | xargs echo -e
    mohan43u · 2009-05-25 05:37:44 3
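    The same idea as a self-contained bash function, using printf '%b' in place of the sed/xargs/echo chain (a common idiom; this variant also handles lowercase hex and + for space):

        urldecode(){ local u="${1//+/ }"; printf '%b\n' "${u//%/\\x}"; }
        urldecode "http%3A%2F%2Fwww.google.com"
        # http://www.google.com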
  • This one uses dictionary.com


    13
    pronounce(){ wget -qO- $(wget -qO- "http://dictionary.reference.com/browse/$@" | grep 'soundUrl' | head -n 1 | sed 's|.*soundUrl=\([^&]*\)&.*|\1|' | sed 's/%3A/:/g;s/%2F/\//g') | mpg123 -; }
    matthewbauer · 2010-03-13 04:23:56 4
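    Usage, assuming mpg123 is installed (the scrape depends on dictionary.com's 2010-era page layout):

        pronounce ubiquitous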
  • Though without infinite time and knowledge of how the site will be designed in the future this may stop working, it still serves as a simple, straightforward starting point. It uses the observation that the only item marked as strong on the page is the single logical line that includes the italicized fact. If future revisions of the page show failure, or intermittent failure, one may simply alter the above to read:

        wget randomfunfacts.com -O - 2>/dev/null | tee lastfact | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;"

    The file lastfact can then be examined whenever the command fails.


    13
    wget randomfunfacts.com -O - 2>/dev/null | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;"
    tali713 · 2010-03-30 23:49:30 1
  • Check if a site is down with downforeveryoneorjustme.com


    13
    down4me() { wget -qO - "http://www.downforeveryoneorjustme.com/$1" | sed '/just you/!d;s/<[^>]*>//g' ; }
    vando · 2011-03-11 14:38:38 0
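    The sed keeps only the line carrying the verdict ("just you" appears in both possible answers) and strips the HTML tags. Usage, passing a bare domain:

        down4me example.com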
  • I'm using gawk; you may get varying mileage with other varieties. You might want to change the / after du to, say, /home/ or /var, otherwise this command might take quite some time to complete. Sorry it's so obfuscated; I had to turn a script into a one-liner under 255 characters for commandlinefu. Note: the bar ratio is relative, so the highest ratio of the total disk "anchors" the rest of the graph. EDIT: the math was slightly wrong, fixed it. Also made it compliant with older versions of df.


    13
    t=$(df|awk 'NR!=1{sum+=$2}END{print sum}');sudo du / --max-depth=1|sed '$d'|sort -rn -k1 | awk -v t=$t 'OFMT="%d" {M=64; for (a=0;a<$1;a++){if (a>c){c=a}}br=a/c;b=M*br;for(x=0;x<b;x++){printf "\033[1;31m" "|" "\033[0m"}print " "$2" "(a/t*100)"% total"}'
    kevinquinnyo · 2011-12-01 01:21:11 9