Commands by lrvick (4)

  • Bash process substitution which curls the website 'hashbang.sh' and executes the shell script embedded in the page. This is obviously not the most secure way to run something like this, and we will scold you if you try. The smarter way would be:

    Download locally over SSL:
    > curl https://hashbang.sh >> hashbang.sh

    Verify integrity with GPG (if available):
    > gpg --recv-keys 0xD2C4C74D8FAA96F5
    > gpg --verify hashbang.sh

    Inspect the source code:
    > less hashbang.sh

    Run it:
    > chmod +x hashbang.sh
    > ./hashbang.sh

    (These steps are pulled together into a single sketch after this list.)


    5
    sh <(curl hashbang.sh)
    lrvick · 2015-03-15 21:02:01 2
  • Requires aria2c, but you could just as easily use wget or anything else. A great way to build up a nice font collection for GIMP without having to waste a lot of time. :-) An expanded, commented version of the loop appears after this list.


    10
    d="www.dafont.com/alpha.php?";for c in {a..z}; do l=`curl -s "${d}lettre=${c}"|sed -n 's/.*ge=\([0-9]\{2\}\).*/\1/p'`;for((p=1;p<=l;p++));do for u in `curl -s "${d}page=${p}&lettre=${c}"|egrep -o "http\S*.com/dl/\?f=\w*"`;do aria2c "${u}";done;done;done
    lrvick · 2010-05-18 07:38:54 0
  • Ever compress a file for the web by replacing all the newline characters with nothing so it makes one nice big blob? It is a great idea, but what about when you want to edit that file? ...Serious pain in the butt. I ran into this today: my only copy of a CSS file was "compressed" with no newlines. I whipped this up and it converted the file back into nice, human-readable CSS :-) It could be nicer, but it does the job. A quick before/after example appears after this list.


    1
    cat somefile.css | awk '{gsub(/{|}|;/,"&\n"); print}' >> uncompressed.css
    lrvick · 2009-06-02 15:51:51 5
  • This is the result of a several-week venture without X. I found myself totally happy without X (and by extension without Flash) and was able to do just about anything but watch YouTube videos... so this is the solution I came up with for that. I am sure this can be done better, but it does indeed work... and tends to work far better than YouTube's ghetto proprietary Flash player ;-)

    Replace $i with any YouTube ID you want and this will scrape the site for the _real_ URL to the full-quality .FLV file on YouTube's servers, then hand that over to mplayer (or vlc, or whatever you want) to be streamed. In some browsers you can replace $i with just a %, or put this in a shell script so all YouTube IDs can be handed directly off to your media player of choice for true streaming without the need for Flash or a downloader like clive. (I do, however, fully recommend clive if you wish to archive videos instead of streaming them.) If any interest is shown I would be more than happy to provide similar commands for other sites; most streaming Flash players use similar logic to YouTube.

    Edit: 05/03/2011 - Updated the line to work with current YouTube. It could be a lot prettier, but I will probably follow up with another update when I figure out how to get rid of that pesky grep. Sed should take that syntax... but it doesn't. Original (no longer working) command:

    mplayer -fs $(echo "http://youtube.com/get_video.php?$(curl -s $youtube_url | sed -n "/watch_fullscreen/s;.*\(video_id.\+\)&title.*;\1;p")")

    A commented breakdown of the updated pipeline appears after this list.


    57
    i="8uyxVmdaJ-w";mplayer -fs $(curl -s "http://www.youtube.com/get_video_info?&video_id=$i" | echo -e $(sed 's/%/\\x/g;s/.*\(v[0-9]\.lscache.*\)/http:\/\/\1/g') | grep -oP '^[^|,]*')
    lrvick · 2009-03-09 03:57:44 15
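
The safer hashbang.sh workflow from the first item, pulled together into one small script. This is only a sketch of those exact steps (the key ID and URL come from the item itself); whether the downloaded script carries a signature that `gpg --verify` can check directly is an assumption, so adjust the verification step to whatever the project actually publishes.

    #!/bin/sh
    # Sketch: download, verify, inspect, then run hashbang.sh -- instead of
    # piping curl straight into a shell.
    set -e

    # Download locally over SSL
    curl -fsSL https://hashbang.sh > hashbang.sh

    # Verify integrity with GPG, if available (assumes the script is signed
    # in a way that `gpg --verify` on the file itself can check)
    if command -v gpg >/dev/null 2>&1; then
        gpg --recv-keys 0xD2C4C74D8FAA96F5
        gpg --verify hashbang.sh
    fi

    # Inspect the source code before trusting it
    less hashbang.sh

    # Run it only once you are happy with what you read
    chmod +x hashbang.sh
    ./hashbang.sh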
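
The dafont one-liner above, expanded into a readable, commented script. The URLs and regular expressions are copied verbatim from the one-liner, so this stands or falls with the site's markup at the time the command was written; it is an illustration of the scraping logic, not a maintained tool.

    #!/bin/bash
    base="www.dafont.com/alpha.php?"

    for letter in {a..z}; do
        # Scrape the index page for this letter to find how many pages it has
        pages=$(curl -s "${base}lettre=${letter}" | sed -n 's/.*ge=\([0-9]\{2\}\).*/\1/p')

        for (( page=1; page<=pages; page++ )); do
            # Pull every font download link off the page...
            for url in $(curl -s "${base}page=${page}&lettre=${letter}" | egrep -o "http\S*.com/dl/\?f=\w*"); do
                # ...and hand it to aria2c (wget would work just as well)
                aria2c "${url}"
            done
        done
    done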
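
A quick demonstration of the CSS "uncompress" command from the third item, using a made-up one-line stylesheet. The awk call inserts a newline after every {, } and ;, which should produce roughly:

    $ echo 'body{margin:0;padding:0}h1{font-size:2em}' | awk '{gsub(/{|}|;/,"&\n"); print}'
    body{
    margin:0;
    padding:0}
    h1{
    font-size:2em}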
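
A spelled-out version of the updated YouTube pipeline from the last item, with each stage on its own line. It reshuffles the original command only to make the steps legible; the get_video_info endpoint and the lscache URL pattern are long gone, so treat it as documentation of the approach rather than something to run today.

    #!/bin/bash
    i="8uyxVmdaJ-w"   # any YouTube video ID

    # 1. Fetch the video metadata: a URL-encoded blob of key=value pairs
    info=$(curl -s "http://www.youtube.com/get_video_info?&video_id=$i")

    # 2. Turn %XX escapes into \xXX, keep everything from the cache-server
    #    hostname onwards, and let echo -e decode the escapes
    decoded=$(echo -e "$(echo "$info" | sed 's/%/\\x/g;s/.*\(v[0-9]\.lscache.*\)/http:\/\/\1/g')")

    # 3. The real stream URL is everything up to the first '|' or ','
    url=$(echo "$decoded" | grep -oP '^[^|,]*')

    # 4. Hand it to mplayer (vlc or anything else that takes a URL works too)
    mplayer -fs "$url"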
