Commands by lrvick (4)

  • Bash process substitution that curls the website 'hashbang.sh' and executes the shell script embedded in the page. This is obviously not the most secure way to run something like this, and we will scold you if you try. The smarter way (rolled into a single script in the sketch after this entry) would be:

    Download locally over SSL:
    > curl https://hashbang.sh >> hashbang.sh

    Verify integrity with GPG (if available):
    > gpg --recv-keys 0xD2C4C74D8FAA96F5
    > gpg --verify hashbang.sh

    Inspect the source code:
    > less hashbang.sh

    Run:
    > chmod +x hashbang.sh
    > ./hashbang.sh


    5
    sh <(curl hashbang.sh)
    lrvick · 2015-03-15 21:02:01 2
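
    The safer workflow above can be rolled into one short script. This is a minimal sketch, assuming hashbang.sh still serves an inline-signed script; set -e aborts at the first failed step, including a failed signature check:

    #!/bin/sh
    # Sketch of the download-verify-inspect-run workflow from the description.
    set -e                                         # stop at the first failure, e.g. a bad signature
    curl -fsSL https://hashbang.sh > hashbang.sh   # download over SSL instead of piping into sh
    gpg --recv-keys 0xD2C4C74D8FAA96F5             # signing key quoted in the description
    gpg --verify hashbang.sh                       # verify before anything runs
    less hashbang.sh                               # inspect the source yourself
    chmod +x hashbang.sh
    ./hashbang.sh
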
  • Requires aria2c, but you could just as easily use wget or anything else. A great way to build up a nice font collection for Gimp without having to waste a lot of time. :-) A readable, commented version of the loop is sketched after the entry.


    10
    d="www.dafont.com/alpha.php?";for c in {a..z}; do l=`curl -s "${d}lettre=${c}"|sed -n 's/.*ge=\([0-9]\{2\}\).*/\1/p'`;for((p=1;p<=l;p++));do for u in `curl -s "${d}page=${p}&lettre=${c}"|egrep -o "http\S*.com/dl/\?f=\w*"`;do aria2c "${u}";done;done;done
    lrvick · 2010-05-18 07:38:54 0
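
    For readability, here is the same loop unrolled into a commented script; dafont.com's markup has almost certainly changed since 2010, so treat the sed and egrep patterns as historical:

    #!/bin/bash
    d="www.dafont.com/alpha.php?"
    for c in {a..z}; do
        # scrape the highest page number for this letter
        l=$(curl -s "${d}lettre=${c}" | sed -n 's/.*ge=\([0-9]\{2\}\).*/\1/p')
        for ((p = 1; p <= l; p++)); do
            # collect every download link on the page and fetch it
            for u in $(curl -s "${d}page=${p}&lettre=${c}" | egrep -o "http\S*.com/dl/\?f=\w*"); do
                aria2c "${u}"    # or: wget "${u}"
            done
        done
    done
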
  • Ever compress a file for the web by replacing all newline characters with nothing, so it makes one nice big blob? It is a great idea; however, what about when you want to edit that file? A serious pain in the butt. I ran into this today: my only copy of a CSS file was "compressed" with no newlines. I whipped this up and it converted the blob back into nice, human-readable CSS :-) It could be nicer, but it does the job.


    1
    awk '{gsub(/[{};]/,"&\n"); print}' somefile.css >> uncompressed.css
    lrvick · 2009-06-02 15:51:51 5
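
    The same newline insertion works with sed alone, if you prefer. A sketch, not the author's command, assuming GNU sed (BSD sed needs a literal escaped newline in place of \n in the replacement):

    sed 's/[{};]/&\n/g' somefile.css > uncompressed.css
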
  • This is the result of a several-week venture without X. I found myself totally happy without X (and by extension without flash) and was able to do just about anything but watch YouTube videos... so this is the solution I came up with for that. I am sure this can be done better, but this does indeed work... and tends to work far better than YouTube's ghetto proprietary flash player ;-)

    Replace $i with any YouTube ID you want and this will scrape the site for the _real_ URL to the full quality .FLV file on YouTube's server and will then hand that over to mplayer (or vlc or whatever you want) to be streamed.

    In some browsers you can replace $i with just a %, or put this in a shell script (see the sketch after this entry) so all YouTube IDs can be handed directly off to your media player of choice for true streaming without the need for Flash or a downloader like clive. (I do, however, fully recommend clive if you wish to archive videos instead of streaming them.)

    If any interest is shown I would be more than happy to provide similar commands for other sites. Most streaming flash players use similar logic to YouTube.

    Edit: 05/03/2011 - Updated line to work with current YouTube. It could be a lot prettier, but I will probably follow up with another update when I figure out how to get rid of that pesky grep. Sed should take that syntax... but it doesn't.

    Original (no longer working) command:
    mplayer -fs $(echo "http://youtube.com/get_video.php?$(curl -s $youtube_url | sed -n "/watch_fullscreen/s;.*\(video_id.\+\)&title.*;\1;p")")


    57
    i="8uyxVmdaJ-w";mplayer -fs $(curl -s "http://www.youtube.com/get_video_info?&video_id=$i" | echo -e $(sed 's/%/\\x/g;s/.*\(v[0-9]\.lscache.*\)/http:\/\/\1/g') | grep -oP '^[^|,]*')
    lrvick · 2009-03-09 03:57:44 15
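
    The "put this in a shell script" idea from the description might look like the sketch below, reusing the 2011-era command verbatim; YouTube has long since changed its internals, so this is of historical interest only. The script name is made up:

    #!/bin/bash
    # usage: ./yt-stream.sh 8uyxVmdaJ-w
    i="$1"    # YouTube video ID passed as the first argument
    mplayer -fs $(curl -s "http://www.youtube.com/get_video_info?&video_id=$i" \
        | echo -e $(sed 's/%/\\x/g;s/.*\(v[0-9]\.lscache.*\)/http:\/\/\1/g') \
        | grep -oP '^[^|,]*')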

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

Make sure a script is run in a terminal.
Exit with an error if the script is not run in a terminal.
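
The linked command isn't reproduced here, but a common way to do this is test's -t check against a terminal file descriptor. A minimal sketch:

    # exit unless stdin is attached to a terminal
    [ -t 0 ] || { echo "error: run this script from a terminal" >&2; exit 1; }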

Create a single PDF from multiple images with ImageMagick
Given some images (jpg or other supported formats) as input, you obtain a single PDF file with one image per page.
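
The invocation is presumably along these lines; a sketch, with the output name purely illustrative:

    convert *.jpg book.pdf    # one page per image, in glob order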

Eliminate dead symlinks interactively in /usr/ recursively
It's not mine... I got it from the texlive migration guide in Gentoo: http://www.gentoo.org/proj/en/tex/texlive-migration-guide.xml
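
One way to express this with GNU find, as a sketch (the guide's exact command may differ; -xtype l matches symlinks whose target is missing):

    find /usr -xtype l -exec rm -i {} \;    # rm -i prompts before each removal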

Find USB device in real time
Using this command you can track the moment a USB device is attached.
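
On a modern Linux box one way is udevadm, though the original entry may have used something else. A sketch:

    udevadm monitor --udev --subsystem-match=usb    # prints an event the moment a USB device appears
    # alternatively, follow the kernel log:
    dmesg -w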

Export iPad, iPhone app list to a txt file

List manually installed packages (excluding Essentials)

Download entire commandlinefu archive to single file
'jot' does not come with most *nix distros, so we need to use seq to make it work. This version tested fine on Fedora 11.
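
A sketch of the idea with seq, assuming the site pages its plaintext listings 25 entries at a time under sort-by-votes (the URL scheme and upper bound are assumptions):

    for i in $(seq 0 25 2500); do
        curl -s "http://www.commandlinefu.com/commands/browse/sort-by-votes/plaintext/$i"
    done > commandlinefu-archive.txt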

Does a traceroute. Look up and display the network or AS names and AS numbers.
From the man page: lft - display the route packets take to a network host/socket using one of several layer-4 protocols and methods; optionally show heuristic network information in transitu.

-A  Enable lookup and display of AS (autonomous system) numbers (e.g., [1]). This option queries one of several whois servers (see options 'C' and 'r') in order to ascertain the origin ASN of the IP address in question. By default, LFT uses the pWhoIs service, whose ASN data tends to be more accurate and more timely than the RADB, as it is derived from the Internet's global routing table.

-N  Enable lookup and display of network or AS names (e.g., [GNTY-NETBLK-4]). This option queries Prefix WhoIs, RIPE NCC, or the RADB (as requested). In the case of Prefix WhoIs or RADB, the network name is displayed. In the case of RIPE NCC, the AS name is displayed.
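
Putting the two options from the man page together, usage is presumably:

    lft -A -N example.com    # trace to example.com, annotating hops with ASNs and network/AS names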

Delay execution until load average falls under 1.5
If shell escaping of the command is problematic, you can write the command to a file first: $ batch
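
A sketch of the file-based variant; job.txt and its contents are illustrative:

    echo "make -j4" > job.txt    # whatever command you want to defer
    batch < job.txt              # atd runs it once the load average drops below the limit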

Find all files currently open in Vim and/or gVim
Catches .swp, .swo, .swn, etc. If you have access to lsof, it'll give you more compressed output and show you the associated terminals (e.g., pts/5, which you could then feed to 'w' to figure out where it's originating from): lsof | grep '\.sw.$' If you have swap files turned off, you can do something like: ps x | grep '[gv]im', but it won't tell you about files open in buffers via :e [file].
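
The entry's own command isn't reproduced here, but a find-based sketch for the swap-file approach might be:

    find ~ -name '.*.sw?' 2>/dev/null    # catches .swp, .swo, .swn, etc. under your home directory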


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
