It uses curl's --data-urlencode to encode long URLs *properly* and parses the XML response with xmlstarlet. If ~/.bitlyrc contains login:apikey, then a script can read the login and apiKey from it like so:
login=$(sed 's/:.*//' < $HOME/.bitlyrc)
apikey=$(sed 's/[^:]*://' < $HOME/.bitlyrc)
curl -s --data-urlencode 'longUrl='$1 --data-urlencode 'login='$login --data-urlencode 'apiKey='$apikey 'http://api.bit.ly/shorten?version=2.0.1&format=xml' | xmlstarlet sel -T -t -m "//shortUrl" -v "." | line
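Assuming the snippet above is saved as an executable script (say, a hypothetical bitly.sh that takes the long URL as $1), usage would look like:
./bitly.sh 'http://example.com/some/very/long/path?with=parameters'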
If you are behind a restrictive proxy/firewall that blocks port 22 connections but allows SSL on 443 (like most do), then you can still push changes to your GitHub repository. Your .ssh/config file should contain:
Host *
  ForwardX11 no
  TCPKeepAlive yes
  ProtocolKeepAlives 30
  ProxyCommand /usr/local/bin/proxytunnel -v -p -d %h:443
Host
  User git
  Hostname ssh.github.com
  ChallengeResponseAuthentication yes
  IdentityFile ~/.ssh/id_rsa
  IdentitiesOnly yes
Basically proxytunnel "tunnels" your SSH connection through port 443. You could also use corkscrew or some other tunneling program that is available in your distro's repository. PS: I generally use "github.com" as the SSH host so that URLs of the kind git@github.com:USER/REPO.git work transparently :)
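Once the config is in place you can sanity-check the tunnel before pushing; assuming you used "github.com" as the Host alias (as in the PS above), GitHub replies with a short greeting instead of a shell:
ssh -T git@github.com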
I know this has been beaten to death, but finding video files using MIME types and printing the "hours of video" for each directory is (IMHO) easier to parse than just a single total. Output is in minutes. Among the other niceties is that it omits printing non-video files/folders. PS: Barely managed to fit it within the 255 character limit :D
Uses mime-type of files rather than relying on file extensions to find files of a certain type.
This can obviously be extended to finding files of any other type as well, like plain text files, audio, etc.
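For example, a minimal sketch of the same idea for audio files (it matches on the audio/ MIME prefix and, like the video version below, assumes filenames without colons):
find ./ -type f -print0 | xargs -0 file -i | grep 'audio/' | cut -d: -f1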
In reference to displaying the total hours of video (which was posted earlier on commandlinefu but relied on the user having to supply all possible video file extensions), we can now do better:
find ./ -type f -print0 | xargs -0 file -iNf - | grep video | cut -d: -f1 | xargs -d'\n' /usr/share/doc/mplayer/examples/midentify | grep ID_LENGTH | awk -F "=" '{sum += $2} END {print sum/60/60; print "hours"}'
Does that count as a win for bzip2?
Are there any creative pieces of music that can be created using beep and the shell? I'd love to hear it!
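Here's one possibility, a tiny sketch (assuming the classic beep utility and permission to drive the PC speaker) that plays an ascending C-major run; -f sets the frequency in Hz, -l the note length in ms, and -n starts the next note:
beep -f 262 -l 200 -n -f 294 -l 200 -n -f 330 -l 200 -n -f 349 -l 200 -n -f 392 -l 200 -n -f 440 -l 200 -n -f 494 -l 200 -n -f 523 -l 400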
http://en.wikipedia.org/wiki/Year_2038_problem
Some other notable dates that have passed:
date -d@1234567890
date -d@1000000000
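And the one the Wikipedia article is about: the largest timestamp a signed 32-bit time_t can hold, which falls on 19 Jan 2038 at 03:14:07 UTC:
date -d@2147483647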
Extremely useful to maintain backups if you're using Dropbox. This mirrors the entire directory structure and places, in each directory, symlinks to the original files. Instead of copying the data over again into the ~/Dropbox folder, creating a symbolic link tree is much more sensible in terms of space usage.
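A minimal sketch of that idea (the paths ~/data and ~/Dropbox/data are placeholders, GNU find is assumed, and it is run from inside the source tree):
cd ~/data
find . -type d -exec mkdir -p ~/Dropbox/data/{} \;
find . -type f -exec ln -s "$PWD"/{} ~/Dropbox/data/{} \;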
This has to be supplemented by another script that removes dead symlinks in the Dropbox folder which point to files that have been moved/removed.
find -L ./ -type l -delete
And then removing empty directories
find ./ -type d -exec rmdir {} \; 2>/dev/null
Actually, after some searching I found lndir, which creates symbolic link trees, but it wasn't in the Arch repos, so.. ;)
Might be more useful if you were able to print it in Days HH:MM:SS format as:
perl -e '@p=gmtime(234234);printf("%d Days %02d:%02d:%02ds\n",@p[7,2,1,0]);'
But I'm not exactly sure how to replace the 234234 with the output of the countdown time. (Having some problems with nested quoting/command substitution). Help would be appreciated :)
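One way to dodge the nested quoting is to pass the seconds in as an argument instead of embedding them in the one-liner; a sketch, assuming the remaining countdown seconds are already in a shell variable (here a hypothetical $secs):
perl -e '@p=gmtime(shift);printf("%d Days %02d:%02d:%02ds\n",@p[7,2,1,0]);' "$secs"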
Python comments begin with a #. Modify to suit other languages. Other uses: instead of m0, use m$ to move the comments to the end of the file, or d to delete all comments.
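For reference, a sketch of the vim global commands being described (the exact pattern in the original may differ; this one matches whole-line Python comments):
:g/^\s*#/m0
:g/^\s*#/m$
:g/^\s*#/d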
Extensible to other ugly extensions like *.JPG, *.Jpg, etc. Leave out the last pipe to sh to perform a dry run.
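The original command isn't shown here, but a sketch of that kind of pipeline (lowercasing a .JPG extension; assumes simple filenames without newlines) looks like this, and dropping the final | sh prints the mv commands without running them:
ls *.JPG | sed 's/\(.*\)\.JPG$/mv -- "&" "\1.jpg"/' | sh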
commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for: