Commands by colemar (3)

  • Wait for arbitrary processes to finish, even when they are not children of the current shell (which the builtin wait cannot do).

    Silent:
    anywait () { for pid in "$@"; do while kill -0 "$pid" >/dev/null 2>&1; do sleep 0.5; done; done; }

    Prints a dot every half second while waiting:
    anywaitd () { for pid in "$@"; do while kill -0 "$pid" >/dev/null 2>&1; do sleep 0.5; echo -n '.'; done; done; }

    Prints the process id it is still waiting on:
    anywaitp () { for pid in "$@"; do while kill -0 "$pid" >/dev/null 2>&1; do sleep 0.5; echo -n "$pid "; done; echo; done; }

    Because kill -0 fails for processes you do not own, you cannot anywait for other users' processes.


    wait 536; anywait 536; anywaitd 537; anywaitp 5562 5563 5564
    colemar · 2014-10-22 06:31:47
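
    A minimal usage sketch, assuming the functions above are already defined in the current shell; the pgrep call and the rsync process are purely illustrative:

    pid=$(pgrep -o -x rsync)     # oldest running rsync instance (illustrative only)
    anywaitd "$pid"              # one dot every 0.5 s until that process exits
    echo " rsync ($pid) finished"

    Unlike the builtin wait, this works for any process owned by the same user, not just children of the current shell.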
  • Paste command output to https://clbin.com, a command-line pastebin, straight from a pipe.

    Define an alias for convenience:
    alias clbin='curl -v -F "clbin=<-" https://clbin.com'

    Paste a man page:
    man bash | clbin

    Paste an image:
    curl -F 'clbin=@filename.jpg' https://clbin.com


    <command> | curl -F 'clbin=<-' https://clbin.com
    colemar · 2014-10-21 13:02:18
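
    A small convenience wrapper, sketched under the assumption that clbin.com replies with the URL of the new paste and that xclip is installed; the function name and the clipboard step are additions of mine, not part of the original command:

    clbin () {
        # -s silences curl's progress meter; tee shows the returned URL and xclip copies it
        curl -sF 'clbin=<-' https://clbin.com | tee /dev/tty | xclip -selection clipboard
    }

    dmesg | clbin     # paste the kernel log, leaving the paste URL on the clipboard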
  • Mirror a remote directory, using a few tricks to maximize network speed.

    lftp: the coolest file transfer tool ever
    -u: username and password (the password is merely a placeholder if you have ~/.ssh/id_rsa)
    -e: execute internal lftp commands
    set sftp:connect-program: use a specific command instead of plain ssh

    ssh options:
    -a -x -T: disable agent forwarding, X11 forwarding and pseudo-tty allocation, none of which are needed here
    -c arcfour: use a fast, low-overhead cipher (arcfour has been removed from recent OpenSSH releases, so you may need to substitute something like aes128-ctr)
    -o Compression=no: disable compression to save CPU

    mirror: copy the remote directory subtree to the local directory
    -v: be verbose (a nice progress bar and speed meter, one per file transferred in parallel)
    -c: continue interrupted file transfers if possible
    --loop: repeat the mirror until no differences are found
    --use-pget-n=3: transfer each file over 3 independent parallel TCP connections
    -P 2: transfer 2 files in parallel (6 TCP connections in total)
    sftp://remotehost:22: use the sftp protocol on port 22 (give any other port if appropriate)

    You can play with the values of --use-pget-n and/or -P to reach maximum speed on a particular network. If the files are compressible, removing "-o Compression=no" can be beneficial. Better to create an alias for the command (see the sketch after this entry).


    lftp -u user,pwd -e "set sftp:connect-program 'ssh -a -x -T -c arcfour -o Compression=no'; mirror -v -c --loop --use-pget-n=3 -P 2 /remote/dir/ /local/dir/; quit" sftp://remotehost:22
    colemar · 2014-10-17 00:29:34
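
    A sketch of the suggested alias, written here as a shell function so the user, host and directories can be passed as arguments; the name lftpmirror and the argument order are invented for illustration, and -c arcfour is dropped because recent OpenSSH no longer ships that cipher:

    # usage: lftpmirror <user> <host> <remote-dir> <local-dir>
    # note: directory paths containing spaces are not handled in this sketch
    lftpmirror () {
        lftp -u "$1",placeholder \
             -e "set sftp:connect-program 'ssh -a -x -T -o Compression=no'; mirror -v -c --loop --use-pget-n=3 -P 2 $3 $4; quit" \
             "sftp://$2:22"
    }

    lftpmirror backup fileserver.example.com /srv/data/ ~/mirror/data/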


Check These Out

check open ports without netstat or lsof

Alias TAIL for automatic smart output
Run the alias command, then issue ps aux | tail and resize your terminal window (putty/console/hyperterm/xterm/etc.), then issue the same command again and you'll understand.

${LINES:-`tput lines 2>/dev/null||echo -n 12`} instructs the shell to use the output of `tput lines` (ncurses-based terminal access) to get the number of lines in your terminal if LINES is not set or null and, if that doesn't work either, to fall back to the hard-coded default (12 in the snippet).

The default for tail is to output the last 10 lines; this alias changes the default to output the last x lines instead, where x is the number of lines currently displayed on your terminal minus 7. The -7 is there so that the top line displayed is the command you ran that used tail, i.e. the prompt. Depending on whether your PS1 and/or PROMPT_COMMAND output more than 1 line (mine is 3), you will want to increase from -2. With my prompt being the following, I need -7, or -5 if I only want to display the command line at the top (http://www.askapache.com/linux/bash-power-prompt.html):

275MB/748MB [7995:7993 - 0:186] 06:26:49 Thu Apr 08 [askapache@n1-backbone5:/dev/pts/0 +1] ~ $

In most shells the LINES variable is created automatically at login and updated when the terminal is resized (SIGWINCH: 28 on Linux, 23/20 on others) to contain the number of vertical lines that fit in your terminal window. Because the alias doesn't hard-code the current LINES but relies on the $LINES variable, it is a dynamic alias that will always work on a tty device.
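
The alias itself is not reproduced in this listing; a sketch of how such an alias could look, based purely on the description above (the alias name, the -7 offset and the fallback of 12 are taken from it, the rest is a guess):

    alias tail='tail -n $((${LINES:-`tput lines 2>/dev/null||echo -n 12`}-7))'

Because it is defined in single quotes, the arithmetic is evaluated each time the alias is used, so it tracks the current terminal height.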

Create QR codes from a URL.
QR codes are those funny square 2D bar codes that everyone seems to be pointing their smart phones at. Try the following... $ qrurl http://xkcd.com Then open qr.*.png in your favorite image viewer. Point the bar code reader on your smart phone at the code, and you'll shortly be reading xkcd on your phone. URLs are not the only thing that can be encoded by QR codes... short texts (up to around 2K) can be encoded this way, although this function doesn't do any URL encoding, so unless you want to do that by hand it won't be useful for that.
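
The qrurl function itself does not appear in this listing; a minimal sketch of one way to build it with the qrencode utility (the function name matches the usage above and the filename pattern matches the qr.*.png mentioned; everything else is my own choice):

    qrurl () { qrencode -s 8 -o "qr.$(date +%s).png" "$1"; }

    qrurl http://xkcd.com    # writes something like qr.1700000000.png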

Show log message including which files changed for a given commit in git.

list block devices
Shows all block devices in a tree with descriptions of what they are.
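
The command itself is not included in this listing; lsblk is the usual tool for this kind of tree view, for example (the column selection is my own):

    lsblk -o NAME,SIZE,TYPE,FSTYPE,MOUNTPOINT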

camelcase to underscore
For Mac users!

Expand shortened URLs
curl(1) is more portable than wget(1) across Unices, so here is an alternative that does the same thing. This shell function uses curl(1) to show what site a shortened URL is pointing to, even if there are many nested shortened URLs. This is a great way to test whether the shortened URL is sending you to a malicious site, or somewhere nasty that you don't want to visit. The sample output is from: $ expandurl http://t.co/LDWqmtDM
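
The expandurl function itself is not reproduced in this listing; a minimal sketch of the curl-based idea it describes (the exact options are my own choice, not necessarily the original's):

    expandurl () { curl -sIL -o /dev/null -w '%{url_effective}\n' "$1"; }

    expandurl http://t.co/LDWqmtDM    # prints the final URL after following all redirects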

Find the package that installed a command

Edit a google doc with vim
Google just released a new command line tool, googlecl, offering all sorts of services from the command line. One of them is uploading a YouTube video, but there are plenty more Google services to interact with. Download it here: http://code.google.com/p/googlecl/ Manual: http://code.google.com/p/googlecl/wiki/Manual This specific command courtesy of Lifehacker: http://lifehacker.com/5568817/ All of them can be found in the manual linked above.

send DD a signal to print its progress
Every second, this sends dd the USR1 signal, which causes dd to print its progress.
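
The command itself is not included in this listing, but the usual pattern is to signal dd once per second from another terminal; a sketch (pkill -x dd targets every dd you own, which may be broader than intended, and on BSD systems dd answers SIGINFO rather than SIGUSR1):

    watch -n1 'pkill -USR1 -x dd'

or, stopping automatically once dd exits:

    while pgrep -x dd >/dev/null; do pkill -USR1 -x dd; sleep 1; done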

