Commands by totoro (6)

  • If you enable multiuser mode, you can permit others to share your screen session. The following conditions apply:

    1. screen must be suid root;
    2. "multiuser on" must be configured in ~/.screenrc;
    3. control the other user(s)' access with "aclchg":

        # ----- from ~/.screenrc-users -----
        aclchg someuser +rx "#?"                              # enable r/o access to "someuser"
        aclchg someuser -x "#,at,aclchg,acladd,acldel,quit"   # don't allow these
        aclchg otheruser +rwx "#?"                            # enable r/w access to "otheruser"
        aclchg otheruser -x "#,at,aclchg,acladd,acldel,quit"  # don't allow them to use these commands
        # -----

    After doing this (once), you start your session with:

        $ screen

    Then the other user can join your terminal session(s) with youruserid:

        $ screen -r youruserid/

    Note: the trailing "/" is required. Multiple users can share the same screen session simultaneously, each with independent access controlled precisely with "aclchg" in the ~/.screenrc file.

    I use the following setup:

        # ~/.screenrc -- default screenrc on any host
        source $HOME/.screenrc-base
        source $HOME/.screenrc-$HOST
        source $HOME/.screenrc-users
        # -----

    The base configuration goes in ~/.screenrc-base, the host-specific configuration in ~/.screenrc-$HOST, and the user configuration in ~/.screenrc-users. The host-specific .screenrc file might contain some host-specific screen commands, e.g.:

        # ~/.screenrc-myhost
        # -----
        screen -t 'anywhere' /bin/tcsh
        screen -t 'anywhere1' /bin/tcsh
        # -----

    The .screenrc-base contains:

        # ~/.screenrc-base
        ## I find typing ^a (Control-a) awkward, so I set the escape key to Ctrl-j instead of Ctrl-a.
        escape ^Jj
        termcapinfo xterm* ti@:te@:
        autodetach on
        zombie kr
        verbose on
        multiuser on


    38
    % screen -r someuser/
    totoro · 2009-03-25 23:59:38 1
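    If you did not plan ahead, multiuser access can also be granted from inside an already running session at screen's colon prompt; a minimal sketch, with "someuser" as a placeholder user name (screen still needs to be suid root for this to work):

        # press your escape key, then ':' and enter:
        :multiuser on
        :acladd someuser
        # "someuser" can then attach with:
        $ screen -x youruserid/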
  • If you have lots of remote hosts sitting "behind" an ssh proxy host, there is a special-case use of "rsync" that lets you copy directories and files across the ssh proxy host without having to do two explicit copies: the '-e' option allows a replacement "rsh" command to be specified. We use this option to give an "ssh" tunnel command with the '-A' option, which causes authentication agent requests to be forwarded back to the local host. If you have ssh set up correctly, this command can be run without any passwords being entered.


    5
    rsync -avz -e 'ssh -A sshproxy ssh' srcdir remhost:dest/path/
    totoro · 2009-03-25 21:29:07 4
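    With a reasonably recent OpenSSH (7.3 or later), the same hop can usually be expressed with the -J (ProxyJump) option instead of nesting two ssh commands; a sketch, keeping the same placeholder host names:

        rsync -avz -e 'ssh -J sshproxy' srcdir remhost:dest/path/

    With ProxyJump the connection to remhost is tunneled end-to-end, so forwarding the authentication agent to the proxy (-A) is no longer needed.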
  • Many Mac OS X programs, especially those in Microsoft Office, create ASCII files with lines terminated by CRs (carriage returns). Most Unix programs expect lines separated by NLs (newlines). This little command makes it trivial to convert them.


    -2
    function crtonl { perl -i -ape 's/\r\n?/\n/g;' $* ; }
    totoro · 2009-03-25 20:28:32 5
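    A usage sketch (the file names are placeholders; note that perl's -i flag rewrites the files in place, so use -i.bak instead if you want backups):

        crtonl report.csv notes.txt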
  • This little command (function) shows the CSV header fields (the field names separated by commas) as a numbered list, making the fields and their order easy to see.


    0
    function headers { head -1 $* | tr ',' '\12' | pr -t -n ; }
    totoro · 2009-03-25 20:07:47 0
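    For example, if a hypothetical data.csv begins with the header line "id,name,email,created_at", the output would look something like:

        $ headers data.csv
            1   id
            2   name
            3   email
            4   created_at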
  • Coming back to a project directory after some time elsewhere? Need to know what the most recently modified files are? This little function "t" is one of my most frequent commands. I also have a tcsh alias for it: alias t 'ls -ltch \!* | head -20'


    0
    function t { ls -ltch $* | head -20 ; }
    totoro · 2009-03-25 20:05:52 1
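    A usage sketch (any arguments are passed straight through to ls):

        t        # 20 most recently changed entries in the current directory
        t *.c    # the same, restricted to the .c files here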
  • Ever wanted to find the most recently modified files, but couldn't remember exactly where they were in a project directory with many subdirectories? The "find" command, using a combination of "-mtime -N" and "-depth -D", can locate those files. If your directory structure isn't very deep, just omit the "-depth -D"; if it is very deep, you can limit the depth of the traversal with "-depth -D", where "D" is the maximum number of directory levels to descend (see the note below the command).


    -2
    find . -type f -depth -3 -mtime -5
    totoro · 2009-03-25 19:54:06 0
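    Note that "-depth -3" is the BSD/macOS find numeric primary (it matches files whose depth relative to the starting point is less than 3). GNU find spells the limit differently; a roughly equivalent sketch for GNU find would be:

        find . -maxdepth 2 -type f -mtime -5

    (GNU's -maxdepth counts the maximum number of levels to descend, so the value may need adjusting by one relative to the BSD form.)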

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.



Check These Out

dump database from postgresql to a file

Recursively find the top 20 largest files (> 1 MB), sorted in human-readable format
from my bashrc ;)

which process is accessing the CDROM

Get AWS temporary credentials ready to export based on a MFA virtual appliance
You might want to secure your AWS operations by requiring an MFA token. But then, to use the API or other tools, you need to pass credentials generated with that MFA token. This command asks you for the MFA code and retrieves the temporary credentials using the AWS CLI. To print the exports, you can use: `awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'` You must adapt the command line to include:
* $MFA_ID, the ARN of the virtual MFA device or the serial number of the physical one
* the TTL for the credentials
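The retrieving command itself is not shown on this page; a minimal sketch of the kind of AWS CLI call it would pipe into that awk (the token code and duration are placeholders, and $MFA_ID is assumed to hold the MFA device ARN or serial):

    aws sts get-session-token --serial-number "$MFA_ID" --token-code 123456 --duration-seconds 3600 \
        --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' --output text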

Play ISO/DVD files with the DVD menu and mouse menu clicks enabled.

list block devices
Shows all block devices in a tree with descriptions of what they are.

Direct auto-complete in bash

Quickly build ulimit command from current values
It is helpful to know the current limits placed on your account, and using this shortcut is a quick way to figure out which values to change for optimization or security. The alias is:

    $ alias ulimith="command ulimit -a|sed 's/^.*\([a-z]\))\(.*\)$/-\1\2/;s/^/ulimit /'|tr '\n' ' ';echo"

Here's the result of this command:

    $ ulimit -c 0 -d unlimited -e 0 -f unlimited -i 155648 -l 32 -m unlimited -n 8192 -p 8 -q 819200 -r 0 -s 10240 -t unlimited -u unlimited -v unlimited -x unlimited

    $ ulimit -a
    core file size          (blocks, -c) 0
    data seg size           (kbytes, -d) unlimited
    scheduling priority             (-e) 0
    file size               (blocks, -f) unlimited
    pending signals                 (-i) 155648
    max locked memory       (kbytes, -l) 32
    max memory size         (kbytes, -m) unlimited
    open files                      (-n) 8192
    pipe size            (512 bytes, -p) 8
    POSIX message queues     (bytes, -q) 819200
    real-time priority              (-r) 0
    stack size              (kbytes, -s) 10240
    cpu time               (seconds, -t) unlimited
    max user processes              (-u) unlimited
    virtual memory          (kbytes, -v) unlimited
    file locks                      (-x) unlimited

Install a local RPM package from your desktop, then use the YUM repository to resolve its dependencies.
When downloading RPMs from the Internet, you don't have to 'rpm -i' or 'rpm -U' to install the package, especially if the package has dependencies. If you have YUM set up to access an RPM repository, this command will install the downloaded package and then pull in any dependencies it relies on through YUM. Very handy on RPM-based systems.
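The command itself is not shown on this page; a typical invocation of the technique (the package path is a placeholder) would be something like:

    yum localinstall ~/Downloads/some-package.rpm

On newer Fedora/RHEL systems, dnf accepts a local path directly: dnf install ./some-package.rpm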

Calculate days on which Friday the 13th occurs (inspired from the work of the user justsomeguy)
Friday is the 5th day of the week, Monday is the 1st. Output may be affected by locale.
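The original command is not reproduced on this page; a small sketch of the idea using GNU date, where %u prints the ISO weekday (5 = Friday) and the year range is arbitrary:

    for y in {2024..2026}; do
      for m in $(seq -w 1 12); do
        [ "$(date -d "$y-$m-13" +%u)" -eq 5 ] && echo "$y-$m-13"
      done
    done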

